WorldWideScience

Sample records for model validation results

  1. Challenges in validating model results for first year ice

    Science.gov (United States)

    Melsom, Arne; Eastwood, Steinar; Xie, Jiping; Aaboe, Signe; Bertino, Laurent

    2017-04-01

In order to assess the quality of model results for the distribution of first-year ice, a comparison with a product based on observations from satellite-borne instruments has been performed. Such a comparison is not straightforward due to the contrasting algorithms that are used in the model product and the remote sensing product. The implementation of the validation is discussed in light of the differences between this set of products, and validation results are presented. The model product is the daily updated 10-day forecast from the Arctic Monitoring and Forecasting Centre in CMEMS. The forecasts are produced with the assimilative ocean prediction system TOPAZ. Presently, observations of sea ice concentration and sea ice drift are introduced in the assimilation step, but data for sea ice thickness and ice age (or roughness) are not included. The model computes the age of the ice by recording and updating the time passed since ice formation as sea ice grows and deteriorates while it is advected inside the model domain. Ice that is younger than 365 days is classified as first-year ice. The fraction of first-year ice is recorded as a tracer in each grid cell. The Ocean and Sea Ice Thematic Assembly Centre in CMEMS redistributes a daily product from the EUMETSAT OSI SAF of gridded sea ice conditions which includes "ice type", a representation of the separation between regions covered by first-year ice and those covered by multi-year ice. The ice type is parameterized based on data for the gradient ratio GR(19,37) from SSMIS observations and on the ASCAT backscatter parameter. This product also includes information on ambiguity in the processing of the remote sensing data and on the product's confidence level, both of which have a strong seasonal dependency.

  2. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration-time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters such as working angle, photon counts, and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, it revealed problems in the L2 halo orbit and in the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.

  3. Satellite data for systematic validation of wave model results in the Black Sea

    Science.gov (United States)

    Behrens, Arno; Staneva, Joanna

    2017-04-01

With regard to the availability of traditional in situ wave measurements recorded by waverider buoys, the Black Sea is a data-sparse semi-enclosed sea. The only possibility for systematic validation of wave model results in such a regional area is the use of satellite data. In the frame of the Copernicus Marine Environment Monitoring Service for the Black Sea, which requires wave predictions, the third-generation spectral wave model WAM is used. The operational system is demonstrated based on four years of systematic comparisons with satellite data. The aim of this investigation was to answer two questions: is the wave model able to provide a reliable description of the wave conditions in the Black Sea, and are the satellite measurements suitable for validation purposes on such a regional scale? Detailed comparisons between measured data and computed model results for the Black Sea, including yearly statistics, have been made for about 300 satellite overflights per year. The results are discussed in terms of the different verification schemes needed to assess the forecasting skills of the operational system. The good agreement between measured and modeled data supports the expectation that the wave model provides reasonable results and that the satellite data are of good quality and offer an appropriate validation alternative to buoy measurements. This is the required step towards further use of those satellite data for assimilation into the wave fields to improve the wave predictions. Additional support for the good quality of the wave predictions is provided by comparisons between ADCP measurements, available for a short period in February 2012, and the corresponding model results at a location near the Bulgarian coast in the western Black Sea. Sensitivity tests with different wave model options and different driving wind fields have been carried out to identify the model configuration that provides the best wave predictions.
In addition to the comparisons between measured

  4. Furthering our Understanding of Land Surface Interactions using SVAT modelling: Results from SimSphere's Validation

    Science.gov (United States)

    North, Matt; Petropoulos, George; Ireland, Gareth; Rendal, Daisy; Carlson, Toby

    2015-04-01

With currently predicted climate change, there is an increased requirement to gain knowledge of the terrestrial biosphere for numerous agricultural, hydrological and meteorological applications. To this end, Soil Vegetation Atmosphere Transfer (SVAT) models are quickly becoming the preferred scientific tool to monitor, at fine temporal and spatial resolutions, detailed information on numerous parameters associated with Earth system interactions. Validation of any model is critical to assess its accuracy, generality and realism across distinctive ecosystems, and is subsequently an important step before its operational distribution. In this study, the SimSphere SVAT model has been validated at fifteen different sites of the FLUXNET network, where model performance was statistically evaluated by directly comparing the model predictions against in situ data for cloud-free days with a high energy balance closure. Specific focus is given to the model's ability to simulate parameters associated with the energy balance, namely Shortwave Incoming Solar Radiation (Rg), Net Radiation (Rnet), Latent Heat (LE), Sensible Heat (H), Air Temperature at 1.3 m (Tair 1.3m) and Air Temperature at 50 m (Tair 50m). Comparisons were performed for a number of distinctive ecosystem types and for 150 days in total, using in situ data from ground observational networks acquired from the year 2011 alone. The model's coherence to reality was evaluated on the basis of a series of statistical parameters including RMSD, R2, Scatter, Bias, MAE, Nash index, Slope and Intercept. Results showed good to very good agreement between predicted and observed datasets, particularly so for LE, H, Tair 1.3m and Tair 50m, where mean error distribution values indicated excellent model performance. Due to systematic underestimation, poorer simulation accuracies were exhibited for Rg and Rnet, yet all values reported are still analogous to other validation studies of this kind. Overall, the model
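The battery of statistics listed in this abstract (RMSD, bias, MAE, R2, Nash index, slope, intercept) can be computed in a few lines. The sketch below uses the standard textbook definitions; the study's exact formulas are not given in the abstract, so this is an illustration, not the authors' code:

```python
import numpy as np

def validation_stats(obs, pred):
    """Standard model-vs-observation error statistics (illustrative)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = pred - obs
    rmsd = np.sqrt(np.mean(err ** 2))             # root-mean-square difference
    bias = np.mean(err)                           # mean error, sign shows over/underestimation
    mae = np.mean(np.abs(err))                    # mean absolute error
    r2 = np.corrcoef(obs, pred)[0, 1] ** 2        # squared Pearson correlation
    nash = 1 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)  # Nash-Sutcliffe efficiency
    slope, intercept = np.polyfit(obs, pred, 1)   # least-squares line of pred against obs
    return dict(rmsd=rmsd, bias=bias, mae=mae, r2=r2, nash=nash,
                slope=slope, intercept=intercept)
```

A perfect model gives rmsd = 0, nash = 1, slope = 1; a constant offset shows up in bias while leaving r2 at 1, which is why both kinds of statistic are reported together.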

  5. Additional Model Datasets and Results to Accelerate the Verification and Validation of RELAP-7

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-11-01

    The RELAP-7 code verification and validation activities are ongoing under the code assessment plan proposed in the previous document (INL-EXT-16-40015). Among the list of V&V test problems in the ‘RELAP-7 code V&V RTM (Requirements Traceability Matrix)’, the RELAP-7 7-equation model has been tested with additional demonstration problems and the results of these tests are reported in this document. In this report, we describe the testing process, the test cases that were conducted, and the results of the evaluation.

  6. Results of the Independent Verification and Validation Study for the D2-Puff Model

    National Research Council Canada - National Science Library

    Bowers, J

    1999-01-01

    .... The independent verification and validation (IV&V) study of D2-Puff Version 2.0.6 focused on these accreditation requirements and the implicit requirement that the model provide safe-sided hazard estimates...

  7. Challenges of forest landscape modeling - simulating large landscapes and validating results

    Science.gov (United States)

    Hong S. He; Jian Yang; Stephen R. Shifley; Frank R. Thompson

    2011-01-01

Over the last 20 years, we have seen a rapid development in the field of forest landscape models (FLMs), fueled by both technological and theoretical advances. Two fundamental challenges have persisted since the inception of FLMs: (1) balancing realistic simulation of ecological processes at broad spatial and temporal scales with computing capacity, and (2) validating...

  8. ASTER Global Digital Elevation Model Version 2 - summary of validation results

    Science.gov (United States)

    Tachikawa, Tetushi; Kaku, Manabu; Iwasaki, Akira; Gesch, Dean B.; Oimoen, Michael J.; Zhang, Z.; Danielson, Jeffrey J.; Krieger, Tabatha; Curtis, Bill; Haase, Jeff; Abrams, Michael; Carabajal, C.; Meyer, Dave

    2011-01-01

On June 29, 2009, NASA and the Ministry of Economy, Trade and Industry (METI) of Japan released a Global Digital Elevation Model (GDEM) to users worldwide at no charge as a contribution to the Global Earth Observing System of Systems (GEOSS). This “version 1” ASTER GDEM (GDEM1) was compiled from over 1.2 million scene-based DEMs covering land surfaces between 83°N and 83°S latitudes. A joint U.S.-Japan validation team assessed the accuracy of the GDEM1, augmented by a team of 20 cooperators. The GDEM1 was found to have an overall accuracy of around 20 meters at the 95% confidence level. The team also noted several artifacts associated with poor stereo coverage at high latitudes, cloud contamination, water masking issues, and the stacking process used to produce the GDEM1 from individual scene-based DEMs (ASTER GDEM Validation Team, 2009). Two independent horizontal resolution studies estimated the effective spatial resolution of the GDEM1 to be on the order of 120 meters.
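The "20 meters at the 95% confidence level" figure is the kind of number produced by a linear error statistic over elevation differences against reference data. A minimal sketch, assuming roughly normal, zero-mean errors so that LE95 ≈ 1.96 × RMSE (the validation team's actual procedure may differ):

```python
import numpy as np

def linear_error_95(dem_heights, reference_heights):
    """Vertical accuracy at the 95% confidence level (illustrative).

    Assumes approximately normal, zero-mean elevation errors, for which
    LE95 is about 1.96 times the RMSE of DEM-minus-reference differences.
    """
    err = np.asarray(dem_heights, float) - np.asarray(reference_heights, float)
    rmse = np.sqrt(np.mean(err ** 2))
    return 1.96 * rmse
```

In practice reference heights come from surveyed benchmarks or higher-accuracy DEMs, and outliers (clouds, water) are masked before the statistic is computed.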

  9. The Dutch Rhine-Meuse delta in 3D: A validation of model results

    NARCIS (Netherlands)

    Maljers, D.; Stafleu, J.; Busschers, F.; Gunnink, J.L.

    2010-01-01

    The Geological Survey of the Netherlands aims at building a 3D geological property model of the upper 30 meters of the Dutch subsurface. This model, called GeoTOP, provides a basis for answering subsurface related questions on, amongst others, sand and gravel resources. Modelling is carried out per

  10. Validation Techniques of network harmonic models based on switching of a series linear component and measuring resultant harmonic increments

    DEFF Research Database (Denmark)

    Wiechowski, Wojciech Tomasz; Lykkegaard, Jan; Bak, Claus Leth

    2007-01-01

In this paper two methods of validation of transmission network harmonic models are introduced. The methods were developed as a result of the work presented in [1]. The first method allows calculating the transfer harmonic impedance between two nodes of a network: switching a linear, series network component, such as a transmission line, and measuring the resultant harmonic increments are used for calculation of the transfer harmonic impedance between the nodes. The determined transfer harmonic impedance can be used to validate a computer model of the network. The second method is an extension of the first one: it allows switching a series element that contains a shunt branch. Both methods require that harmonic measurements performed at the two ends of the disconnected element are precisely synchronized.
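The core quantity of the first method can be illustrated as a per-harmonic complex ratio of phasor increments caused by the switching event. The variable names and this exact formulation are illustrative, not taken from the paper:

```python
import numpy as np

def transfer_impedance(du_observed, di_switched):
    """Illustrative transfer harmonic impedance Z(h) = dU(h) / dI(h).

    du_observed: complex voltage-phasor increments at the observed node,
                 one entry per harmonic order
    di_switched: complex current-phasor increments caused by switching the
                 series element, one entry per harmonic order
    Both sets of phasors must come from precisely time-synchronized
    measurements, as the abstract stresses.
    """
    du = np.asarray(du_observed, complex)
    di = np.asarray(di_switched, complex)
    return du / di  # one complex impedance per harmonic order
```

Comparing these measurement-derived impedances with the same quantity computed from the network model is what constitutes the validation.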

  11. Validating management simulation models and implications for communicating results to stakeholders

    NARCIS (Netherlands)

    Pastoors, M.A.; Poos, J.J.; Kraak, S.B.M.; Machiels, M.A.M.

    2007-01-01

    Simulations of management plans generally aim to demonstrate the robustness of the plans to assumptions about population dynamics and fleet dynamics. Such modelling is characterized by specification of an operating model (OM) representing the underlying truth and a management procedure that mimics

  12. Simulator validation results and proposed reporting format from flight testing a software model of a complex, high-performance airplane.

    Science.gov (United States)

    2008-01-01

    Computer simulations are often used in aviation studies. These simulation tools may require complex, high-fidelity aircraft models. Since many of the flight models used are third-party developed products, independent validation is desired prior to im...

  13. Out-of-plane buckling of pantographic fabrics in displacement-controlled shear tests: experimental results and model validation

    Science.gov (United States)

    Barchiesi, Emilio; Ganzosch, Gregor; Liebold, Christian; Placidi, Luca; Grygoruk, Roman; Müller, Wolfgang H.

    2018-01-01

Due to the latest advancements in 3D printing technology and rapid prototyping techniques, the production of materials with complex geometries has become more affordable than ever. Pantographic structures, because of their attractive features, both in dynamics and statics and both in elastic and inelastic deformation regimes, deserve to be thoroughly investigated with experimental and theoretical tools. Herein, experimental results relative to displacement-controlled large deformation shear loading tests of pantographic structures are reported. In particular, five differently sized samples are analyzed up to first rupture. Results show that the deformation behavior is strongly nonlinear, and the structures are capable of undergoing large elastic deformations without reaching complete failure. Finally, a cutting-edge model is validated by means of these experimental results.

  14. Presal36: a high resolution ocean current model for Brazilian pre-salt area: implementation and validation results

    Energy Technology Data Exchange (ETDEWEB)

    Schoellkopf, Jacques P. [Advanced Subsea do Brasil Ltda., Rio de Janeiro, RJ (Brazil)

    2012-07-01

The PRESAL 36 JIP is a project for the development of a powerful ocean current model of 1/36-of-a-degree resolution, nested in an existing global ocean model, Mercator PSY4 (1/12-of-a-degree resolution), with tide corrections, improved bathymetry accuracy and high-frequency atmospheric forcing (every 3 hours). The simulation outputs will be the 3-dimensional structure of the velocity fields (u, v, w) at 50 vertical levels over the water column, including geostrophic, Ekman and tidal currents, together with temperature, salinity and sea surface height at a sub-mesoscale spatial resolution. Simulations will run in hindcast, nowcast and forecast modes, with a temporal resolution of 3 hours. This ocean current model will make it possible to perform detailed statistical studies on various areas using conditions analysed in hindcast mode, and short-term operational condition prediction for various surface and subsea operations using real-time and forecast modes. The paper presents significant results of the project in terms of pre-salt zoomed model implementation and high-resolution model validation. It demonstrates the capability to properly describe ocean current phenomena beyond the mesoscale frontier. The project demonstrates the feasibility of obtaining accurate information for engineering studies and operational conditions, based on a 'zoom technique' starting from global ocean models. (author)

  15. Validation of model-based brain shift correction in neurosurgery via intraoperative magnetic resonance imaging: preliminary results

    Science.gov (United States)

    Luo, Ma; Frisken, Sarah F.; Weis, Jared A.; Clements, Logan W.; Unadkat, Prashin; Thompson, Reid C.; Golby, Alexandra J.; Miga, Michael I.

    2017-03-01

The quality of brain tumor resection surgery is dependent on the spatial agreement between the preoperative image and the intraoperative anatomy. However, brain shift compromises the aforementioned alignment. Currently, the clinical standard for monitoring brain shift is intraoperative magnetic resonance (iMR). While iMR provides a better understanding of brain shift, its cost and encumbrance are a consideration for medical centers. Hence, we are developing a model-based method that can be a complementary technology to address brain shift in standard resections, with resource-intensive cases as referrals for iMR facilities. Our strategy constructs a deformation `atlas' containing potential deformation solutions derived from a biomechanical model that accounts for variables such as cerebrospinal fluid drainage and mannitol effects. Volumetric deformation is estimated with an inverse approach that determines the optimal combinatory `atlas' solution best matching the measured surface deformation. Accordingly, the preoperative image is updated based on the computed deformation field. This study is the latest development in validating our methodology with iMR. Briefly, preoperative and intraoperative MR images of 2 patients were acquired. Homologous surface points were selected on preoperative and intraoperative scans as measurements of surface deformation and used to drive the inverse problem. To assess the model accuracy, subsurface shift of targets between preoperative and intraoperative states was measured and compared to the model prediction. Considering subsurface shift above 3 mm, the proposed strategy provides an average shift correction of 59% across the 2 cases. While further improvements in both the model and the ability to validate with iMR are desired, the results reported are encouraging.
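The "59% average shift correction" is a percentage-of-shift-recovered statistic over targets whose measured shift exceeds a threshold. The formulation below is hypothetical, since the abstract does not spell out the exact metric:

```python
import numpy as np

def percent_correction(measured_shift_mm, residual_error_mm, threshold_mm=3.0):
    """Average shift correction over targets with shift above a threshold.

    Hypothetical metric: per target, correction = (1 - residual/shift) * 100,
    averaged over targets whose measured shift exceeds threshold_mm
    (3 mm in the study). The paper's actual definition may differ.
    """
    shift = np.asarray(measured_shift_mm, float)
    resid = np.asarray(residual_error_mm, float)
    keep = shift > threshold_mm              # only clinically meaningful shifts
    corr = (1.0 - resid[keep] / shift[keep]) * 100.0
    return corr.mean()
```

A residual equal to the original shift would score 0% (no benefit over the uncorrected image), while perfect recovery of each target's position would score 100%.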

  16. Validation of satellite SAR offshore wind speed maps to in-situ data, microscala and mesoscale model results

    Energy Technology Data Exchange (ETDEWEB)

    Hasager, C.B.; Astrup, P.; Barthelmie, R.; Dellwik, E.; Hoffmann Joergensen, B.; Gylling Mortensen, N.; Nielsen, M.; Pryor, S.; Rathmann, O.

    2002-05-01

A validation study has been performed in order to investigate the precision and accuracy of the satellite-derived ERS-2 SAR wind products in offshore regions. The overall project goal is to develop a method for utilizing the satellite wind speed maps for offshore wind resources, e.g. in future planning of offshore wind farms. The report describes the validation analysis in detail for three sites in Denmark, Italy and Egypt. The site in Norway is analyzed by the Nansen Environmental and Remote Sensing Centre (NERSC). Wind speed maps and wind direction maps from Earth Observation data recorded by the ERS-2 SAR satellite have been obtained from the NERSC. For the Danish site the wind speed and wind direction maps have been compared to in-situ observations from a met-mast at Horns Rev in the North Sea, located 14 km offshore. The SAR wind speeds have been area-averaged by simple and advanced footprint modelling, i.e. the upwind conditions at the meteorological mast are explicitly averaged in the SAR wind speed maps before comparison. The comparison results are very promising, with a standard error of ±0.61 m/s, a bias of approximately 2 m/s and R² ≈ 0.88 between in-situ wind speed observations and SAR footprint-averaged values at the 10 m level. Wind speeds predicted by the local scale model LINCOM and the mesoscale model KAMM2 have been compared to the spatial variations in the SAR wind speed maps. The finding is a good correspondence between SAR observations and model results. Near the coast there is an 800 m wide band in which the SAR wind speed observations have a strong negative bias. The bathymetry of Horns Rev combined with tidal currents gives rise to bias in the SAR wind speed maps near areas of shallow, complex bottom topography in some cases. A total of 16 cases were analyzed for Horns Rev. For Maddalena in Italy five cases were analyzed. At the Italian site the SAR wind speed maps were compared to WAsP and KAMM2 model results. The WAsP model

  17. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  18. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  19. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

In philosophy of science, interest in computational models and simulations has increased greatly during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety...

  20. Condensation in the cold leg as results of ECC water injection during A LOCA: modeling and validation

    International Nuclear Information System (INIS)

    Liao, J.; Frepoli, C.; Ohkawa, K.

    2011-01-01

During postulated LOCA events in pressurized water reactors, cold water is injected into the cold legs by the emergency core cooling system (ECCS). As the ECC water comes into contact with steam, the amount of condensation in the cold legs resulting from mixing of the two phases is expected to have an effect on the thermal-hydraulic behavior of the system. During the boil-off and recovery periods of a small break LOCA, condensation in the cold leg is enhanced by the impingement of the ECC jet on the layer of liquid, when the flow in the cold leg is expected to be horizontally stratified. Consequently, the reactor coolant system (RCS) depressurization is accelerated, which in turn increases the ECC flow rate and promotes accumulator injection. For a large break LOCA, the condensation process in the cold leg during the refill period helps to reduce bypass flow at the top of the downcomer, promoting ECC penetration. Condensation in the cold leg during the reflood period is an important factor in determining the ECC bypass, the break flow rate, the downcomer and core water inventory, and the liquid subcooling in the downcomer, which in turn impacts the peak cladding temperature during reflood. A cold leg condensation model was considered for the new release of the WCOBRA/TRAC-TF2 safety analysis code and presented in the authors' previous work. The model was further improved to better capture relevant data, and the revised model was found to be in better agreement with the experimental data. The intent of this paper is to present the validation for the cold leg condensation model. The improved cold leg condensation model is assessed against various small break and large break LOCA separate effects tests, such as the COSI, ROSA and UPTF experiments. Those experiments cover a wide range of cold leg dimensions, system pressures, mass flow rates, and fluid properties. All the predicted condensation results match reasonably well with the experimental data. (author)

  1. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model

  2. Application of regional physically-based landslide early warning model: tuning of the input parameters and validation of the results

    Science.gov (United States)

    D'Ambrosio, Michele; Tofani, Veronica; Rossi, Guglielmo; Salvatici, Teresa; Tacconi Stefanelli, Carlo; Rosi, Ascanio; Benedetta Masi, Elena; Pazzi, Veronica; Vannocci, Pietro; Catani, Filippo; Casagli, Nicola

    2017-04-01

runs in real-time by assimilating weather data and uses Monte Carlo simulation techniques to manage the geotechnical and hydrological input parameters. In this context, an assessment of the factors controlling the geotechnical and hydrological features is crucial in order to understand the occurrence of slope instability mechanisms and to provide reliable forecasting of hydrogeological hazard occurrence, especially in relation to weather events. In particular, the model and the soil characterization were applied in back analysis, in order to assess the reliability of the model through validation of the results against landslide events that occurred during the period. The validation was performed on four past events of intense rainfall that affected the Valle d'Aosta region between 2008 and 2010, triggering fast shallow landslides. The simulations show a substantial improvement in the reliability of the results compared to the use of literature parameters. A statistical analysis of the HIRESSS outputs in terms of failure probability has been carried out in order to define reliable alert levels for regional landslide early warning systems.
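The Monte Carlo treatment of uncertain geotechnical parameters can be illustrated with a deliberately simplified infinite-slope factor-of-safety calculation. HIRESSS itself is far more complete; the formula here is the textbook infinite-slope model and all parameter ranges are illustrative:

```python
import numpy as np

def failure_probability(n=100_000, seed=0):
    """Monte Carlo estimate of slope-failure probability (illustrative).

    Uses the classical infinite-slope factor of safety with uncertain soil
    parameters drawn from uniform ranges; failure probability is the
    fraction of samples with FS < 1. All numbers below are made up for
    demonstration, not taken from the study.
    """
    rng = np.random.default_rng(seed)
    g_w = 9.81                                    # unit weight of water, kN/m^3
    z = 2.0                                       # failure-surface depth, m
    theta = np.radians(35.0)                      # slope angle
    m = 0.9                                       # water table / depth ratio (wet event)
    # Uncertain soil parameters, one draw per Monte Carlo sample:
    c = rng.uniform(2.0, 8.0, n)                  # effective cohesion, kPa
    phi = np.radians(rng.uniform(28.0, 36.0, n))  # friction angle
    gamma = rng.uniform(17.0, 20.0, n)            # soil unit weight, kN/m^3
    fs = (c + (gamma - m * g_w) * z * np.cos(theta) ** 2 * np.tan(phi)) / (
        gamma * z * np.sin(theta) * np.cos(theta))
    return float(np.mean(fs < 1.0))               # fraction of unstable samples
```

Mapping such per-cell failure probabilities to alert thresholds is exactly the kind of statistical analysis of model outputs that the abstract describes.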

  3. Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

model structure suggested by University of Lund, the WP4 leader. This particular model structure has the advantage that it fits better into the control design framework used by WP3-4 compared to the model structures previously developed in WP2. The different model structures are first summarised. Then issues dealing with optimal experimental design are considered. Finally the parameters are estimated in the chosen static and dynamic models and a validation is performed. Two of the static models, one of them the additive model, explain the data well. In the case of dynamic models, the suggested additive...

  4. Data Quality in Institutional Arthroplasty Registries: Description of a Model of Validation and Report of Preliminary Results.

    Science.gov (United States)

    Bautista, Maria P; Bonilla, Guillermo A; Mieth, Klaus W; Llinás, Adolfo M; Rodríguez, Fernanda; Cárdenas, Laura L

    2017-07-01

Arthroplasty registries are a relevant source of information for research and quality improvement in patient care, and their value depends on the quality of the recorded data. The purpose of this study is to describe a model of validation and present the findings of validation of an Institutional Arthroplasty Registry (IAR). Information from 209 primary arthroplasties and revision surgeries of the hip, knee, and shoulder recorded in the IAR between March and September 2015 was analyzed in the following domains: adherence, defined as the proportion of patients included in the registry; completeness, defined as the proportion of data effectively recorded; and accuracy, defined as the proportion of data consistent with medical records. A random sample of 53 patients (25.4%) was selected to assess the latter two domains. A direct comparison between the registry's database and medical records was performed. In total, 324 variables containing information on demographic data, surgical procedure, clinical outcomes, and key performance indicators were analyzed. Two hundred nine of 212 patients who underwent surgery during the study period were included in the registry, accounting for an adherence of 98.6%. Completeness was 91.7% and accuracy was 85.8%. Most errors were found in the preoperative range of motion and the timely administration of prophylactic antibiotics and thromboprophylaxis. This model provides useful information regarding the quality of the recorded data, since it identified deficient areas within the IAR. We recommend that institutional arthroplasty registries be constantly monitored for data quality before their information is used for research or quality improvement purposes. Copyright © 2017 Elsevier Inc. All rights reserved.
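The three quality domains defined in the study reduce to simple proportions. A sketch with an illustrative signature (the study's exact counting rules may differ):

```python
def registry_quality(included, eligible, fields_filled, fields_total,
                     fields_matching, fields_checked):
    """Registry data-quality domains as percentages (illustrative signature).

    adherence:    patients captured out of eligible patients
    completeness: data fields effectively recorded out of expected fields
    accuracy:     recorded fields consistent with the medical record,
                  out of fields checked against it
    """
    return {
        "adherence": 100.0 * included / eligible,
        "completeness": 100.0 * fields_filled / fields_total,
        "accuracy": 100.0 * fields_matching / fields_checked,
    }
```

With the reported counts, 209 patients captured out of 212 eligible gives the 98.6% adherence quoted above.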

  5. Physical control oriented model of large scale refrigerators to synthesize advanced control schemes. Design, validation, and first control results

    Science.gov (United States)

    Bonne, François; Alamir, Mazen; Bonnay, Patrick

    2014-01-01

    In this paper, a physical method to obtain control-oriented dynamical models of large-scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace the classical approaches designed from user experience, usually based on many independent PI controllers. This is particularly useful where cryoplants are subjected to large pulsed thermal loads, expected to arise in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes provide better perturbation immunity and rejection, offering safer utilization of cryoplants. The paper details how basic components used in the field of large-scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of the controllable subsystems of the refrigerator (namely the Joule-Thomson Cycle, the Brayton Cycle, the Liquid Nitrogen Precooling Unit and the Warm Compression Station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data, and optimal control of both the Joule-Thomson valve and the turbine valve is proposed to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.

  6. Testing the validity of the neoclassical migration model: Overall and age-group specific estimation results for German spatial planning regions

    OpenAIRE

    Mitze, Timo; Reinkowski, Janina

    2010-01-01

    This paper assesses the empirical validity of the neoclassical migration model in predicting German internal migration flows driven by regional labour market disparities. We estimate static and dynamic migration functions for 97 Spatial Planning Regions between 1996 and 2006, using key labour market signals including income and unemployment differences among a broader set of explanatory variables. Besides an aggregate specification, we also estimate the model for age-group-related subsamples. Our result...

  7. Examining construct and predictive validity of the Health-IT Usability Evaluation Scale: confirmatory factor analysis and structural equation modeling results

    Science.gov (United States)

    Yen, Po-Yin; Sousa, Karen H; Bakken, Suzanne

    2014-01-01

    Background In a previous study, we developed the Health Information Technology Usability Evaluation Scale (Health-ITUES), which is designed to support customization at the item level. Such customization matches the specific tasks/expectations of a health IT system while retaining comparability at the construct level. That study also provided evidence of the scale's factorial validity and internal consistency reliability through exploratory factor analysis. Objective In this study, we advanced the development of Health-ITUES by examining its construct validity and predictive validity. Methods The health IT system studied was a web-based communication system that supported nurse staffing and scheduling. Using Health-ITUES, we conducted a cross-sectional study to evaluate users’ perception of the web-based communication system after system implementation. We examined Health-ITUES's construct validity through first- and second-order confirmatory factor analysis (CFA), and its predictive validity via structural equation modeling (SEM). Results The sample comprised 541 staff nurses in two healthcare organizations. The CFA (n=165) showed that a general usability factor accounted for 78.1%, 93.4%, 51.0%, and 39.9% of the explained variance in ‘Quality of Work Life’, ‘Perceived Usefulness’, ‘Perceived Ease of Use’, and ‘User Control’, respectively. The SEM (n=541) supported the predictive validity of Health-ITUES, explaining 64% of the variance in intention for system use. Conclusions The results of the CFA and SEM provide additional evidence for the construct and predictive validity of Health-ITUES. The customizability of Health-ITUES has the potential to support comparisons at the construct level while allowing variation at the item level. We also illustrate the application of Health-ITUES across stages of system development. PMID:24567081

  8. Roll-up of validation results to a target application.

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard Guy

    2013-09-01

    Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at different conditions and configurations than those of the intended application, with each experiment testing only part of the physics relevant to the application. The purpose of the present work is to develop methodology to roll up validation results to an application, and to assess the impact the design of the validation hierarchy has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.

  9. A 3D active-passive numerical skeletal muscle model incorporating initial tissue strains. Validation with experimental results on rat tibialis anterior muscle.

    Science.gov (United States)

    Grasa, J; Ramírez, A; Osta, R; Muñoz, M J; Soteras, F; Calvo, B

    2011-10-01

    This paper presents a three-dimensional finite element model of skeletal muscle and its validation, incorporating initial tissue strains. A constitutive relation was determined by using a convex free strain energy function (SEF) where active and passive response contributions were obtained by fitting experimental data from the rat tibialis anterior (TA) muscle. The passive and active finite-strain response was modelled within the framework of continuum mechanics by a quasi-incompressible transversely isotropic material formulation. Magnetic resonance images (MRI) were obtained to reconstruct the external geometry of the TA. This geometry includes initial strains, which are also taken into account in the numerical model. The numerical results show excellent agreement with the experimental results when comparing reaction force-extension curves in both passive and active tests. The proposed constitutive model for the muscle is implemented as a subroutine in the commercial finite element software package ABAQUS.

  10. Validation of Erosion 3D in Lower Saxony - Comparison between modelled soil erosion events and results of a long term monitoring project

    Science.gov (United States)

    Bug, Jan; Mosimann, Thomas

    2013-04-01

    Since 2000, water erosion has been surveyed on 400 ha of arable land in three different regions of Lower Saxony (Mosimann et al. 2009). The results of this long-term survey are used for the validation of soil erosion models such as USLE and Erosion 3D. The validation of the physically based model Erosion 3D (Schmidt & Werner 2000) is possible because the survey analyses the effects (soil loss, sediment yield, on-site deposition) of single thunderstorm events and also maps major factors of soil erosion (soil, crop, tillage). A 12.5 m raster DEM was used to model the soil erosion events. Rainfall data were acquired from climate stations. Soil and land-use parameters were derived from the "Parameterkatalog Sachsen" (Michael et al. 1996). During thirteen years of monitoring, high-intensity storms fell less frequently than expected. High-intensity rainfalls with a return period of five or ten years usually occurred during periods of maximum plant cover. Winter events were ruled out because data on snow melt and rainfall were not measured. The validation is therefore restricted to 80 events. The validation consists of three parts. The first part compares the spatial distribution of the mapped soil erosion with the model results. The second part calculates the difference in the amount of redistributed soil. The third part analyses off-site effects such as sediment yield and pollution of water bodies. The validation shows that the overall result of Erosion 3D is quite good. Spatial hotspots of soil erosion and of off-site effects are predicted correctly in most cases. However, quantitative comparison is more problematic, because the mapping allows the quantification only of rill erosion, not of sheet erosion; as a rule, the predicted soil loss is therefore higher than the mapped one. The prediction of rill development is also problematic: while the model is capable of predicting rills in thalwegs, the modelling of erosion in tractor tracks and headlands is more complicated. In order to

  11. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    Full Text Available In this paper, I respond to the challenge raised against contemporary experimental neurobiology, according to which the field is in a state of crisis because the multiplicity of experimental protocols employed in different laboratories presumably precludes the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis of experimental neurobiology.

  12. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels and for temperature gradients with respect to space in the flow direction. These gradients are investigated by direct infrared imaging, showing that they are present in fuel cell operation even at low current, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited by the high pressure drops they induce in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones, with good qualitative and quantitative agreement. The combination of accuracy and low computational time allows for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)
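    The polarization curves compared in such a validation typically follow the standard empirical form: open-circuit voltage minus activation, ohmic, and concentration losses. A generic sketch with illustrative placeholder constants, not the authors' model:

```python
import math

def cell_voltage(i, E0=1.0, A=0.06, i0=1e-4, R=0.2, m=3e-5, n=8.0):
    """Empirical PEMFC polarization curve: activation, ohmic and
    concentration (mass-transport) losses vs current density i [A/cm^2].
    All constants are illustrative placeholders, not fitted values."""
    activation = A * math.log(i / i0)     # Tafel-type activation loss
    ohmic = R * i                         # resistive loss
    concentration = m * math.exp(n * i)   # mass-transport loss at high load
    return E0 - activation - ohmic - concentration

# Voltage falls monotonically with load, as in measured polarization curves.
v_low, v_high = cell_voltage(0.1), cell_voltage(0.8)
print(v_low > v_high)  # True
```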

  13. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  14. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta

    2014-12-01

    Full Text Available The primary objective of the study was to quantitatively test the DART model which, despite being one of the most popular representations of the co-creation concept, has so far been studied almost solely with qualitative methods. To this end, the researchers developed a multiple-measurement scale and employed it in interviewing managers. The statistical evidence for the adequacy of the model was obtained through CFA with the AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much conceptual overlap to form an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model in which co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  15. Comparison and validation of HEU and LEU modeling results to HEU experimental benchmark data for the Massachusetts Institute of Technology MITR reactor.

    Energy Technology Data Exchange (ETDEWEB)

    Newton, T. H.; Wilson, E. H; Bergeron, A.; Horelik, N.; Stevens, J. (Nuclear Engineering Division); (MIT Nuclear Reactor Lab.)

    2011-03-02

    The Massachusetts Institute of Technology Reactor (MITR-II) is a research reactor in Cambridge, Massachusetts designed primarily for experiments using neutron beam and in-core irradiation facilities. It delivers a neutron flux comparable to current LWR power reactors in a compact 6 MW core using Highly Enriched Uranium (HEU) fuel. In the framework of its non-proliferation policies, the international community presently aims to minimize the amount of nuclear material available that could be used for nuclear weapons. In this geopolitical context, most research and test reactors both domestic and international have started a program of conversion to the use of Low Enriched Uranium (LEU) fuel. A new type of LEU fuel based on an alloy of uranium and molybdenum (UMo) is expected to allow the conversion of U.S. domestic high performance reactors like the MITR-II reactor. Towards this goal, comparisons of MCNP5 Monte Carlo neutronic modeling results for HEU and LEU cores have been performed. Validation of the model has been based upon comparison to HEU experimental benchmark data for the MITR-II. The objective of this work was to demonstrate a model which could represent the experimental HEU data, and therefore could provide a basis to demonstrate LEU core performance. This report presents an overview of MITR-II model geometry and material definitions which have been verified, and updated as required during the course of validation to represent the specifications of the MITR-II reactor. Results of calculations are presented for comparisons to historical HEU start-up data from 1975-1976, and to other experimental benchmark data available for the MITR-II Reactor through 2009. This report also presents results of steady state neutronic analysis of an all-fresh LEU fueled core. 
Where possible, HEU and LEU calculations were performed for conditions equivalent to HEU experiments, which serves as a starting point for safety analyses for conversion of MITR-II from the use of HEU

  16. Results of the AVATAR project for the validation of 2D aerodynamic models with experimental data of the DU95W180 airfoil with unsteady flap

    DEFF Research Database (Denmark)

    Ferreira, C.; Gonzalez, A.; Baldacchino, D.

    2016-01-01

    is the application of flow control devices, such as flaps. In Task 3.2: Development of aerodynamic codes for modelling of flow devices on aerofoils and rotors of the Avatar project, aerodynamic codes are benchmarked and validated against the experimental data of a DU95W180 airfoil in steady and unsteady flow......, for different angles of attack and flap settings, including unsteady oscillatory trailing-edge-flap motion, carried out within the framework of WP3: Models for Flow Devices and Flow Control, Task 3.1: CFD and Experimental Database. The aerodynamic codes are: AdaptFoil2D, Foil2W, FLOWer, MaPFlow, OpenFOAM, Q3UIC......, ATEFlap. The codes include unsteady Eulerian CFD simulations with grid deformation, panel models and indicial engineering models. The validation cases correspond to 18 steady flow cases, and 42 unsteady flow cases, for varying angle of attack, flap deflection and reduced frequency, with free and forced...

  17. Validity Examination of EFQM’s Results by DEA Models = Examen de la validez de los resultados de EFQM mediante modelos DEA

    Directory of Open Access Journals (Sweden)

    Ben Mustafa, Adli

    2008-01-01

    Full Text Available The European Foundation for Quality Management (EFQM) model is one of the models that deal with the assessment of the functioning of an organization, using self-assessment to measure concepts, some of which are increasingly qualitative. Consequently, complete understanding and correct usage of this model in an organization depend on comprehensive knowledge of the model and of the different strategies of self-assessment. The process of self-assessment on the basis of this model requires experienced auditors, which reduces the likelihood of scores being wrongly assigned to the criteria and subcriteria. In this paper, some of the weaknesses of the EFQM model are first studied; then, using the input-output structure governing the model and Data Envelopment Analysis, a method is offered to recognize the lack of proportion between the Enablers and the Results of an organization, which may occur due to problems and obstacles hidden at the heart of the organization.

  18. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures of the simulation model, and it can furthermore be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) and with MCSA (Monte Carlo sensitivity analysis); the search for the optimal domains of the input parameters, for which a procedure based on Monte Carlo methods and cluster techniques has been developed; and residual analysis, performed in the time domain and in the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings is presented, studying the behaviour of building components in a Test Cell of the LECE at CIEMAT, Spain. (Author) 17 refs

  19. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  20. Evidential Model Validation under Epistemic Uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Deng

    2018-01-01

    Full Text Available This paper proposes evidence-theory-based methods both to quantify epistemic uncertainty and to validate computational models. Three types of epistemic uncertainty concerning input model data are considered: sparse points, intervals, and probability distributions with uncertain parameters. Through the proposed methods, the given data are described as corresponding probability distributions for uncertainty propagation in the computational model, and thus for model validation. The proposed evidential model validation method is inspired by the ideas of Bayesian hypothesis testing and the Bayes factor, which compares the model predictions with the observed experimental data so as to assess the predictive capability of the model and support the decision on model acceptance. Following the idea of the Bayes factor, the frame of discernment of Dempster-Shafer evidence theory is constituted and the basic probability assignment (BPA) is determined. Because the proposed validation method is evidence based, the robustness of the result can be guaranteed, and the most evidence-supported hypothesis about the model testing will be favored by the BPA. The validity of the proposed methods is illustrated through a numerical example.
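    The Bayes-factor step that the method builds on can be sketched as a likelihood ratio between a "model is valid" hypothesis (data centred on the prediction) and a diffuse alternative, mapped to a basic probability assignment. The distributions and numbers below are illustrative assumptions, not the paper's formulation:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_factor_bpa(observation, prediction, sigma, null_spread):
    """Likelihood-ratio Bayes factor comparing 'model is valid' (data centred
    on the prediction, spread sigma) against a diffuse alternative, mapped to
    a basic probability assignment m(valid) = B / (B + 1). Illustrative only."""
    likelihood_valid = gaussian_pdf(observation, prediction, sigma)
    likelihood_alt = gaussian_pdf(observation, prediction, null_spread)
    B = likelihood_valid / likelihood_alt
    return B, B / (B + 1.0)

B, m_valid = bayes_factor_bpa(observation=10.2, prediction=10.0,
                              sigma=0.5, null_spread=5.0)
print(B > 1)  # True: an observation close to the prediction favours the model
```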

  1. Statistical validation of stochastic models

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, N.F. [Los Alamos National Lab., NM (United States). Engineering Science and Analysis Div.; Barney, P.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States). Experimental Structural Dynamics Dept.; Ferregut, C.; Perez, L. [Univ. of Texas, El Paso, TX (United States). Dept. of Civil Engineering

    1996-12-31

    It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.
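    The bootstrap step described here resamples the observed data with replacement to estimate the statistical characteristics of the experimental system without assuming Gaussianity. A minimal percentile-bootstrap sketch with invented data:

```python
import random

def bootstrap_ci(data, statistic, n_resamples=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for an arbitrary statistic,
    suitable for non-Gaussian data as described in the abstract."""
    rng = random.Random(seed)
    stats = sorted(
        statistic([rng.choice(data) for _ in data])  # one resample with replacement
        for _ in range(n_resamples)
    )
    lo = stats[int((alpha / 2) * n_resamples)]
    hi = stats[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

sample = [2.1, 2.4, 1.9, 2.8, 2.2, 2.6, 2.0, 2.5]  # invented measurements
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(sample, mean)
print(lo <= mean(sample) <= hi)  # True
```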

  2. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  3. Atmospheric Deposition Modeling Results

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset provides data on model results for dry and total deposition of sulfur, nitrogen and base cation species. Components include deposition velocities, dry...

  4. Validation of the IMS CORE Diabetes Model.

    Science.gov (United States)

    McEwan, Phil; Foos, Volker; Palmer, James L; Lamotte, Mark; Lloyd, Adam; Grant, David

    2014-09-01

    The IMS CORE Diabetes Model (CDM) is a widely published and validated simulation model applied in both type 1 diabetes mellitus (T1DM) and type 2 diabetes mellitus (T2DM) analyses. Validation to external studies is an important part of demonstrating model credibility. Because the CDM is widely used to estimate long-term clinical outcomes in diabetes patients, the objective of this analysis was to validate the CDM to contemporary outcomes studies, including those with long-term follow-up periods. A total of 112 validation simulations were performed, stratified by study follow-up duration. For long-term results (≥15-year follow-up), simulation cohorts representing baseline Diabetes Control and Complications Trial (DCCT) and United Kingdom Prospective Diabetes Study (UKPDS) cohorts were generated and intensive and conventional treatment arms were defined in the CDM. Predicted versus observed macrovascular and microvascular complications and all-cause mortality were assessed using the coefficient of determination (R(2)) goodness-of-fit measure. Across all validation studies, the CDM simulations produced an R(2) statistic of 0.90. For validation studies with a follow-up duration of less than 15 years, R(2) values of 0.90 and 0.88 were achieved for T1DM and T2DM respectively. In T1DM, validating against 30-year outcomes data (DCCT) resulted in an R(2) of 0.72. In T2DM, validating against 20-year outcomes data (UKPDS) resulted in an R(2) of 0.92. This analysis supports the CDM as a credible tool for predicting the absolute number of clinical events in DCCT- and UKPDS-like populations. With increasing incidence of diabetes worldwide, the CDM is particularly important for health care decision makers, for whom the robust evaluation of health care policies is essential. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
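    The R² statistic used throughout this validation is the coefficient of determination of predicted against observed event counts. A minimal sketch with illustrative numbers, not CDM output:

```python
def r_squared(observed, predicted):
    """Coefficient of determination of predictions against observations."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Illustrative observed vs. predicted event counts across four endpoints.
observed = [10.0, 25.0, 40.0, 55.0]
predicted = [12.0, 24.0, 41.0, 52.0]
print(round(r_squared(observed, predicted), 2))  # 0.99
```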

  5. Validation of models with proportional bias

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta

    2017-01-01

    Full Text Available Objective. This paper presents extensions to Freese’s statistical method for model validation when proportional bias (PB) is present in the predictions. The method is illustrated with data from a model that simulates grassland growth. Materials and methods. The extensions to validate models with PB were: the maximum anticipated error for the original proposal, hypothesis testing and the maximum anticipated error for the alternative proposal, and the confidence interval for a quantile of the error distribution. Results. The tested model had PB; once it was removed, and with a confidence level of 95%, the magnitude of the error does not surpass 1225.564 kg ha-1. Therefore, the validated model can be used to predict grassland growth, although it would require an adjustment of its structure given the presence of PB. Conclusions. The extensions presented for validating models with PB are applied without modification of the model structure. Once PB is corrected, the confidence interval for the 1-α quantile of the error distribution provides an upper bound for the magnitude of the prediction error, and it can be used to evaluate the evolution of the model for system prediction.
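    Correcting a proportional bias before bounding the error, as the extension describes, amounts to rescaling the predictions by a slope fitted through the origin. A hedged sketch with made-up values; the least-squares-through-origin fit is an assumption about the form of the correction, not the authors' exact procedure:

```python
def remove_proportional_bias(observed, predicted):
    """Fit y_obs ≈ b * y_pred through the origin (least squares) and
    return bias-corrected predictions together with the slope b."""
    b = (sum(o * p for o, p in zip(observed, predicted))
         / sum(p * p for p in predicted))
    return [b * p for p in predicted], b

# Made-up values: predictions systematically low by a factor of ~2.
obs = [100.0, 210.0, 290.0, 400.0]
pred = [50.0, 105.0, 145.0, 200.0]
corrected, slope = remove_proportional_bias(obs, pred)
print(round(slope, 2))  # 2.0
```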

  6. A discussion on validation of hydrogeological models

    International Nuclear Information System (INIS)

    Carrera, J.; Mousavi, S.F.; Usunoff, E.J.; Sanchez-Vila, X.; Galarza, G.

    1993-01-01

    Groundwater flow and solute transport are often driven by heterogeneities that elude easy identification. It is also difficult to select and describe the physico-chemical processes controlling solute behaviour. As a result, definition of a conceptual model involves numerous assumptions both on the selection of processes and on the representation of their spatial variability. Validating a numerical model by comparing its predictions with actual measurements may not be sufficient for evaluating whether or not it provides a good representation of 'reality'. Predictions will be close to measurements, regardless of model validity, if these are taken from experiments that stress well-calibrated model modes. On the other hand, predictions will be far from measurements when model parameters are very uncertain, even if the model is indeed a very good representation of the real system. Hence, we contend that 'classical' validation of hydrogeological models is not possible. Rather, models should be viewed as theories about the real system. We propose to follow a rigorous modeling approach in which different sources of uncertainty are explicitly recognized. The application of one such approach is illustrated by modeling a laboratory uranium tracer test performed on fresh granite, which was used as Test Case 1b in INTRAVAL. (author)

  7. Global precipitation measurements for validating climate models

    Science.gov (United States)

    Tapiador, F. J.; Navarro, A.; Levizzani, V.; García-Ortega, E.; Huffman, G. J.; Kidd, C.; Kucera, P. A.; Kummerow, C. D.; Masunaga, H.; Petersen, W. A.; Roca, R.; Sánchez, J.-L.; Tao, W.-K.; Turk, F. J.

    2017-11-01

    The advent of global precipitation data sets with increasing temporal span has made it possible to use them for validating climate models. In order to fulfill the requirement of global coverage, existing products integrate satellite-derived retrievals from many sensors with direct ground observations (gauges, disdrometers, radars), which are used as reference for the satellites. While the resulting product can be deemed as the best-available source of quality validation data, awareness of the limitations of such data sets is important to avoid extracting wrong or unsubstantiated conclusions when assessing climate model abilities. This paper provides guidance on the use of precipitation data sets for climate research, including model validation and verification for improving physical parameterizations. The strengths and limitations of the data sets for climate modeling applications are presented, and a protocol for quality assurance of both observational databases and models is discussed. The paper helps elaborating the recent IPCC AR5 acknowledgment of large observational uncertainties in precipitation observations for climate model validation.

  8. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  9. Paleoclimate validation of a numerical climate model

    International Nuclear Information System (INIS)

    Schelling, F.J.; Church, H.W.; Zak, B.D.; Thompson, S.L.

    1994-01-01

    An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the RegCM2 regional climate model, is part of a larger study for DOE's Yucca Mountain Site Characterization Project that is evaluating the impacts of long-term future climate change on performance of the potential high-level nuclear waste repository at Yucca Mountain. The planned analysis and anticipated results are presented.

  10. Cognitive Enhancement in Infants Associated with Increased Maternal Fruit Intake During Pregnancy: Results from a Birth Cohort Study with Validation in an Animal Model

    Directory of Open Access Journals (Sweden)

    Francois V. Bolduc

    2016-06-01

    In-utero nutrition is an under-studied aspect of cognitive development. Fruit has been an important dietary constituent for early hominins and humans. Among 808 eligible CHILD-Edmonton sub-cohort subjects, 688 (85%) had 1-year cognitive outcome data. We found that each daily maternal serving of fruit (sum of fruit plus 100% fruit juice) consumed during pregnancy was associated with a 2.38-point increase in 1-year cognitive development (95% CI 0.39, 4.37; p < 0.05). Consistent with this, we found 30% higher learning Performance Index (PI) scores in Drosophila offspring of parents given 30% fruit juice supplementation prenatally (PI: 85.7; SE 1.8; p < 0.05) compared to the offspring of standard-diet parents (PI: 65.0; SE 3.4). Using the Drosophila model, we also show that the cyclic adenosine monophosphate (cAMP) pathway may be a major regulator of this effect, as prenatal fruit-associated cognitive enhancement was blocked in Drosophila rutabaga mutants with reduced Ca2+-Calmodulin-dependent adenylyl cyclase. Moreover, gestation is a critical time for this effect, as postnatal fruit intake did not enhance cognitive performance in either humans or Drosophila. Our study associates increased fruit consumption during pregnancy with significant increases in infant cognitive performance. Validation in Drosophila helps control for potential participant bias or unmeasured confounders.

  11. Geochemistry Model Validation Report: External Accumulation Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Zarrabi

    2001-09-27

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped; DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop it, (2) provide the rationale for selecting the parameters by comparison with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached WP. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, lowering the pH sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  12. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped; DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop it, (2) provide the rationale for selecting the parameters by comparison with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached WP. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, lowering the pH sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  13. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.
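One common summary statistic for this kind of model-to-measurement comparison (illustrative only, not the HEDR procedure itself) is the geometric mean of the predicted-to-observed ratios, which quantifies overall multiplicative bias. The paired values below are invented:

```python
import math

# Hypothetical paired values: model estimates vs. independent measurements.
predicted = [1.2, 0.8, 2.5, 0.9, 1.6]
observed  = [1.0, 1.0, 2.0, 1.1, 1.5]

ratios = [p / o for p, o in zip(predicted, observed)]

# Geometric mean of P/O ratios: 1.0 indicates no overall bias,
# values above/below 1 indicate systematic over-/under-prediction.
log_mean = sum(math.log(r) for r in ratios) / len(ratios)
geometric_bias = math.exp(log_mean)
print(f"geometric mean P/O ratio: {geometric_bias:.2f}")
```

The geometric mean is used rather than the arithmetic mean because dose estimates typically span orders of magnitude and their errors are closer to multiplicative than additive.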

  14. On the validation of an Eulerian model with the Fukushima Nuclear Power Plant (FNPP) accident. Global and local results for Europe and Japan

    Energy Technology Data Exchange (ETDEWEB)

    Evangeliou, Nikolaos; Balkanski, Yves; Cozic, Anne [CEA-CNRS-UVSQ UMR 8212, IPSL/LSCE - Laboratoire des Sciences du Climat et de l' Environnement, L' Orme des Merisiers, 91191 Gif-sur-Yvette Cedex (France); Florou, Heleni; Eleftheriadis, Konstantinos; Kritidis, Panayotis [NCSR ' Demokritos' , Institute of Nuclear and Radiological Sciences and Technology, Energy and Safety (INRASTES), Environmental Radioactivity Laboratory, 15310 Athens (Greece)

    2014-07-01

    A large debate about the exact emissions from the Fukushima NPP accident is still ongoing more than 3 years after the original disaster. Terada et al. (2012) reported the total release of {sup 137}Cs to be 13 PBq (x10{sup 15} Bq), based on inverse modelling using Japanese data only, whereas the IRSN reported {sup 137}Cs releases of 20.6 PBq (IRSN, 2011). In the present study, we used the emission inventories for {sup 137}Cs and {sup 133}Xe reported by Stohl et al. (2012), estimated by inverse modelling using the CTBTO (Comprehensive Nuclear Test Ban Treaty Organisation) and Japanese networks (36.7 PBq of {sup 137}Cs and 15.3 EBq of {sup 133}Xe). For the simulations of the accident, three different versions of the LMDZORINCA model were used: a regular one with a grid resolution of 2.50 deg. x 1.27 deg. for the global comparison with the CTBTO network (19 and 39 vertical layers), and a zoom version over Europe and Asia (0.45 deg. x 0.51 deg. for 19 levels) obtained by 'stretching' the grid using the same number of grid points, to assess what happened in Greece and Japan. Cesium isotopes were treated as sub-micronic aerosols and {sup 133}Xe as a passive tracer within the model; several other radionuclides were estimated from reported isotopic ratios. Our results for the global assessment fit the observations well, differing by about 0.04% from the measurements for {sup 137}Cs and around 40% for xenon. The most significant deviations were observed for the northernmost stations, due to both scavenging processes and transport over the Arctic. Scattered measurements of several radionuclides from Japan were adopted from the literature (Shininaga et al., 2014). The comparison showed good model performance, although some isotopes were miscalculated, suggesting that the reported isotopic ratios might be biased. Finally, in Greece, a few measurements of {sup 131}I, {sup 134}Cs and {sup 137}Cs were adopted from Potiriadis et al

  15. Calibration and validation of rockfall models

    Science.gov (United States)

    Frattini, Paolo; Valagussa, Andrea; Zenoni, Stefania; Crosta, Giovanni B.

    2013-04-01

    Calibrating and validating landslide models is extremely difficult due to particular characteristics of landslides: limited recurrence in time, relatively low frequency of events, short durability of post-event traces, and poor availability of continuous monitoring data, especially for small landslides and rockfalls. For this reason, most of the rockfall models presented in the literature completely lack calibration and validation of the results. In this contribution, we explore different strategies for rockfall model calibration and validation, starting from both a historical event and a full-scale field test. The event occurred in 2012 in Courmayeur (Western Alps, Italy) and caused serious damage to quarrying facilities. It was studied soon after its occurrence through a field campaign aimed at mapping the blocks arrested along the slope, the shape and location of the detachment area, and the traces of scars associated with impacts of blocks on the slope. The full-scale field test was performed by Geovert Ltd in the Christchurch area (New Zealand) after the 2011 earthquake. During the test, a number of large blocks were mobilized from the upper part of the slope and filmed with high-velocity cameras from different viewpoints. The movies of each released block were analysed to identify the block shape, the propagation path, the location of impacts, the height of the trajectory and the velocity of the block along the path. Both calibration and validation of rockfall models should be based on optimizing the agreement between the actual trajectories or locations of arrested blocks and the simulated ones. A measure that describes this agreement is therefore needed. For calibration purposes, this measure should be simple enough to allow trial-and-error repetitions of the model for parameter optimization.
In this contribution we explore different calibration/validation measures: (1) the percentage of simulated blocks arresting within a buffer of the
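A buffer-based agreement measure of the kind listed first above, the fraction of simulated arrest positions falling within a given distance of at least one mapped block, might be computed along these lines (coordinates and buffer radius are invented):

```python
import math

def fraction_within_buffer(simulated, observed, radius):
    """Fraction of simulated arrest points lying within `radius`
    of at least one observed (field-mapped) block position."""
    hits = 0
    for sx, sy in simulated:
        if any(math.hypot(sx - ox, sy - oy) <= radius for ox, oy in observed):
            hits += 1
    return hits / len(simulated)

# Invented 2-D arrest positions (e.g. metres in a local frame).
observed = [(10.0, 5.0), (12.0, 7.5), (20.0, 3.0)]
simulated = [(10.5, 5.2), (18.0, 10.0), (19.5, 3.5), (11.8, 7.0)]

score = fraction_within_buffer(simulated, observed, radius=1.0)
print(f"agreement: {score:.2f}")
```

A scalar score like this is what makes trial-and-error parameter optimization practical: each candidate parameter set yields one number to maximize.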

  16. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs., 30 refs

  17. Turbulence Modeling Validation, Testing, and Development

    Science.gov (United States)

    Bardina, J. E.; Huang, P. G.; Coakley, T. J.

    1997-01-01

    The primary objective of this work is to provide accurate numerical solutions for selected flow fields and to compare and evaluate the performance of selected turbulence models against experimental results. Four popular turbulence models have been tested and validated against experimental data of ten turbulent flows. The models are: (1) the two-equation k-omega model of Wilcox, (2) the two-equation k-epsilon model of Launder and Sharma, (3) the two-equation k-omega/k-epsilon SST model of Menter, and (4) the one-equation model of Spalart and Allmaras. The flows investigated are five free shear flows consisting of a mixing layer, a round jet, a plane jet, a plane wake, and a compressible mixing layer; and five boundary layer flows consisting of an incompressible flat plate, a Mach 5 adiabatic flat plate, a separated boundary layer, an axisymmetric shock-wave/boundary layer interaction, and an RAE 2822 transonic airfoil. The experimental data for these flows are well established and have been extensively used in model development. The results are shown in the following four sections: Part A describes the equations of motion and boundary conditions; Part B describes the model equations, constants, parameters, boundary conditions, and numerical implementation; and Parts C and D describe the experimental data and the performance of the models in the free-shear flows and the boundary layer flows, respectively.

  18. Validation of systems biology models

    NARCIS (Netherlands)

    Hasdemir, D.

    2015-01-01

    The paradigm shift from qualitative to quantitative analysis of biological systems brought a substantial number of modeling approaches to the stage of molecular biology research. These include but certainly are not limited to nonlinear kinetic models, static network models and models obtained by the

  19. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  20. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling gives a syntactic structure to source and target models; however, semantic requirements also have to be imposed on them. A given transformation is sound when source and target models fulfill both the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and postconditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  1. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    Department of Mechanical Engineering, Imperial College of Science, ... If we are unable to obtain a satisfactory degree of correlation between the initial theoretical model and the test data, then it is extremely unlikely that any form of model updating (correcting the model to match the test data) will succeed. Thus, a successful ...

  2. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
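A permutation test of the kind recommended in this record can be sketched without any modeling machinery: shuffle the outcomes, recompute the performance metric (AUC below), and compare the observed value against the resulting null distribution. The scores and outcomes are synthetic, and the sketch omits the LASSO fitting and double cross-validation steps:

```python
import random

def auc(scores, labels):
    # Probability that a random positive case outranks a random negative case.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

random.seed(1)
# Hypothetical NTCP-style scores and binary complication outcomes.
labels = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.75, 0.3, 0.4, 0.2, 0.35, 0.6, 0.5]

observed_auc = auc(scores, labels)

# Null distribution: AUC recomputed after shuffling the outcomes.
n_perm = 2000
null_aucs = []
for _ in range(n_perm):
    shuffled = labels[:]
    random.shuffle(shuffled)
    null_aucs.append(auc(scores, shuffled))

p_value = sum(a >= observed_auc for a in null_aucs) / n_perm
print(f"AUC = {observed_auc:.2f}, permutation p = {p_value:.4f}")
```

A small permutation p-value indicates that the model's apparent performance is unlikely to arise from chance alone, which is exactly the check the authors recommend before clinical use.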

  3. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  4. Validating EHR clinical models using ontology patterns.

    Science.gov (United States)

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation.
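A minimal SHACL shape, in the spirit of (but not copied from) the CIMI patterns described, might constrain a hypothetical clinical observation model as follows; all names under the `ex:` prefix are invented:

```turtle
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:  <http://example.org/> .

# Hypothetical pattern: every body-weight observation must carry
# exactly one decimal value and at least one terminology binding.
ex:BodyWeightShape
    a sh:NodeShape ;
    sh:targetClass ex:BodyWeightObservation ;
    sh:property [
        sh:path ex:hasValue ;
        sh:datatype xsd:decimal ;
        sh:minCount 1 ;
        sh:maxCount 1 ;
    ] ;
    sh:property [
        sh:path ex:hasCode ;
        sh:minCount 1 ;
    ] .
```

Under SHACL's closed-world validation, an observation instance missing `ex:hasCode` is reported as a violation, whereas an OWL reasoner would merely infer that the code exists but is unknown; this difference is why the authors argue SHACL suits clinical model validation.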

  5. Results from the First Validation Phase of CAP code

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The second stage of Safety Analysis Code Development for Nuclear Power Plants was launched in April 2010 and is scheduled to run through 2012; its scope of work covers code validation through licensing preparation. As a part of this project, CAP (Containment Analysis Package) will follow the same procedures. CAP's validation works are organized hierarchically into four validation steps using: 1) fundamental phenomena; 2) principal phenomena (mixing and transport) and components in containment; 3) demonstration tests at small, middle and large facilities and International Standard Problems; 4) comparison with other containment codes such as GOTHIC or CONTEMPT. In addition, collecting experimental data related to containment phenomena and constructing the corresponding database are among the major tasks of the second stage of this project. The validation of fundamental phenomena is expected to reveal both the current capability and the needed future improvements of the CAP code. For this purpose, simple but significant problems with exact analytical solutions were selected and calculated. In this paper, some results of the validation problems for the selected fundamental phenomena are summarized and briefly discussed
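Validation against an exact analytical solution, as in the first step above, can be as simple as comparing a time-stepped numerical solution of dT/dt = -kT (e.g. relaxation of some quantity toward equilibrium) with its closed-form answer. The scheme and parameter values below are illustrative, not taken from CAP:

```python
import math

# Hypothetical fundamental-phenomenon check: forward-Euler integration
# of dT/dt = -k*T compared against the exact solution T0*exp(-k*t).
k, T0, dt, steps = 0.5, 100.0, 0.001, 2000

T = T0
for _ in range(steps):
    T += dt * (-k * T)          # forward-Euler update

t_end = dt * steps              # = 2.0
T_exact = T0 * math.exp(-k * t_end)
rel_error = abs(T - T_exact) / T_exact
print(f"numerical: {T:.4f}, exact: {T_exact:.4f}, rel. error: {rel_error:.2e}")
```

Repeating the run with a halved time step and confirming that the error roughly halves is the usual way such a check also verifies the scheme's expected first-order convergence.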

  6. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  7. Atmospheric corrosion: statistical validation of models

    International Nuclear Information System (INIS)

    Diaz, V.; Martinez-Luaces, V.; Guineo-Cobs, G.

    2003-01-01

    In this paper we discuss two different methods for the validation of regression models, applied to corrosion data: one based on the correlation coefficient and the other on the statistical lack-of-fit test. Both methods are used here to analyse the fit of the bilogarithmic model for predicting corrosion of very low carbon steel substrates in rural and urban-industrial atmospheres in Uruguay. Results for parameters A and n of the bilogarithmic model are reported. For this purpose, all repeated values were used instead of the usual average values. Modelling is carried out using experimental data corresponding to steel substrates under the same initial meteorological conditions (in fact, they were placed in the rack at the same time). Results of the correlation coefficient are compared with the lack-of-fit test at two different significance levels (α=0.01 and α=0.05). Unexpected differences between them are explained and, finally, it is possible to conclude, at least for the studied atmospheres, that the bilogarithmic model does not fit the experimental data properly. (Author) 18 refs
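Because the bilogarithmic model C = A·t^n is linear in log-log coordinates, A and n can be estimated by ordinary least squares on the transformed data, and the correlation coefficient follows directly. The replicate data below are synthetic, not the Uruguayan measurements:

```python
import math

# Synthetic replicates: (exposure time in years, corrosion penetration).
data = [(1, 10.2), (1, 9.8), (2, 13.1), (2, 12.7), (4, 16.5), (4, 17.1)]

# Transform C = A * t^n into log C = log A + n * log t.
xs = [math.log(t) for t, _ in data]
ys = [math.log(c) for _, c in data]
n_obs = len(data)

mean_x = sum(xs) / n_obs
mean_y = sum(ys) / n_obs
sxx = sum((x - mean_x) ** 2 for x in xs)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
syy = sum((y - mean_y) ** 2 for y in ys)

n_exp = sxy / sxx                 # slope = exponent n
A = math.exp(mean_y - n_exp * mean_x)
r = sxy / math.sqrt(sxx * syy)    # correlation coefficient

print(f"C = {A:.2f} * t^{n_exp:.3f}, r = {r:.4f}")
```

A full lack-of-fit test, the paper's second method, would additionally partition the residual sum of squares into pure error (from the replicates at each exposure time) and lack-of-fit components and compare their ratio against an F distribution; only the correlation-based check is sketched here.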

  8. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and in accordance with the technical work plan (Ref. 17).

  9. Geochemistry Model Validation Report: Material Degradation and Release Model

    International Nuclear Information System (INIS)

    Stockman, H.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and in accordance with the technical work plan (Ref. 17)

  10. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    of refining the theoretical model which will be used for the design optimisation process. There are many different names given to the tasks involved in this refinement. .... slightly from the ideal line but in a systematic rather than a random fashion as this situation suggests that there is a specific characteristic responsible for the ...

  11. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport are made to cover the associated uncertainties. Thus, flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to model flow and radionuclide transport in the near field and far field of a deep repository, including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-founded simplifications. For this reason, the far-field code FARF31 is kept relatively simple and calculates transport using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among its advantages are that it is a fast, simple and robust code, which enables handling many realisations with wide spreads in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31, it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual-porosity continuum approach, where it is assumed that the rock can be divided into two distinct domains with different types of porosity
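As an example of the kind of analytical benchmark commonly used to check such advection-dispersion transport codes (not FARF31 itself), the classical Ogata-Banks solution for one-dimensional transport with a constant inlet concentration can be evaluated directly; the parameter values are invented:

```python
import math

def ogata_banks(x, t, v, D, c0=1.0):
    """Classical 1D advection-dispersion solution for a constant
    inlet concentration c0 (semi-infinite domain, no decay)."""
    arg1 = (x - v * t) / (2.0 * math.sqrt(D * t))
    arg2 = (x + v * t) / (2.0 * math.sqrt(D * t))
    return 0.5 * c0 * (math.erfc(arg1) + math.exp(v * x / D) * math.erfc(arg2))

# Invented parameters: velocity 1 m/yr, dispersion coefficient 0.1 m^2/yr.
v, D = 1.0, 0.1
t = 50.0

# At the advective front x = v*t the relative concentration is close to 0.5.
result = ogata_banks(v * t, t, v, D)
print(result)
```

Running a transport code on the same configuration and checking the breakthrough curve against this closed form is a standard building block of the confidence-building exercise the record describes.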

  12. Assessment model validity document FARF31

    Energy Technology Data Exchange (ETDEWEB)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria [Kemakta Konsult AB, Stockholm (Sweden)

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: does the model predict transport in fractured rock adequately enough to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport are made to cover the associated uncertainties. Thus, flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to model flow and radionuclide transport in the near field and far field of a deep repository, including radioactive chain decay. To avoid excessively long computational times, well-founded simplifications are needed. For this reason, the far-field code FARF31 is kept relatively simple, and calculates transport using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among its advantages are that it is a fast, simple and robust code, which enables handling of many realisations with a wide spread in parameters in combination with chain decay of radionuclides. As a component in the model chain PROPER, it allows statistical distributions to be assigned easily to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31, the groundwater flow calculations can be performed separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual-porosity continuum approach, in which the rock is assumed to be divided into two distinct domains with different types of porosity.
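
To make the chain-decay bookkeeping concrete, here is a minimal sketch of a two-member decay chain propagated over a single advective travel time using the Bateman solution — the kind of averaged-entity calculation a stream-tube code performs. All nuclide data and names here are hypothetical; FARF31's actual dual-porosity advection-dispersion formulation is considerably richer.

```python
import math

def bateman_two_member(n1_0, lam1, lam2, t):
    """Amounts in a two-member decay chain after travel time t.

    Returns (n1, n2): parent and daughter, assuming no daughter is present
    initially and lam1 != lam2 (a hypothetical simplification of the
    chain-decay handling a far-field transport code must perform).
    """
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Hypothetical example: parent half-life 100 y, daughter 10 y, travel time 50 y
lam1 = math.log(2) / 100.0
lam2 = math.log(2) / 10.0
n1, n2 = bateman_two_member(1.0, lam1, lam2, 50.0)
```

In a realisation-based performance assessment, a function like this would be evaluated per stream tube with sampled travel times and sorption-retarded decay constants.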

  13. The MISTRA experiment for field containment code validation: first results

    International Nuclear Information System (INIS)

    Caron-Charles, M.; Blumenfeld, L.

    2001-01-01

    The MISTRA facility is a large-scale experiment designed for the validation of multi-dimensional thermal-hydraulics codes. A short description of the facility, the set-up of the instrumentation and the test program are presented. Then, the first experimental results, studying helium injection in the containment, and their calculations are detailed. (author)

  14. BIOMOVS: an international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1988-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (author)

  15. BIOMOVS: An international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1987-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (orig.)

  16. On validation of multibody musculoskeletal models

    DEFF Research Database (Denmark)

    Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper

    2012-01-01

    This paper reviews the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need...

  17. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
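
The headline metric in such a comparison, annualized prediction error, is straightforward to compute. The sketch below uses hypothetical monthly energy totals, not the NREL field data, and the function name is illustrative rather than part of SAM's API.

```python
def annualized_error_pct(modeled_kwh, measured_kwh):
    """Annualized prediction error as a percentage of measured production.

    Positive values mean the model over-predicts; negative values mean it
    under-predicts. A minimal sketch of the headline PV-validation metric.
    """
    return 100.0 * (sum(modeled_kwh) - sum(measured_kwh)) / sum(measured_kwh)

# Hypothetical monthly energy totals (kWh) for a small fixed-tilt system
modeled  = [120, 135, 160, 180, 200, 210, 215, 205, 175, 150, 125, 115]
measured = [118, 140, 158, 185, 198, 207, 220, 200, 172, 149, 128, 117]
err = annualized_error_pct(modeled, measured)
```

Monthly errors partly cancel in the annual sum, which is why studies like this one also examine seasonal biases separately.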

  18. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we provide a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model by using an example taken from a management study.
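
A minimal illustration of the holdout idea behind such validation procedures, with synthetic one-feature data and a toy gradient-descent fit; this is not the paper's methodology, and all names and parameters are illustrative.

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=500):
    """Fit a one-feature logistic model p = sigmoid(w*x + b) by SGD (toy sketch)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            w += lr * (y - p) * x   # gradient ascent on the log-likelihood
            b += lr * (y - p)
    return w, b

def holdout_accuracy(xs, ys, w, b):
    """Performance measure evaluated on data not used for fitting."""
    hits = sum((1.0 / (1.0 + math.exp(-(w * x + b))) >= 0.5) == (y == 1)
               for x, y in zip(xs, ys))
    return hits / len(xs)

random.seed(0)
# Synthetic data: class 1 tends to have larger x
train_x = [random.gauss(0, 1) for _ in range(100)] + [random.gauss(3, 1) for _ in range(100)]
train_y = [0] * 100 + [1] * 100
test_x = [random.gauss(0, 1) for _ in range(50)] + [random.gauss(3, 1) for _ in range(50)]
test_y = [0] * 50 + [1] * 50
w, b = fit_logistic(train_x, train_y)
acc = holdout_accuracy(test_x, test_y, w, b)
```

The key design point is that the performance measure is computed on the held-out set only; in-sample accuracy would overstate the model's quality.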

  19. Validation of A Global Hydrological Model

    Science.gov (United States)

    Doell, P.; Lehner, B.; Kaspar, F.; Vassolo, S.

    due to the precipitation measurement errors. Even though the explicit modeling of wetlands and lakes leads to a much improved modeling of both the vertical water balance and the lateral transport of water, not enough information is included in WGHM to accurately capture the hydrology of these water bodies. Certainly, the reliability of model results is highest at the locations at which WGHM was calibrated. The validation indicates that reliability for cells inside calibrated basins is satisfactory if the basin is relatively homogeneous. Analyses of the few available stations outside of calibrated basins indicate a reasonably high model reliability, particularly in humid regions.
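
A common quantitative skill score in hydrological validation of this kind is the Nash-Sutcliffe efficiency. The sketch below computes it for hypothetical discharge data; the WGHM study's actual calibration and validation criteria may differ.

```python
def nash_sutcliffe(simulated, observed):
    """Nash-Sutcliffe efficiency: 1 = perfect, 0 = no better than the
    observed mean, < 0 = worse than the observed mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((s - o) ** 2 for s, o in zip(simulated, observed))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Hypothetical monthly discharge (m^3/s) at a gauging station
obs = [120, 150, 300, 550, 480, 320, 210, 160, 140, 130, 125, 118]
sim = [110, 160, 280, 520, 500, 340, 200, 150, 145, 128, 120, 115]
nse = nash_sutcliffe(sim, obs)
```

Because the score is normalized by observed variance, it lets reliability be compared across basins with very different flow magnitudes.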

  20. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high-fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  1. An Approach to Comprehensive and Sustainable Solar Wind Model Validation

    Science.gov (United States)

    Rastaetter, L.; MacNeice, P. J.; Mays, M. L.; Boblitt, J. M.; Wiegand, C.

    2017-12-01

    The number of models of the corona and inner heliosphere, and of their updates and upgrades, grows steadily, as does the number and character of the model inputs. Maintaining up-to-date validation of these models, in the face of this constant model evolution, is a necessary but very labor-intensive activity. In the last year alone, both NASA's LWS program and the CCMC's ongoing support of model forecasting activities at NOAA SWPC have sought model validation reports on the quality of all aspects of the community's coronal and heliospheric models, including both ambient and CME-related wind solutions at L1. In this presentation I will give a brief review of the community's previous model validation results for the L1 wind representation. I will discuss the semi-automated web-based system we are constructing at the CCMC to present comparative visualizations of all interesting aspects of the solutions from competing models. This system is designed to be easily queried to provide the essential comprehensive inputs to repeat and update previous validation studies and support extensions to them. I will illustrate this by demonstrating how the system is being used to support the CCMC/LWS Model Assessment Forum teams focused on the ambient and time-dependent corona and solar wind, including CME arrival time and IMF Bz. I will also discuss plans to extend the system to include results from the Forum teams addressing SEP model validation.

  2. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and regardless of whether observations are continuous or discrete. With both simulated data and a real data set related to geolocation of seals, we demonstrate both the potential and the limitations of the techniques. Our results fill a need for convenient methods for validating a state space model, or alternatively, rejecting it while indicating......

  3. Validation of Hadronic Models in GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Koi, Tatsumi; Wright, Dennis H.; /SLAC; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; /CERN; Heikkinen, Aatos; /Helsinki Inst. of Phys.; Truscott,; Lei, Fan; /QinetiQ; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It offers an abundant set of hadronic models, from thermal neutron interactions to ultra-relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented, based on thin-target measurements. In most cases, good agreement is found between Monte Carlo predictions and experimental data; however, several problems have been detected which require some improvement in the models.

  4. Validation of Model Forecasts of the Ambient Solar Wind

    Science.gov (United States)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  5. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior.

  6. Transfer Entropy as a Tool for Hydrodynamic Model Validation

    Directory of Open Access Journals (Sweden)

    Alicia Sendrowski

    2018-01-01

    The validation of numerical models is an important component of modeling to ensure reliability of model outputs under prescribed conditions. In river deltas, robust validation of models is paramount given that models are used to forecast land change and to track water, solid, and solute transport through the deltaic network. We propose using transfer entropy (TE) to validate model results. TE quantifies the information transferred between variables in terms of strength, timescale, and direction. Using water level data collected in the distributary channels and inter-channel islands of Wax Lake Delta, Louisiana, USA, along with modeled water level data generated for the same locations using Delft3D, we assess how well couplings between external drivers (river discharge, tides, wind) and modeled water levels reproduce the observed data couplings. We perform this operation through time using ten-day windows. Modeled and observed couplings compare well; their differences reflect the spatial parameterization of wind and roughness in the model, which prevents the model from capturing high frequency fluctuations of water level. The model captures couplings better in channels than on islands, suggesting that mechanisms of channel-island connectivity are not fully represented in the model. Overall, TE serves as an additional validation tool to quantify the couplings of the system of interest at multiple spatial and temporal scales.
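
A minimal histogram-based sketch of transfer entropy for two coupled time series, under simplifying assumptions (lag 1, coarse binning, synthetic data); production estimators, like those used in delta studies, are considerably more careful about binning, lags, and significance testing.

```python
import math
import random
from collections import Counter

def transfer_entropy(source, target, bins=2):
    """Histogram estimate of transfer entropy T(source -> target), in bits.

    Measures how much the source's past reduces uncertainty about the
    target's next value beyond the target's own past (lag 1 only).
    """
    lo, hi = min(source + target), max(source + target)
    def disc(v):
        return min(int((v - lo) / (hi - lo + 1e-12) * bins), bins - 1)
    s = [disc(v) for v in source]
    t = [disc(v) for v in target]
    triples = Counter(zip(t[1:], t[:-1], s[:-1]))  # (next, past_target, past_source)
    n = len(t) - 1
    te = 0.0
    for (t1, t0, s0), c in triples.items():
        p_xyz = c / n
        p_yz = sum(v for (a, b, d), v in triples.items() if b == t0 and d == s0) / n
        p_y = sum(v for (a, b, d), v in triples.items() if b == t0) / n
        p_xy = sum(v for (a, b, d), v in triples.items() if a == t1 and b == t0) / n
        te += p_xyz * math.log2(p_xyz * p_y / (p_yz * p_xy))
    return te

random.seed(1)
driver = [random.random() for _ in range(2000)]
# Target follows the driver with a one-step delay plus small noise
response = [0.0] + [d + 0.1 * random.random() for d in driver[:-1]]
te_forward = transfer_entropy(driver, response)
te_backward = transfer_entropy(response, driver)
```

For this driver-response pair the forward coupling should dominate the backward one, mirroring how observed and modeled driver-to-water-level couplings are compared.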

  7. Validation of the community radiative transfer model

    International Nuclear Information System (INIS)

    Ding Shouguo; Yang Ping; Weng Fuzhong; Liu Quanhua; Han Yong; Delst, Paul van; Li Jun; Baum, Bryan

    2011-01-01

    To validate the Community Radiative Transfer Model (CRTM) developed by the U.S. Joint Center for Satellite Data Assimilation (JCSDA), the discrete ordinate radiative transfer (DISORT) model and the line-by-line radiative transfer model (LBLRTM) are combined to provide a reference benchmark. Compared with the benchmark, the CRTM appears quite accurate for both clear-sky and ice-cloud radiance simulations, with RMS errors below 0.2 K, except for clouds with small ice particles. In a CPU run-time comparison, the CRTM is faster than DISORT by approximately two orders of magnitude. Using the operational MODIS cloud products and the European Centre for Medium-Range Weather Forecasts (ECMWF) atmospheric profiles as input, the CRTM is employed to simulate the Atmospheric Infrared Sounder (AIRS) radiances. The CRTM simulations are shown to be in reasonably close agreement with the AIRS measurements (the discrepancies are within 2 K in terms of brightness temperature difference). Furthermore, the impact of uncertainties in the input cloud properties and atmospheric profiles on the CRTM simulations has been assessed. The CRTM-based brightness temperatures (BTs) at the top of the atmosphere (TOA), for both thin and thick clouds, are highly sensitive to uncertainties in atmospheric temperature and cloud-top pressure. However, for an optically thick cloud, the CRTM-based BTs are not sensitive to the uncertainties of cloud optical thickness, effective particle size, and atmospheric humidity profiles. In contrast, the uncertainties of the CRTM-based TOA BTs resulting from effective particle size and optical thickness are not negligible for an optically thin cloud.

  8. Validation of the STAFF-5 computer model

    International Nuclear Information System (INIS)

    Fletcher, J.F.; Fields, S.R.

    1981-04-01

    STAFF-5 is a dynamic heat-transfer-fluid-flow stress model designed for computerized prediction of the temperature-stress performance of spent LWR fuel assemblies under storage/disposal conditions. Validation of the temperature-calculating abilities of this model was performed by comparing temperature calculations under specified conditions to experimental data from the Engine Maintenance and Disassembly (EMAD) Fuel Temperature Test Facility and to calculations performed by Battelle Pacific Northwest Laboratory (PNL) using the HYDRA-1 model. The comparisons confirmed the ability of STAFF-5 to calculate representative fuel temperatures over a considerable range of conditions, as a first step in the evaluation and prediction of fuel temperature-stress performance.

  9. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  10. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
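
The hydrostatic balance underlying such a model can be sketched in a few lines. The layer densities, heights, and pressures below are hypothetical; the real HCM additionally handles nitrogen compressibility, temperature, and interface movement over time.

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(bottom_pressure_pa, layers):
    """Wellhead pressure from a hydrostatic column of stacked fluid layers.

    `layers` is a list of (density_kg_m3, height_m) tuples ordered from
    bottom to top: each layer's weight reduces the pressure seen above it.
    """
    p = bottom_pressure_pa
    for rho, h in layers:
        p -= rho * G * h
    return p

# Hypothetical well: 500 m brine, 300 m crude oil, 200 m nitrogen column
layers = [(1200.0, 500.0), (850.0, 300.0), (150.0, 200.0)]
p_wh = wellhead_pressure(12.0e6, layers)
```

A small leak shows up as a slow drift of the measured wellhead pressure away from the value this balance predicts for a tight well.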

  11. Aromatic interactions impact ligand binding and function at serotonin 5-HT2C G protein-coupled receptors: receptor homology modelling, ligand docking, and molecular dynamics results validated by experimental studies

    Science.gov (United States)

    Córdova-Sintjago, Tania; Villa, Nancy; Fang, Lijuan; Booth, Raymond G.

    2014-02-01

    The serotonin (5-hydroxytryptamine, 5-HT) 5-HT2 G protein-coupled receptor (GPCR) family consists of types 2A, 2B, and 2C that share ∼75% transmembrane (TM) sequence identity. Agonists for 5-HT2C receptors are under development for psychoses, whereas at 5-HT2A receptors antipsychotic effects are associated with antagonists; in fact, 5-HT2A agonists can cause hallucinations and 5-HT2B agonists cause cardiotoxicity. It is known that 5-HT2A TM6 residues W6.48, F6.51, and F6.52 impact ligand binding and function; however, ligand interactions with these residues at the 5-HT2C receptor have not been reported. To predict and validate molecular determinants for 5-HT2C-specific activation, results from receptor homology modelling, ligand docking, and molecular dynamics simulation studies were compared with experimental results for ligand binding and function at wild-type and W6.48A, F6.51A, and F6.52A point-mutated 5-HT2C receptors.

  12. Natural analogues and radionuclide transport model validation

    International Nuclear Information System (INIS)

    Lever, D.A.

    1987-08-01

    In this paper, some possible roles for natural analogues are discussed from the point of view of those involved with the development of mathematical models for radionuclide transport and with the use of these models in repository safety assessments. The characteristic features of a safety assessment are outlined in order to address the questions of where natural analogues can be used to improve our understanding of the processes involved and where they can assist in validating the models that are used. Natural analogues have the potential to provide useful information about some critical processes, especially long-term chemical processes and migration rates. There is likely to be considerable uncertainty and ambiguity associated with the interpretation of natural analogues, and thus it is their general features which should be emphasized, and models with appropriate levels of sophistication should be used. Experience gained in modelling the Koongarra uranium deposit in northern Australia is drawn upon. (author)

  13. Resampling procedures to validate dendro-auxometric regression models

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Regression analysis is widely used in several sectors of forest research. The validation of a dendro-auxometric model is a basic step in the building of the model itself. The more a model resists attempts to demonstrate its groundlessness, the more its reliability increases. In recent decades, many new theories that exploit the computational speed of modern computers have been formulated. Here we show the results obtained by applying a bootstrap resampling procedure as a validation tool.
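
The bootstrap validation idea can be sketched as resampling observation pairs with replacement and re-fitting the regression each time; a stable slope across resamples supports the model. The data and interval choices below are illustrative only, not the paper's procedure.

```python
import random

def fit_line(pairs):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    b = sum((x - mx) * (y - my) for x, y in pairs) / sum((x - mx) ** 2 for x, _ in pairs)
    return my - b * mx, b

def bootstrap_slope_ci(pairs, n_boot=2000, seed=42):
    """Percentile bootstrap 95% confidence interval for the slope,
    resampling observations with replacement (toy sketch)."""
    random.seed(seed)
    slopes = []
    for _ in range(n_boot):
        sample = [random.choice(pairs) for _ in pairs]
        slopes.append(fit_line(sample)[1])
    slopes.sort()
    return slopes[int(0.025 * n_boot)], slopes[int(0.975 * n_boot)]

# Hypothetical diameter (cm) vs. height (m) observations
data = [(10, 8.1), (15, 10.9), (20, 14.2), (25, 16.8), (30, 20.1),
        (35, 22.7), (40, 26.3), (45, 28.9), (50, 32.2)]
lo, hi = bootstrap_slope_ci(data)
```

A narrow interval that excludes zero indicates the fitted relationship survives resampling, which is the sense in which the model "resists attempts to demonstrate its groundlessness".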

  14. Predictive validation of an influenza spread model.

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

    BACKGROUND: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. METHODS AND FINDINGS: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, depending on the method of forecasting (static or dynamic). CONCLUSIONS: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability.

  15. Predictive Validation of an Influenza Spread Model

    Science.gov (United States)

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, depending on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability.
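
The deviation metrics such a predictive-validation exercise relies on can be sketched as follows, with hypothetical weekly case counts rather than the study's data; the metric names are illustrative.

```python
def forecast_errors(predicted, observed):
    """Deviation metrics between a forecast epidemic curve and observations.

    Returns peak-intensity error, peak-week offset (positive = forecast
    peaks late), and RMSE over the whole season (toy sketch).
    """
    peak_err = max(predicted) - max(observed)
    week_err = predicted.index(max(predicted)) - observed.index(max(observed))
    rmse = (sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)) ** 0.5
    return peak_err, week_err, rmse

# Weekly laboratory-confirmed case counts for a hypothetical season
observed  = [2, 5, 12, 30, 55, 80, 60, 35, 18, 7, 3]
predicted = [3, 6, 14, 33, 60, 75, 82, 50, 25, 10, 4]
peak_err, week_err, rmse = forecast_errors(predicted, observed)
```

Peak intensity and peak week are the quantities most relevant for timing mitigation measures, which is why they are reported separately from aggregate error.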

  16. Planck early results. XIV. ERCSC validation and extreme radio sources

    DEFF Research Database (Denmark)

    Lähteenmäki, A.; Lavonen, N.; León-Tavares, J.

    2011-01-01

    Planck's all-sky surveys at 30-857 GHz provide an unprecedented opportunity to follow the radio spectra of a large sample of extragalactic sources to frequencies 2-20 times higher than allowed by past, large-area, ground-based surveys. We combine the results of the Planck Early Release Compact Source Catalog (ERCSC) with quasi-simultaneous ground-based observations as well as archival data at frequencies below or overlapping Planck frequency bands, to validate the astrometry and photometry of the ERCSC radio sources and to study the spectral features shown in this new frequency window opened by Planck. The ERCSC source positions and flux density scales are found to be consistent with the ground-based observations. We present and discuss the spectral energy distributions of a sample of "extreme" radio sources to illustrate the richness of the ERCSC for the study of extragalactic radio sources...

  17. Seclazone Reactor Modeling And Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Osinga, T. [ETH-Zuerich (Switzerland); Olalde, G. [CNRS Odeillo (France); Steinfeld, A. [PSI and ETHZ (Switzerland)

    2005-03-01

    A numerical model is formulated for the SOLZINC solar chemical reactor for the production of Zn by carbothermal reduction of ZnO. The model involves solving, by the finite-volume technique, a 1D unsteady state energy equation that couples heat transfer to the chemical kinetics for a shrinking packed bed exposed to thermal radiation. Validation is accomplished by comparison with experimentally measured temperature profiles and Zn production rates as a function of time, obtained for a 5-kW solar reactor tested at PSI's solar furnace. (author)

  18. Model validation studies of solar systems, Phase III. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lantz, L.J.; Winn, C.B.

    1978-12-01

    Results obtained from a validation study of the TRNSYS, SIMSHAC, and SOLCOST solar system simulation and design programs are presented. Also included are comparisons between the FCHART and SOLCOST solar system design programs and some changes that were made to the SOLCOST program. Finally, results obtained from the analysis of several solar radiation models are presented. Separate abstracts were prepared for ten papers.

  19. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    Full Text Available The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.
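    The coupling of a rainfall-runoff model to an irrigation reservoir reduces, at its core, to a daily water balance. A minimal sketch of such an update is shown below; the state variables, fluxes, and units are assumptions for illustration, not the authors' formulation.

```python
def reservoir_step(storage, inflow, rainfall, evap, irrigation, capacity):
    """One daily water-balance step for an irrigation reservoir.
    All quantities in mm over the reservoir area; spill occurs above capacity."""
    storage = storage + inflow + rainfall - evap - irrigation
    storage = max(storage, 0.0)            # storage cannot become negative
    spill = max(storage - capacity, 0.0)   # overflow leaves the reservoir
    return storage - spill, spill

# illustrative wet-season day: reservoir near capacity, small spill expected
s, spill = reservoir_step(storage=950.0, inflow=40.0, rainfall=30.0,
                          evap=5.0, irrigation=10.0, capacity=1000.0)
```

Soft data such as observed spill dates or farmer-reported dry periods can then constrain the model even when no discharge record exists.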

  20. ValidatorDB: database of up-to-date validation results for ligands and non-standard residues from the Protein Data Bank.

    Science.gov (United States)

    Sehnal, David; Svobodová Vařeková, Radka; Pravda, Lukáš; Ionescu, Crina-Maria; Geidl, Stanislav; Horský, Vladimír; Jaiswal, Deepti; Wimmerová, Michaela; Koča, Jaroslav

    2015-01-01

    Following the discovery of serious errors in the structure of biomacromolecules, structure validation has become a key topic of research, especially for ligands and non-standard residues. ValidatorDB (freely available at http://ncbr.muni.cz/ValidatorDB) offers a new step in this direction, in the form of a database of validation results for all ligands and non-standard residues from the Protein Data Bank (all molecules with seven or more heavy atoms). Model molecules from the wwPDB Chemical Component Dictionary are used as reference during validation. ValidatorDB covers the main aspects of validation of annotation, and additionally introduces several useful validation analyses. The most significant is the classification of chirality errors, allowing the user to distinguish between serious issues and minor inconsistencies. Other such analyses are able to report, for example, completely erroneous ligands, alternate conformations or complete identity with the model molecules. All results are systematically classified into categories, and statistical evaluations are performed. In addition to detailed validation reports for each molecule, ValidatorDB provides summaries of the validation results for the entire PDB, for sets of molecules sharing the same annotation (three-letter code) or the same PDB entry, and for user-defined selections of annotations or PDB entries. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹ TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs
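    Validating predicted against measured contaminant concentrations typically reduces to a goodness-of-fit statistic. A minimal root-mean-square-error helper is sketched below; it is a generic illustration, not the authors' code, and the concentration values are invented.

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between model predictions and measurements."""
    if len(predicted) != len(observed) or not predicted:
        raise ValueError("need two equally sized, non-empty series")
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(predicted))

# hypothetical soil concentrations (mg/kg) at four increasing depths
model = [9.1, 7.4, 5.2, 3.0]
field = [9.0, 7.0, 5.5, 3.2]
```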

  2. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  3. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Full Text Available Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper, the computational model used in the simulations is described, along with the results, which were...

  4. Toward valid and reliable brain imaging results in eating disorders.

    Science.gov (United States)

    Frank, Guido K W; Favaro, Angela; Marsh, Rachel; Ehrlich, Stefan; Lawson, Elizabeth A

    2018-03-01

    Human brain imaging can help improve our understanding of mechanisms underlying brain function and how they drive behavior in health and disease. Such knowledge may eventually help us to devise better treatments for psychiatric disorders. However, the brain imaging literature in psychiatry and especially eating disorders has been inconsistent, and studies are often difficult to replicate. The extent or severity of extremes of eating and state of illness, which are often associated with differences in, for instance hormonal status, comorbidity, and medication use, commonly differ between studies and likely add to variation across study results. Those effects are in addition to the well-described problems arising from differences in task designs, data quality control procedures, image data preprocessing and analysis or statistical thresholds applied across studies. Which of those factors are most relevant to improve reproducibility is still a question for debate and further research. Here we propose guidelines for brain imaging research in eating disorders to acquire valid results that are more reliable and clinically useful. © 2018 Wiley Periodicals, Inc.

  5. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has recently been a significant increase in activity to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  6. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational fluid dynamics (CFD) spectral gas radiation models for air combustion are therefore outside their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review of fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models against high-resolution spectral measurements in a gas cell, with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example of a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and ranks the proposed models by accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 µm of water vapor and carbon dioxide, in the temperature range from 727 °C to 1500 °C and at different concentrations, were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical narrow-band models and the exponential wide-band model. The two statistical narrow-band models EM2C and RADCAL showed good agreement, with a maximal band transmissivity deviation of 3%. The exponential wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as a reference model for the
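    At the core of every one of the transmissivity comparisons above sits the Beer-Lambert attenuation law. The sketch below shows only that relation, with a hypothetical absorption coefficient; the narrow-band and line-by-line models discussed in the thesis are, of course, far more elaborate.

```python
import math

def transmissivity(kappa, partial_pressure, path_length):
    """Spectral transmissivity tau = exp(-kappa * p * L) (Beer-Lambert law).
    kappa in 1/(bar*m), partial pressure in bar, path length in m."""
    return math.exp(-kappa * partial_pressure * path_length)
```

A band transmissivity deviation between two models is then just the difference of such values averaged over the band.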

  7. Validity of empirical models of exposure in asphalt paving

    Science.gov (United States)

    Burstyn, I; Boffetta, P; Burr, G; Cenni, A; Knecht, U; Sciarra, G; Kromhout, H

    2002-01-01

    Aims: To investigate the validity of empirical models of exposure to bitumen fume and benzo(a)pyrene, developed for a historical cohort study of asphalt paving in Western Europe. Methods: Validity was evaluated using data from the USA, Italy, and Germany not used to develop the original models. Correlation between observed and predicted exposures was examined. Bias and precision were estimated. Results: Models were imprecise. Furthermore, predicted bitumen fume exposures tended to be lower (-70%) than concentrations found during paving in the USA. This apparent bias might be attributed to differences between Western European and USA paving practices. Evaluation of the validity of the benzo(a)pyrene exposure model revealed an effect of re-paving similar to that expected and a larger than expected effect of tar use. Overall, benzo(a)pyrene models underestimated exposures by 51%. Conclusions: Possible bias as a result of underestimation of the impact of coal tar on benzo(a)pyrene exposure levels must be explored in sensitivity analysis of the exposure–response relation. Validation of the models, albeit limited, increased our confidence in their applicability to exposure assessment in the historical cohort study of cancer risk among asphalt workers. PMID:12205236
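    Bias and precision of such empirical exposure models are commonly summarised from the ratios of predicted to observed concentrations. The helper below is a generic sketch of that idea, not the study's actual statistical procedure.

```python
import math
import statistics

def bias_and_precision(observed, predicted):
    """Relative bias (mean predicted/observed ratio minus one) and precision
    (standard deviation of the log ratios) of an exposure model."""
    ratios = [p / o for o, p in zip(observed, predicted)]
    bias = statistics.mean(ratios) - 1.0
    precision = statistics.pstdev([math.log(r) for r in ratios])
    return bias, precision
```

A bias of -0.5, for example, corresponds to a model that on average underestimates exposure by 50%, comparable in spirit to the 51% underestimation reported above.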

  8. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    CERN Document Server

    Apostolakis, J; Bagulya, A; Brown, J M C; Burkhardt, H; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Grichine, V; Guatelli, S; Incerti, S; Ivanchenko, V N; Jacquemier, J; Kadri, O; Maire, M; Pandola, L; Sawkey, D; Toshito, T; Urban, L; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  9. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. As a result, these methods can only predict the most probably faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also better suited to systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
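    The idea of logically inferring faulty sensors from analytical redundancy relations (ARRs) can be sketched as follows. The exoneration logic and tolerance below are simplifying assumptions for illustration, not the algorithm actually developed in the work.

```python
def faulty_sensors(readings, relations, tol=1e-3):
    """Infer suspect sensors from analytical redundancy relations (ARRs).
    Each relation is (residual_fn, sensors_involved); every sensor appearing
    in at least one satisfied relation is exonerated (illustrative logic)."""
    suspect = set(readings)
    for residual_fn, sensors in relations:
        if abs(residual_fn(readings)) < tol:
            suspect -= set(sensors)
    return suspect

# three sensors that should agree pairwise; any outlier violates two ARRs
relations = [
    (lambda r: r['a'] - r['b'], ('a', 'b')),
    (lambda r: r['a'] - r['c'], ('a', 'c')),
    (lambda r: r['b'] - r['c'], ('b', 'c')),
]
```

Unlike a Bayesian approach, no prior failure probabilities are needed: the verdict follows purely from which relations the readings satisfy.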

  10. Validation of the filament winding process model

    Science.gov (United States)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diam and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at + or - 45 deg angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  11. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
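    Using the original model's stochastic variability as the reference scale can be sketched as a z-like score: the approximation error divided by the standard deviation of replicate stochastic runs. This is a minimal illustration of the idea, with invented numbers, not the paper's exact statistic.

```python
import statistics

def approximation_error_score(approx_value, stochastic_runs):
    """Error of a deterministic approximation expressed in units of the
    original stochastic model's standard deviation (z-like score)."""
    mu = statistics.mean(stochastic_runs)
    sd = statistics.pstdev(stochastic_runs)
    return abs(approx_value - mu) / sd if sd > 0 else float("inf")

# replicate runs of the original stochastic model at one parameter set
runs = [98.0, 102.0, 101.0, 99.0]
score = approximation_error_score(100.5, runs)   # approximation's prediction
```

A score well below 1 means the approximation error is hidden inside the original model's own noise; scores much above 1 flag a poor approximation at that parameter set.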

  12. Validation of the newborn larynx modeling with aerodynamical experimental data.

    Science.gov (United States)

    Nicollas, R; Giordano, J; Garrel, R; Medale, M; Caminat, P; Giovanni, A; Ouaknine, M; Triglia, J M

    2009-06-01

    Many authors have studied modeling of the adult larynx, but the mechanisms of newborn voice production have very rarely been investigated. After validating a numerical model with acoustic data, studies were performed on larynges of human fetuses in order to validate this model with aerodynamical experiments. Anatomical measurements were performed and a simplified numerical model was built using Fluent® with the vocal folds in phonatory position. The results obtained are in good agreement with those obtained by laser Doppler velocimetry (LDV) and high-frame-rate particle image velocimetry (HFR-PIV) on an experimental bench with excised human fetus larynges. It appears that computing with first-cry physiological parameters leads to a model which is close to those obtained in experiments with real organs.

  13. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  14. Multiphysics modelling and experimental validation of high concentration photovoltaic modules

    International Nuclear Information System (INIS)

    Theristis, Marios; Fernández, Eduardo F.; Sumner, Mike; O'Donovan, Tadhg S.

    2017-01-01

    Highlights: • A multiphysics modelling approach for concentrating photovoltaics was developed. • An experimental campaign was conducted to validate the models. • The experimental results were in good agreement with the models. • The multiphysics modelling allows the concentrator’s optimisation. - Abstract: High concentration photovoltaics, equipped with high efficiency multijunction solar cells, have great potential in achieving cost-effective and clean electricity generation at utility scale. Such systems are more complex compared to conventional photovoltaics because of the multiphysics effect that is present. Modelling the power output of such systems is therefore crucial for their further market penetration. Following this line, a multiphysics modelling procedure for high concentration photovoltaics is presented in this work. It combines an open source spectral model, a single diode electrical model and a three-dimensional finite element thermal model. In order to validate the models and the multiphysics modelling procedure against actual data, an outdoor experimental campaign was conducted in Albuquerque, New Mexico using a high concentration photovoltaic monomodule that is thoroughly described in terms of its geometry and materials. The experimental results were in good agreement (within 2.7%) with the predicted maximum power point. This multiphysics approach is relatively more complex when compared to empirical models, but besides the overall performance prediction it can also provide better understanding of the physics involved in the conversion of solar irradiance into electricity. It can therefore be used for the design and optimisation of high concentration photovoltaic modules.
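    The single diode electrical model mentioned above can be sketched in its simplest explicit form, with series resistance neglected. The parameter values in the example are purely illustrative; the real model for a multijunction cell under concentration is considerably more involved.

```python
import math

def single_diode_current(v, i_ph, i_0, n, v_t=0.025693, r_sh=None):
    """Cell current from the single-diode equation with series resistance
    neglected: I = I_ph - I_0*(exp(V/(n*V_t)) - 1) - V/R_sh."""
    i_diode = i_0 * (math.exp(v / (n * v_t)) - 1.0)
    i_shunt = v / r_sh if r_sh else 0.0
    return i_ph - i_diode - i_shunt

# short-circuit behaviour with illustrative parameters (i_ph scales with
# concentrated irradiance, which is where the spectral model feeds in)
i_sc = single_diode_current(0.0, i_ph=5.0, i_0=1e-10, n=1.3)
```

In the full multiphysics chain, the spectral model sets the photocurrent, the thermal model sets the cell temperature (which shifts v_t and i_0), and this electrical model then yields the power output.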

  15. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
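    A "model validation test" of the kind described differs from an ordinary unit test in asserting a scientifically plausible range rather than an exact value. The fragment below is a generic illustration of that pattern, not OpenWorm code, and the trace data are invented.

```python
import unittest

def mean_membrane_potential(trace_mv):
    """Summary statistic extracted from a simulated voltage trace (mV)."""
    return sum(trace_mv) / len(trace_mv)

class TestModelValidation(unittest.TestCase):
    def test_resting_potential_in_physiological_range(self):
        # a real suite would run the simulation here; we use canned output
        simulated_trace = [-64.8, -65.1, -65.0, -64.9]
        v_rest = mean_membrane_potential(simulated_trace)
        # validate against an experimentally plausible range, not a constant
        self.assertTrue(-80.0 <= v_rest <= -50.0)
```

Such tests sit naturally in a continuous-integration pipeline, so every code change is automatically re-checked against the experimental constraints.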

  16. Experimental Validation of Flow Force Models for Fast Switching Valves

    DEFF Research Database (Denmark)

    Bender, Niels Christian; Pedersen, Henrik Clemmensen; Nørgård, Christian

    2017-01-01

    This paper comprises a detailed study of the forces acting on a Fast Switching Valve (FSV) plunger. The objective is to investigate to what extent different models are valid for design purposes. These models depend on the geometry of the moving plunger and the properties of the surrounding fluid, particularly when the plunger velocity is non-zero. This is the case in FSVs, where it results in an additional dampening effect that is of relevance when analyzing contact-impact. Experimental data from different test cases of an FSV have been gathered, with the plunger moving through a medium of either oil or air. These data are used to compare and validate the different models, with an effort directed towards capturing the fluid squeeze effect just before material-on-material contact. The test data are compared with simulation data relying solely on analytic formulations. The general dynamics of the plunger is validated...

  17. The atrial fibrillation knowledge scale : Development, validation and results

    NARCIS (Netherlands)

    Hendriks, J.M.L.; Crijns, H.J.G.M.; Tieleman, R.G.; Vrijhoef, H.J.M.

    2013-01-01

    Background Patients' understanding of the nature and consequences of atrial fibrillation (AF) and appropriate therapy, is essential to optimize AF management. Currently, no valid instrument exists to measure knowledge in AF patients with a combined focus on disease, symptom recognition and therapy.

  18. Urban roughness mapping validation techniques and some first results

    NARCIS (Netherlands)

    Bottema, M; Mestayer, PG

    1998-01-01

    Because of measuring problems related to the evaluation of urban roughness parameters, a new approach using a roughness mapping tool has been tested: evaluation of the roughness length z0 and zero-plane displacement zd from cadastral databases. Special attention needs to be given to the validation of the

  19. Validation of Fatigue Modeling Predictions in Aviation Operations

    Science.gov (United States)

    Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin

    2017-01-01

    Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet, most models have not been rigorously evaluated and independently validated for the operations to which they are being applied and many users are not fully aware of the limitations in which model results should be interpreted and applied.

  20. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  1. VEMAP 1: Selected Model Results

    Data.gov (United States)

    National Aeronautics and Space Administration — The Vegetation/Ecosystem Modeling and Analysis Project (VEMAP) was a multi-institutional, international effort addressing the response of biogeography and...

  2. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    Science.gov (United States)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant-enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of a newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows applying a sequential solution approach to solve the governing equations separately and implicitly. Through its application to a numerical experiment using a wettability alteration model, and through comparisons with an existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.

  3. Validation of a parametric finite element human femur model.

    Science.gov (United States)

    Klein, Katelyn F; Hu, Jingwen; Reed, Matthew P; Schneider, Lawrence W; Rupp, Jonathan D

    2017-05-19

    Finite element (FE) models with geometry and material properties that are parametric with subject descriptors, such as age and body shape/size, are being developed to incorporate population variability into crash simulations. However, the validation methods currently being used with these parametric models do not assess whether model predictions are reasonable in the space over which the model is intended to be used. This study presents a parametric model of the femur and applies a unique validation paradigm to this parametric femur model that characterizes whether model predictions reproduce experimentally observed trends. FE models of male and female femurs with geometries that are parametric with age, femur length, and body mass index (BMI) were developed based on existing statistical models that predict femur geometry. These parametric FE femur models were validated by comparing responses from combined loading tests of femoral shafts to simulation results from FE models of the corresponding femoral shafts whose geometry was predicted using the associated age, femur length, and BMI. The effects of subject variables on model responses were also compared with trends in the experimental data set by fitting similarly parameterized statistical models to both the results of the experimental data and the corresponding FE model results and then comparing fitted model coefficients for the experimental and predicted data sets. The average error in impact force at experimental failure for the parametric models was 5%. The coefficients of a statistical model fit to simulation data were within one standard error of the coefficients of a similarly parameterized model of the experimental data except for the age parameter, likely because material properties used in simulations were not varied with specimen age. 
In simulations to explore the effects of femur length, BMI, and age on impact response, only BMI significantly affected response for both men and women, with increasing
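The coefficient-based validation described in this record can be sketched in a few lines. The values below are purely hypothetical (the abstract reports only that all coefficients except age agreed within one standard error); the sketch shows the comparison logic, not the study's fitted numbers:

```python
def coefficients_agree(exp_coefs, exp_se, sim_coefs):
    """For each parameter, check whether the simulation-fitted coefficient
    lies within one standard error of the experiment-fitted coefficient."""
    return {name: abs(sim_coefs[name] - exp_coefs[name]) <= exp_se[name]
            for name in exp_coefs}

# Hypothetical fitted coefficients and standard errors (illustration only)
exp_coefs = {"age": -0.80, "femur_length": 1.20, "bmi": 0.30}
exp_se    = {"age": 0.15,  "femur_length": 0.40, "bmi": 0.10}
sim_coefs = {"age": -0.35, "femur_length": 1.05, "bmi": 0.33}

agreement = coefficients_agree(exp_coefs, exp_se, sim_coefs)
# In this made-up example only the age term disagrees, mirroring the
# abstract's finding that the age effect was missed because material
# properties were not varied with specimen age in the simulations.
```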

  4. Validated TRNSYS Model for Solar Assisted Space Heating System

    International Nuclear Information System (INIS)

    Abdalla, Nedal

    2014-01-01

    The present study involves a validated TRNSYS model for a solar assisted space heating system as applied to a residential building in Jordan, using the new detailed radiation models of TRNSYS 17.1 and the geometric building model Trnsys3d for the Google SketchUp 3D drawing program. The annual heating load for a building (Solar House) located at the Royal Scientific Society (RSS) in Jordan is estimated under the climatological conditions of Amman. The aim of this paper is to compare the measured thermal performance of the Solar House with that modeled using TRNSYS. The results showed that the annual measured space heating load for the building was 6,188 kWh while the heating load for the modeled building was 6,391 kWh. Moreover, the measured solar fraction for the solar system was 50% while the modeled solar fraction was 55%. A comparison of modeled and measured data resulted in percentage mean absolute errors for solar energy for space heating, auxiliary heating and solar fraction of 13%, 7% and 10%, respectively. The validated model will be useful for long-term performance simulation under different weather and operating conditions. (author)
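From the annual loads reported in this abstract, the model-measurement discrepancy can be reproduced directly; the mean-absolute-percent-error helper below illustrates the form behind the 13%, 7% and 10% figures (the underlying paired series are not given in the abstract):

```python
def percent_difference(measured, modeled):
    """Percent difference of a modeled total relative to the measured total."""
    return abs(modeled - measured) / measured * 100.0

# Annual space-heating loads reported in the study (kWh)
annual_diff = percent_difference(6188.0, 6391.0)  # ~3.3% on the annual total

def mean_absolute_percent_error(measured_series, modeled_series):
    """Mean absolute percentage error over paired samples; the abstract's
    13%/7%/10% figures would come from such paired comparisons."""
    pairs = list(zip(measured_series, modeled_series))
    return sum(abs(m - s) / m for m, s in pairs) / len(pairs) * 100.0
```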

  5. VEMAP 1: Selected Model Results

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: The Vegetation/Ecosystem Modeling and Analysis Project (VEMAP) was a multi-institutional, international effort addressing the response of biogeography and...

  6. Monte Carlo Modelling of Mammograms: Development and Validation

    International Nuclear Information System (INIS)

    Spyrou, G.; Panayiotakis, G.; Bakas, A.; Tzanakos, G.

    1998-01-01

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered, along with a software phantom of the compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors)

  7. Predicting the ungauged basin: model validation and realism assessment

    Science.gov (United States)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). 
Predicting the ungauged basin: model validation and realism assessment

  8. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
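Sensitivity and specificity as used in this evaluation are simple confusion-matrix ratios. A minimal sketch with hypothetical presence/absence data (not the study's bird survey records):

```python
def sensitivity_specificity(observed, predicted):
    """Sensitivity: fraction of observed presences correctly predicted.
    Specificity: fraction of observed absences correctly predicted."""
    tp = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 1)
    fn = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 0)
    tn = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 0)
    fp = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical t2 occurrences (1 = present) vs. model predictions
observed  = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
predicted = [1, 1, 1, 0, 0, 0, 0, 1, 0, 0]
sens, spec = sensitivity_specificity(observed, predicted)  # 0.75, ~0.83
```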

  9. Validating predictions from climate envelope models.

    Directory of Open Access Journals (Sweden)

    James I Watling

    Full Text Available Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1) and evaluated using occurrence data from 1998-2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on

  10. Filament winding cylinders. II - Validation of the process model

    Science.gov (United States)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  11. Validation of the MEASURE automobile emissions model: a statistical analysis

    Science.gov (United States)

    2000-09-01

    The Mobile Emissions Assessment System for Urban and Regional Evaluation (MEASURE) model provides an external validation capability for the hot stabilized option; the model is one of several new modal emissions models designed to predict hot stabilized e...

  12. Dental models made with an intraoral scanner: A validation study.

    NARCIS (Netherlands)

    Cuperus, A.M.; Harms, M.C.; Rangel, F.A.; Bronkhorst, E.M.; Schols, J.G.J.H.; Breuning, K.H.

    2012-01-01

    INTRODUCTION: Our objectives were to determine the validity and reproducibility of measurements on stereolithographic models and 3-dimensional digital dental models made with an intraoral scanner. METHODS: Ten dry human skulls were scanned; from the scans, stereolithographic models and digital

  13. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

    The objective of this work is to identify and evaluate improvements to the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism in residual stress assumptions. The study focused on the development and validation of an improved weld residual stress modelling procedure, taking advantage of recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on the use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available, a mixed hardening model should be used.

  14. Data Set for Empirical Validation of Double Skin Facade Model

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    the model is empirically validated and its limitations for DSF modeling are identified. Correspondingly, the existence and availability of experimental data are essential. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within...

  15. Isotopes as validation tools for global climate models

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    2001-01-01

    Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order that people can have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin which has a long history of isotope collection and analysis and also of climate modelling: both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone

  16. Veggie Hardware Validation Test Preliminary Results and Lessons Learned

    Science.gov (United States)

    Massa, Gioia D.; Dufour, Nicole F.; Smith, T. M.

    2014-01-01

    The Veggie hardware validation test, VEG-01, was conducted on the International Space Station during Expeditions 39 and 40 from May through June of 2014. The Veggie hardware and the VEG-01 experiment payload were launched to station aboard the SpaceX-3 resupply mission in April, 2014. Veggie was installed in an Expedite-the-Processing-of-Experiments-to-Space-Station (ExPRESS) rack in the Columbus module, and the VEG-01 validation test was initiated. Veggie installation was successful, and power was supplied to the unit. The hardware was programmed and the root mat reservoir and plant pillows were installed without issue. As expected, a small amount of growth media was observed in the sealed bags which enclosed the plant pillows when they were destowed. Astronaut Steve Swanson used the wet/dry vacuum to clean up the escaped particles. Water insertion, or priming, of the first plant pillow was unsuccessful, as an issue prevented water movement through the quick disconnect. All subsequent pillows were successfully primed, and the initial pillow was replaced with a backup pillow and successfully primed. Six pillows were primed, but only five pillows had plants which germinated. After about a week and a half it was observed that plants were not growing well and that pillow wicks were dry. This indicated that the reservoir was not supplying sufficient water to the pillows via wicking, and so the team reverted to an operational fix which added water directly to the plant pillows. Direct watering of the pillows led to a recovery in several of the stressed plants, though a couple did not recover. An important lesson learned involved Veggie's bellows. The bellows tended to float and interfere with operations when opened, so Steve secured them to the baseplate during plant tending operations. Due to the perceived intensity of the LED lights, the crew found it challenging both to work under the lights and to read crew procedures on their computer.
Although the lights are not a safety

  17. Validation Results for Core-Scale Oil Shale Pyrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Staten, Josh; Tiwari, Pankaj

    2015-03-01

    This report summarizes a study of oil shale pyrolysis at various scales and the subsequent development of a model for in situ production of oil from oil shale. Oil shale from the Mahogany zone of the Green River formation was used in all experiments. Pyrolysis experiments were conducted at four scales: powdered samples (100 mesh) and core samples of 0.75”, 1” and 2.5” diameter. The batch, semibatch and continuous flow pyrolysis experiments were designed to study the effect of temperature (300°C to 500°C), heating rate (1°C/min to 10°C/min), pressure (ambient and 500 psig) and sample size on product formation. Comprehensive analyses were performed on reactants and products - liquid, gas and spent shale. These experimental studies were designed to understand the relevant coupled phenomena (reaction kinetics, heat transfer, mass transfer, thermodynamics) at multiple scales. A model for oil shale pyrolysis was developed in the COMSOL multiphysics platform. A general kinetic model was integrated with the important physical and chemical phenomena that occur during pyrolysis. The secondary reactions of coking and cracking in the product phase were addressed. The multiscale experimental data generated and the models developed provide an understanding of the simultaneous effects of chemical kinetics and heat and mass transfer on oil quality and yield. The comprehensive data collected in this study will help advance the move to large-scale in situ oil production from the pyrolysis of oil shale.
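The report's kinetic model is more general than a single reaction, but the basic temperature dependence of pyrolysis conversion can be illustrated with a first-order Arrhenius sketch; the pre-exponential factor and activation energy below are hypothetical placeholders, not values fitted in the study:

```python
import math

def first_order_conversion(T_kelvin, time_s, A=1e13, Ea=220e3):
    """Isothermal first-order conversion X = 1 - exp(-k t), with an
    Arrhenius rate constant k = A * exp(-Ea / (R T)).
    A (1/s) and Ea (J/mol) are hypothetical placeholders."""
    R = 8.314  # gas constant, J/(mol K)
    k = A * math.exp(-Ea / (R * T_kelvin))
    return 1.0 - math.exp(-k * time_s)

# Conversion after one hour rises steeply across the studied 300-500 °C range
X_300 = first_order_conversion(573.15, 3600.0)
X_400 = first_order_conversion(673.15, 3600.0)
X_500 = first_order_conversion(773.15, 3600.0)
```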

  18. Impact mechanics of ship collisions and validations with experimental results

    DEFF Research Database (Denmark)

    Zhang, Shengming; Villavicencio, R.; Zhu, L.

    2017-01-01

    Closed-form analytical solutions for the energy released for deforming and crushing of structures and the impact impulse during ship collisions were developed and published in Marine Structures in 1998 [1]. The proposed mathematical models have been used by many engineers and researchers although...

  19. In-Drift Microbial Communities Model Validation Calculations

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-09-24

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  20. In-Drift Microbial Communities Model Validation Calculation

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-10-31

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  1. In-Drift Microbial Communities Model Validation Calculations

    International Nuclear Information System (INIS)

    Jolley, D.M.

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  2. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    D.M. Jolley

    2001-12-18

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  3. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    International Nuclear Information System (INIS)

    D.M. Jolley

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  4. New validation metrics for models with multiple correlated responses

    International Nuclear Information System (INIS)

    Li, Wei; Chen, Wei; Jiang, Zhen; Lu, Zhenzhou; Liu, Yu

    2014-01-01

    Validating models with correlated multivariate outputs involves the comparison of multiple stochastic quantities. Considering both uncertainty and correlations among multiple responses from model and physical observations imposes challenges. Existing marginal comparison methods and the hypothesis testing-based methods either ignore correlations among responses or only reach Boolean conclusions (yes or no) without accounting for the amount of discrepancy between a model and the underlying reality. A new validation metric is needed to quantitatively characterize the overall agreement of multiple responses considering correlations among responses and uncertainty in both model predictions and physical observations. In this paper, by extending the concept of “area metric” and the “u-pooling method” developed for validating a single response, we propose new model validation metrics for validating correlated multiple responses using the multivariate probability integral transformation (PIT). One new metric is the PIT area metric for validating multi-responses at a single validation site. The other is the t-pooling metric that allows for pooling observations of multiple responses observed at multiple validation sites to assess the global predictive capability. The proposed metrics have many favorable properties that are well suited for validation assessment of models with correlated responses. The two metrics are examined and compared with the direct area metric and the marginal u-pooling method respectively through numerical case studies and an engineering example to illustrate their validity and potential benefits
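For a single response, the u-pooling idea reduces to transforming each observation through the model's predictive CDF and measuring the area between the empirical CDF of those PIT values and the standard uniform CDF. A minimal single-response sketch (the paper's metrics extend this to correlated multivariate responses via the multivariate PIT):

```python
def _abs_linear_integral(s, a, b):
    """Integral of |s - x| over [a, b], splitting at x = s when needed."""
    if s <= a:
        return ((b - s) ** 2 - (a - s) ** 2) / 2.0
    if s >= b:
        return ((s - a) ** 2 - (s - b) ** 2) / 2.0
    return ((s - a) ** 2 + (b - s) ** 2) / 2.0

def u_pooling_area_metric(u_values):
    """Area between the empirical CDF of PIT values u_i = F_model(y_obs_i)
    and the standard uniform CDF on [0, 1]; 0 means perfect agreement."""
    u = sorted(u_values)
    n = len(u)
    grid = [0.0] + u + [1.0]
    area = 0.0
    for i in range(len(grid) - 1):
        # On (grid[i], grid[i+1]) the empirical CDF is flat at height i / n.
        area += _abs_linear_integral(i / n, grid[i], grid[i + 1])
    return area

# PIT values clustered at 0.5 indicate an over-dispersed predictive model
clustered = u_pooling_area_metric([0.5])         # 0.25
spread    = u_pooling_area_metric([0.25, 0.75])  # 0.125, closer to uniform
```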

  5. Experimental validation of the multiphase extended Leblond's model

    Science.gov (United States)

    Weisz-Patrault, Daniel

    2017-10-01

    Transformation induced plasticity is a crucial contribution to the simulation of several forming processes involving phase transitions under mechanical loads, resulting in large irreversible strains even though the applied stress is below the yield stress. One of the most elegant and widely used models, based on analytic homogenization procedures, was proposed by Leblond et al. [1-4]. Very recently, a simple extension of Leblond's model was developed by Weisz-Patrault [8]. Several product phases are taken into account and several assumptions are relaxed in order to extend the applicability of the model. The present contribution compares experimental tests with numerical computations in order to assess the validity of the developed theory. Experimental results extracted from the existing literature are analyzed. Results show good agreement between measurements and theoretical computations.

  6. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  7. Parameterization and validation of an ungulate-pasture model.

    Science.gov (United States)

    Pekkarinen, Antti-Juhani; Kumpula, Jouko; Tahvonen, Olli

    2017-10-01

    Ungulate grazing and trampling strongly affect pastures and ecosystems throughout the world. Ecological population models are used for studying these systems and determining the guidelines for sustainable and economically viable management. However, the effect of trampling and other resource wastage is either not taken into account or not quantified with data in earlier models. Also, the ability of models to describe the herbivore impact on pastures is usually not validated. We used a detailed model and data to study the level of winter- and summertime lichen wastage by reindeer and the effects of wastage on population sizes and management. We also validated the model with respect to its ability to predict changes in lichen biomass and compared the actual management in herding districts with model results. The modeling efficiency value (0.75) and visual comparison between the model predictions and data showed that the model was able to describe the changes in lichen pastures caused by reindeer grazing and trampling. At the current lichen biomass levels in northernmost Finland, the lichen wastage varied from 0 to 1 times the lichen intake during winter and from 6 to 10 times the intake during summer. With a higher value for wastage, reindeer numbers and net revenues were lower in the economically optimal solutions. Higher wastage also favored the use of supplementary feeding in the optimal steady state. Actual reindeer numbers in the districts were higher than in the optimal steady-state solutions for the model in 18 herding districts out of 20. Synthesis and applications. We show that a complex model can be used for analyzing ungulate-pasture dynamics and sustainable management if the model is parameterized and validated for the system. Wastage levels caused by trampling and other causes should be quantified with data, as they strongly affect the results and management recommendations.
Summertime lichen wastage caused by reindeer is higher than expected, which
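The "modeling efficiency" quoted in this record (0.75) is conventionally the Nash-Sutcliffe efficiency, which compares squared model errors against the variance of the observations. A minimal sketch, with the caveat that the exact formula used in the paper is assumed rather than stated here, and the biomass values are hypothetical:

```python
def modelling_efficiency(observed, predicted):
    """Nash-Sutcliffe model efficiency: 1 - SSE / SST.
    1.0 is a perfect fit; 0 means no better than predicting the observed mean."""
    obs_mean = sum(observed) / len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    sst = sum((o - obs_mean) ** 2 for o in observed)
    return 1.0 - sse / sst

# hypothetical lichen biomass observations vs. model predictions (kg/ha)
obs = [120.0, 95.0, 80.0, 60.0, 55.0]
pred = [110.0, 100.0, 75.0, 65.0, 50.0]
print(round(modelling_efficiency(obs, pred), 3))  # → 0.929
```

Values near 1 indicate that the model explains most of the observed variance, as reported in the study.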

  8. Validation of 2D flood models with insurance claims

    Science.gov (United States)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation and for flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model against insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data in a high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentation reporting flooded areas and a dataset of insurance claims. In three out of four test cases, the model fit relating to insurance claims is slightly lower than the model fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics using insurance claims are comparable to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
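Conventional validation metrics for binary flooded/dry maps, of the kind referred to above, reduce to a contingency table of modelled versus observed cells. A sketch under the assumption that both datasets have been rasterised to the same grid (the flattened example grids are hypothetical, not from the study):

```python
def flood_fit_metrics(modelled, observed):
    """Contingency counts between modelled and observed flooded cells (1 = wet).
    Returns the hit rate and the critical success index
    F = hits / (hits + misses + false alarms)."""
    hits = sum(1 for m, o in zip(modelled, observed) if m and o)
    misses = sum(1 for m, o in zip(modelled, observed) if not m and o)
    false_alarms = sum(1 for m, o in zip(modelled, observed) if m and not o)
    return hits / (hits + misses), hits / (hits + misses + false_alarms)

# hypothetical flattened raster grids: 1 = flooded, 0 = dry
modelled = [1, 1, 1, 0, 0, 1, 0, 0]
observed = [1, 1, 0, 0, 1, 1, 0, 0]
hit_rate, csi = flood_fit_metrics(modelled, observed)
print(hit_rate, csi)  # → 0.75 0.6
```

For insurance claims, the "observed" vector would instead mark cells containing at least one claim, which is what makes that validation more conservative where asset density is high.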

  9. NAIG/Toshiba results using ARKAS for stage 2 validation

    International Nuclear Information System (INIS)

    Nakagawa, M.; Tsuboi, Y.

    1990-01-01

    This report contains the NAIG results for the problems set in stage 2 of the co-ordinated research programme (CRP) on the Intercomparison of LMFBR Core Mechanics Codes, in accordance with IAEA research agreement No. 4411/R1/CF. All the problems were completed and solved with the core structural mechanics code ARKAS. (author). 3 refs, figs and tabs

  10. Validation of an axial flow blood pump: computational fluid dynamics results using particle image velocimetry.

    Science.gov (United States)

    Su, Boyang; Chua, Leok Poh; Wang, Xikun

    2012-04-01

    A magnetically suspended axial flow blood pump is studied experimentally in this article. The pump casing encloses a three-blade straightener, a two-blade impeller shrouded by a permanent-magnet-embedded cylinder, and a three-blade diffuser. The internal flow fields were simulated earlier using computational fluid dynamics (CFD), and the pump characteristic curves were determined. The simulation results showed that the internal flow field was basically streamlined, except in the diffuser region. Particle image velocimetry (PIV) measurement of the 1:1 pump model was conducted to validate the CFD results. In order to ensure optical access, an acrylic prototype was fabricated with the impeller driven by a servomotor instead, as the magnet is opaque. In addition to the transparent model, a blood analog fluid with a refractive index close to that of acrylic was used to avoid refraction. According to the CFD results, the axial flow blood pump could generate adequate pressure head at a rotating speed of 9500 rpm and a flow rate of 5 L/min, and the same flow condition was applied during the PIV measurement. Through the comparisons, it was found that the experimental results were close to those obtained by CFD and thus validated the CFD model, which can complement the limitations of the measurement in assessing the more detailed flow fields of the axial flow pump. © 2011, Copyright the Authors. Artificial Organs © 2011, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
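A PIV-versus-CFD comparison of the kind described above typically reduces, at matched measurement points, to a normalised deviation between the two velocity fields. The metric and the (u, v) samples below are illustrative assumptions, not the ones used in the article:

```python
import math

def velocity_deviation(cfd, piv):
    """Mean magnitude of the vector difference between CFD and PIV
    velocities at matched points, normalised by the mean PIV speed."""
    diff = [math.hypot(uc - up, vc - vp) for (uc, vc), (up, vp) in zip(cfd, piv)]
    ref = [math.hypot(up, vp) for up, vp in piv]
    return (sum(diff) / len(diff)) / (sum(ref) / len(ref))

# hypothetical (u, v) velocity samples in m/s at matched measurement points
cfd = [(2.0, 0.1), (1.8, 0.3), (1.5, 0.6)]
piv = [(2.1, 0.1), (1.7, 0.4), (1.4, 0.5)]
print(round(velocity_deviation(cfd, piv), 3))
```

A small normalised deviation is the quantitative counterpart of the qualitative "close agreement" statement in the abstract.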

  11. Utilization of paleoclimate results to validate projections of a future greenhouse warming

    International Nuclear Information System (INIS)

    Crowley, T.J.

    1991-01-01

    Paleoclimate data provide a rich source of information for testing projections of future greenhouse trends. This paper summarizes the present state of the art in assessing two important climate problems. (1) Validation of climate models: The same climate models that have been used to make greenhouse forecasts have also been used for paleoclimate simulations. Comparisons of model results and observations indicate some impressive successes but also some cases where there are significant divergences between models and observations. However, special conditions associated with the impressive successes could lead to a false confidence in the models; the disagreements are a topic of greater concern. It remains to be determined whether the disagreements are due to model limitations or to uncertainties in the geologic data. (2) Role of CO2 as a significant climate feedback: Paleoclimate studies indicate that the climate system is generally more sensitive than our ability to model it. Addition or subtraction of CO2 leads to closer agreement between models and observations. In this respect, paleoclimate results in general support the conclusion that CO2 is an important climate feedback, with the magnitude of the feedback approximately comparable to the sensitivity of present climate models. If the CO2 projections are correct, comparison of the future warming with past warm periods indicates that there may be no geologic analogs for a future warming; the future greenhouse climate may represent a unique climate realization in earth history. 55 refs.; 5 figs

  12. Utilization of paleoclimate results to validate projections of a future greenhouse warming

    International Nuclear Information System (INIS)

    Crowley, T.J.

    1990-01-01

    Paleoclimate data provide a rich source of information for testing projections of future greenhouse trends. This paper summarizes the present state of the art in assessing two important climate problems. (1) Validation of climate models: The same climate models that have been used to make greenhouse forecasts have also been used for paleoclimate simulations. Comparisons of model results and observations indicate some impressive successes but also some cases where there are significant divergences between models and observations. However, special conditions associated with the impressive successes could lead to a false confidence in the models; the disagreements are a topic of greater concern. It remains to be determined whether the disagreements are due to model limitations or to uncertainties in the geologic data. (2) Role of CO2 as a significant climate feedback: Paleoclimate studies indicate that the climate system is generally more sensitive than our ability to model it. Addition or subtraction of CO2 leads to closer agreement between models and observations. In this respect, paleoclimate results in general support the conclusion that CO2 is an important climate feedback, with the magnitude of the feedback approximately comparable to the sensitivity of present climate models. If the CO2 projections are correct, comparison of the future warming with past warm periods indicates that there may be no geologic analogs for a future warming; the future greenhouse climate may represent a unique climate realization in earth history

  13. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the Knowledge Workers Productivity Approach. It is hoped that this paper will help managers to implement different corresponding measures. A case study is presented in which this model is measured and validated at the ...

  14. Validation of mentorship model for newly qualified professional ...

    African Journals Online (AJOL)

    Newly qualified professional nurses (NQPNs) allocated to community health care services require the use of a validated model to practice independently. Validation was done to adapt the model and to assess whether it is understood and could be implemented by NQPNs and mentors employed in community health care services.

  15. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes the validation of OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight into the scalability and performance of future deployed networks. Because validated models of key Cisco equipment

  16. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    are detected, among additional criteria. Self-intersection might lead to different results, e.g. intersection points, lines or areas. Depending on the geometric constellation, they might represent gaps between bounding polygons of the solids, overlaps, or violations of 2-manifoldness. Not least due to the floating-point representation of digital numbers, tolerances must be considered in some algorithms, e.g. planarity and solid self-intersection tests. Effects of different tolerance values and their handling are discussed; recommendations for suitable values are given. The goal of the paper is to give a clear understanding of geometric validation in the context of 3D city models. This should also enable the data holder to get a better comprehension of the validation results and their consequences for the deployment fields of the validated data set.
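As an illustration of the tolerance issue discussed above, a planarity check for a polygon face must accept vertices that deviate from the reference plane by less than some tolerance. A minimal sketch (the plane is taken from the first three vertices rather than least-squares fitted, and the tolerance value is only a placeholder):

```python
import math

def is_planar(vertices, tol=0.01):
    """True if all vertices of a 3D polygon lie within `tol` of the plane
    spanned by its first three (non-collinear) vertices."""
    (x0, y0, z0), (x1, y1, z1), (x2, y2, z2) = vertices[:3]
    ax, ay, az = x1 - x0, y1 - y0, z1 - z0
    bx, by, bz = x2 - x0, y2 - y0, z2 - z0
    # unit normal of the plane: cross product of the two edge vectors
    nx, ny, nz = ay * bz - az * by, az * bx - ax * bz, ax * by - ay * bx
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    if norm == 0.0:
        raise ValueError("first three vertices are collinear")
    nx, ny, nz = nx / norm, ny / norm, nz / norm
    return all(abs((x - x0) * nx + (y - y0) * ny + (z - z0) * nz) <= tol
               for x, y, z in vertices[3:])

flat = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
warped = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0.05)]
print(is_planar(flat), is_planar(warped))  # → True False
```

Raising or lowering `tol` flips borderline faces between valid and invalid, which is exactly why the paper recommends specific tolerance values.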

  17. Exact results for chaotic scattering and experimental validation

    Science.gov (United States)

    Guhr, Thomas

    2017-12-01

    As scattering experiments are the key tool for obtaining information on nuclei and other quantum systems, the foundations of scattering theory were laid a long time ago. Compound nucleus scattering prompted the study of generic statistical features. In the Heidelberg approach, these are taken into account by assuming that the reaction zone is fully quantum chaotic. Later on, this approach turned out to be applicable to a large variety of systems on different scales, including classical wave systems. For a long time, the distribution of the off-diagonal scattering-matrix elements resisted analytical treatment. In two recent studies [1, 2], we fully solved this problem and also carried out a comparison with data from microwave experiments. Some comments are made on our very recent results for the cross-section distributions.

  18. Validation of the WATEQ4 geochemical model for uranium

    International Nuclear Information System (INIS)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite [UO2(OH)2·H2O], UO2(OH)2, and rutherfordine (UO2CO3) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions, and of uranyl hydroxide and phosphate solubilities over the same pH range under oxygenated conditions.
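Speciation-solubility codes such as WATEQ4 identify solubility controls by computing a saturation index for each candidate solid; an SI near zero means the solid plausibly controls the dissolved concentration. A generic sketch of that test (this is not WATEQ4's internal routine, and the activity and solubility-product values are hypothetical):

```python
import math

def saturation_index(iap, k_sp):
    """SI = log10(IAP / Ksp). SI ≈ 0: water is in equilibrium with the solid
    (a candidate solubility control); SI < 0: undersaturated; SI > 0: supersaturated."""
    return math.log10(iap / k_sp)

# hypothetical ion activity product and solubility product for a uranyl solid
print(round(saturation_index(2.0e-23, 1.0e-23), 2))  # → 0.3
```

Comparing the SI computed from a laboratory solution against zero is, in essence, the "excellent agreement" test reported in the abstract.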

  19. Proceedings of the first SRL model validation workshop

    International Nuclear Information System (INIS)

    Buckner, M.R.

    1981-10-01

    The Clean Air Act and its amendments have added importance to knowing the accuracy of mathematical models used to assess the transport and diffusion of environmental pollutants. These models are the link between air quality standards and emissions. To test the accuracy of a number of these models, a Model Validation Workshop was held. The meteorological, source-term, and Kr-85 concentration data bases for emissions from the separations areas of the Savannah River Plant during 1975 through 1977 were used to compare calculations from various atmospheric dispersion models. The results of statistical evaluation of the models show a degradation in the ability to predict pollutant concentrations as the time span over which the calculations are made is reduced. Forecasts for annual time periods were reasonably accurate. Weighted-average squared correlation coefficients (R²) were 0.74 for annual, 0.28 for monthly, 0.21 for weekly, and 0.18 for twice-daily predictions. Model performance varied within each of these four categories; however, the results indicate that the more complex, three-dimensional models provide only marginal increases in accuracy. The increased cost of running these codes is not warranted for long-term releases or for conditions of relatively simple terrain and meteorology. The overriding factor in calculational accuracy is the accurate description of the wind field. Further improvement of the numerical accuracy of the complex models is not nearly as important as accurate calculation of the meteorological transport conditions
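The weighted-average squared correlation coefficients quoted above can be reproduced for any model/observation pairing with a plain Pearson R² and a weighted mean. A sketch (the weights would in practice be, e.g., the number of predictions per model or category; the counts below are hypothetical, only the R² values come from the record):

```python
def r_squared(pred, obs):
    """Squared Pearson correlation between predictions and observations."""
    n = len(pred)
    mp, mo = sum(pred) / n, sum(obs) / n
    cov = sum((p - mp) * (o - mo) for p, o in zip(pred, obs))
    vp = sum((p - mp) ** 2 for p in pred)
    vo = sum((o - mo) ** 2 for o in obs)
    return cov * cov / (vp * vo)

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# R² per averaging period (from the record) with hypothetical prediction counts
r2 = [0.74, 0.28, 0.21, 0.18]
n_predictions = [3, 36, 156, 2190]
print(round(weighted_mean(r2, n_predictions), 3))  # → 0.184
```

Note that R² is scale- and offset-invariant, so a model can score well here while still being biased; that is one reason such workshops also report other statistics.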

  20. Implementation and Validation of IEC Generic Type 1A Wind Turbine Generator Model

    DEFF Research Database (Denmark)

    Zhao, Haoran; Wu, Qiuwei; Margaris, Ioannis

    2015-01-01

    This paper presents the implementation of the International Electrotechnical Commission (IEC) generic Type 1A wind turbine generator (WTG) model in Power Factory (PF) and the validation of the implemented model against field measurements. The IEC generic Type 1A WTG model structure is briefly described. The details are explained regarding how the two-mass mechanical model is implemented when the generator mass is included in the PF built-in generator model. In order to verify the IEC generic Type 1A WTG model, the model-to-field-measurement validation method was employed: the validation of the implemented model was carried out by using the "play-back" approach and the measurement data from Siemens Wind Power. The results of the model-to-field-measurement validation show that there is a good match between the simulation results and the measurements. The errors between ...

  1. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet, especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise. Firn compaction ... of firn compaction to correct ICESat measurements and assessing the present mass loss of the Greenland ice sheet. Validation of the model against the radar data gives good results and confidence in using the model to answer important questions, such as: How large is the firn compaction correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from ...

  2. Development and Validation of a 3-Dimensional CFB Furnace Model

    Science.gov (United States)

    Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

    At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of 3-dimensional process modeling, providing a chain of knowledge that is utilized as feedback for phenomenon research. Knowledge gathered in model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to the modeling of combustion and the formation of char and volatiles for various fuel types in CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window for characterizing fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace, and are used together with lateral temperature profiles at the bed and in the upper parts of the furnace to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed due to the lower furnace design and the mixing characteristics of fuel and combustion air, which affect the formation of the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents

  3. Contaminant transport model validation: The Oak Ridge Reservation

    International Nuclear Information System (INIS)

    Lee, R.R.; Ketelle, R.H.

    1988-09-01

    In the complex geologic setting of the Oak Ridge Reservation, hydraulic conductivity is anisotropic and flow is strongly influenced by an extensive and largely discontinuous fracture network. Difficulties in describing and modeling the aquifer system prompted a study to obtain aquifer property data to be used in a groundwater flow model validation experiment. Characterization studies included the performance of an extensive suite of aquifer tests within a 600-square-meter area to obtain aquifer property values describing the flow field in detail. Following the aquifer tests, a groundwater tracer test was performed under ambient conditions to verify the aquifer analysis. Tracer migration data in the near field were used in model calibration to predict tracer arrival time and concentration in the far field. Despite the extensive aquifer testing, initial modeling inaccurately predicted the tracer migration direction. Initial tracer migration rates were consistent with those predicted by the model; however, changing environmental conditions resulted in an unanticipated decay in tracer movement. Evaluating the predictive accuracy of groundwater flow and contaminant transport models on the Oak Ridge Reservation depends on defining the required resolution, followed by field testing and model grid definition at compatible scales. The use of tracer tests, both as a characterization method and to verify model results, provides the highest level of resolution of groundwater flow characteristics. 3 refs., 4 figs

  4. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components, either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10¹ to 10² image descriptors instead of the 10⁵ or 10⁶ pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
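As a concrete example of the decomposition idea, a strain map treated as an image can be reduced to a handful of low-order Fourier coefficients, and the model and experiment then compared descriptor by descriptor. This brute-force DFT sketch is illustrative only (the article uses Zernike moments and fast transforms on much larger fields):

```python
import cmath
import math

def fourier_descriptors(field, order=3):
    """Magnitudes of the low-order 2D DFT coefficients of a scalar field
    (e.g. a strain map), used as a compact set of image descriptors."""
    rows, cols = len(field), len(field[0])
    return [
        abs(sum(field[m][n] * cmath.exp(-2j * cmath.pi * (u * m / rows + v * n / cols))
                for m in range(rows) for n in range(cols)))
        for u in range(order) for v in range(order)
    ]

def descriptor_distance(a, b):
    """Euclidean distance between two descriptor vectors
    (experiment vs. model), the basis of a statistical comparison."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# hypothetical 3x3 strain maps from "experiment" and "model"
exp_field = [[1.0, 1.1, 1.0], [1.2, 1.4, 1.2], [1.0, 1.1, 1.0]]
mod_field = [[1.0, 1.0, 1.0], [1.2, 1.3, 1.2], [1.0, 1.0, 1.0]]
print(round(descriptor_distance(fourier_descriptors(exp_field),
                                fourier_descriptors(mod_field)), 3))
```

With realistic fields, the 10¹-10² descriptors play the role of the "key features" mentioned in the abstract, replacing a pixel-by-pixel comparison.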

  5. Validation Analysis of the Shoal Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    A. Hassan; J. Chapman

    2008-11-01

    Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of ground water flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of

  6. The concept of validation of numerical models for consequence analysis

    International Nuclear Information System (INIS)

    Borg, Audun; Paulsen Husted, Bjarne; Njå, Ove

    2014-01-01

    Numerical models such as computational fluid dynamics (CFD) models are increasingly used in life safety studies and other types of analyses to calculate the effects of fire and explosions. The validity of these models is usually established by benchmark testing. This is done to quantitatively measure the agreement between the predictions provided by the model and the real world represented by observations in experiments. This approach assumes that all variables in the real world relevant for the specific study are adequately measured in the experiments and in the predictions made by the model. In this paper the various definitions of validation for CFD models used for hazard prediction are investigated to assess their implication for consequence analysis in a design phase. In other words, how is uncertainty in the prediction of future events reflected in the validation process? The sources of uncertainty are viewed from the perspective of the safety engineer. An example of the use of a CFD model is included to illustrate the assumptions the analyst must make and how these affect the prediction made by the model. The assessments presented in this paper are based on a review of standards and best practice guides for CFD modeling and the documentation from two existing CFD programs. Our main thrust has been to assess how validation work is performed and communicated in practice. We conclude that the concept of validation adopted for numerical models is adequate in terms of model performance. However, it does not address the main sources of uncertainty from the perspective of the safety engineer. Uncertainty in the input quantities describing future events, which are determined by the model user, outweighs the inaccuracies in the model as reported in validation studies. - Highlights: • Examine the basic concept of validation applied to models for consequence analysis. • Review standards and guides for validation of numerical models. • Comparison of the validation

  7. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian

    2010-01-01

    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases...... noise emission, trying at the same time to preserve some of its aerodynamic and geometric characteristics. The new designs are characterized by less cambered airfoils and flatter suction sides. The resulting noise reductions seem to be mainly achieved by a reduction in the turbulent kinetic energy...

  8. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    Science.gov (United States)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-12-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase its efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency performance of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of this current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
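For context, the maximum conversion efficiency of a thermoelectric generator follows the standard textbook expression combining the Carnot factor with the device figure of merit ZT. The sketch below uses that standard formula with illustrative temperatures and ZT, not the operating point from the article:

```python
import math

def teg_max_efficiency(t_hot, t_cold, zt):
    """Maximum TEG conversion efficiency: the Carnot factor times a
    material factor that grows with the (average) figure of merit ZT."""
    carnot = (t_hot - t_cold) / t_hot
    m = math.sqrt(1.0 + zt)
    return carnot * (m - 1.0) / (m + t_cold / t_hot)

# illustrative operating point (temperatures in kelvin)
print(round(teg_max_efficiency(500.0, 300.0, 1.0), 3))  # → 0.082
```

The formula makes clear why efficiencies in the few-percent range are already notable: the ZT-dependent factor stays well below 1 for available materials, so the Carnot limit is never approached.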

  9. Model Validation for Simulations of Vehicle Systems

    Science.gov (United States)

    2012-08-01

    ...jackknife”, Annals of Statistics, 7:1-26, 1979. [45] B. Efron and G. Gong, “A leisurely look at the bootstrap, the jackknife, and cross-validation”, The... ...Sandia National Laboratories and a battery model developed in the Automotive Research Center, a US Army Center of Excellence for modeling and simulation of ground vehicle systems...

  10. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    The national policies for the education/training of adults are in the 21st century highly influenced by proposals formulated and promoted by the European Union (EU) as well as other transnational players, and this shift in policy making has consequences. One is that ideas which in the past would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to grant credits to those students who came with experience from working life....

  11. Using virtual reality to validate system models

    Energy Technology Data Exchange (ETDEWEB)

    Winter, V.L.; Caudell, T.P.

    1999-12-09

    To date most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks) or informal (in the case of code inspections). The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

  12. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  13. Validating Computational Cognitive Process Models across Multiple Timescales

    Science.gov (United States)

    Myers, Christopher; Gluck, Kevin; Gunzelmann, Glenn; Krusmark, Michael

    2010-12-01

    Model comparison is vital to evaluating progress in the fields of artificial general intelligence (AGI) and cognitive architecture. As they mature, AGI and cognitive architectures will become increasingly capable of providing a single model that completes a multitude of tasks, some of which the model was not specifically engineered to perform. These models will be expected to operate for extended periods of time and serve functional roles in real-world contexts. Questions arise regarding how to evaluate such models appropriately, including issues pertaining to model comparison and validation. In this paper, we specifically address model validation across multiple levels of abstraction, using an existing computational process model of unmanned aerial vehicle basic maneuvering to illustrate the relationship between validity and timescales of analysis.

  14. Validation of elk resource selection models with spatially independent data

    Science.gov (United States)

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  15. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  16. Amendment to Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of WP2 is to establish flow models relating the wind speed at turbines in a farm. Until now, active control of power reference has not been included in these models as only data with standard operation has been available. In this report the first data series with power reference excit...... turbine in undisturbed flow. For this data set both the multiplicative model and in particular the simple first order transfer function model can predict the down wind wind speed from upwind wind speed and loading.......The purpose of WP2 is to establish flow models relating the wind speed at turbines in a farm. Until now, active control of power reference has not been included in these models as only data with standard operation has been available. In this report the first data series with power reference...

  17. Ion channel model development and validation

    Science.gov (United States)

    Nelson, Peter Hugo

    2010-03-01

    The structure of the KcsA ion channel selectivity filter is used to develop three simple models of ion channel permeation. The quantitative predictions of the knock-on model are tested by comparison with experimental data from single-channel recordings of the KcsA channel. By comparison with experiment, students discover that the knock-on model can't explain saturation of ion channel current as the concentrations of the bathing solutions are increased. By inverting the energy diagram, students derive the association-dissociation model of ion channel permeation. This model predicts non-linear Michaelis-Menten saturating behavior that requires students to perform non-linear least-squares fits to the experimental data. This is done using Excel's solver feature. Students discover that this simple model does an excellent job of explaining the qualitative features of ion channel permeation but cannot account for changes in voltage sensitivity. The model is then extended to include an electrical dissociation distance. This rapid translocation model is then compared with experimental data from a wide variety of ion channels and students discover that this model also has its limitations. Support from NSF DUE 0836833 is gratefully acknowledged.
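The saturating Michaelis-Menten behavior described above (fitted with Excel's solver in the course) can also be sketched numerically. The sketch below uses the Lineweaver-Burk linearization rather than a nonlinear solver, and the concentration/current values are invented for illustration.

```python
# Sketch of fitting the Michaelis-Menten saturation of single-channel current.
# The "data" below are invented; a Lineweaver-Burk linearization stands in for
# the nonlinear least-squares fit done with Excel's solver in the abstract.
import numpy as np

def mm_current(conc, i_max, k_m):
    """Saturating single-channel current vs. bathing-solution concentration."""
    return i_max * conc / (k_m + conc)

conc = np.array([10.0, 25.0, 50.0, 100.0, 200.0, 400.0])   # mM (assumed)
current = mm_current(conc, 20.0, 60.0)                      # pA, noise-free toy data

# Lineweaver-Burk: 1/i = (K_m / i_max) * (1/c) + 1/i_max, a straight line.
slope, intercept = np.polyfit(1.0 / conc, 1.0 / current, 1)
i_max_fit = 1.0 / intercept
k_m_fit = slope * i_max_fit
```

With noise-free toy data the linearization recovers the generating parameters exactly; with real single-channel data a nonlinear fit is preferred because the reciprocal transform amplifies noise at low concentrations.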

  18. Contributions to the validation of the CJS model for granular materials

    Science.gov (United States)

    Elamrani, Khadija

    1992-07-01

    Behavior model validation in the field of geotechnics is addressed, with the objective of showing the advantages and limits of the CJS (Cambou Jafari Sidoroff) behavior model for granular materials. Several levels are addressed: theoretical analysis of the CJS model to reveal consistence and first capacities; shaping (followed by validation by confrontation with other programs) of a computation code by finite elements (FINITEL) to integrate this model and prepare it for complex applications; validation of the code/model structure thus constituted by comparing its results to those of experiments in the case of nonhomogeneous (superficial foundations) problems.

  19. The Fruit & Vegetable Screener in the 2000 California Health Interview Survey: Validation Results

    Science.gov (United States)

    In this study, multiple 24-hour recalls in conjunction with a measurement error model were used to assess validity. The screeners used in the EATS included additional foods and reported portion sizes.

  20. High Turbidity Solis Clear Sky Model: Development and Validation

    Directory of Open Access Journals (Sweden)

    Pierre Ineichen

    2018-03-01

    Full Text Available The Solis clear sky model is a spectral scheme based on radiative transfer calculations and the Lambert–Beer relation. Its broadband version is a simplified, fast analytical version; it is limited to broadband aerosol optical depths lower than 0.45, which is a weakness when applied in countries with very high turbidity such as China or India. In order to extend the use of the original simplified version of the model to high turbidity values, we developed a new version of the broadband Solis model based on radiative transfer calculations, valid for turbidity values up to 7, for the three components (global, beam, and diffuse) and for the four aerosol types defined by Shettle and Fenn. A validation against low-turbidity data acquired in Geneva shows slightly better results than the previous version. On data acquired at sites with higher turbidity, the bias stays within ±4% for the beam and global irradiances, and the standard deviation is around 5% for clean and stable conditions and around 12% for questionable data and variable sky conditions.
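The bias and standard-deviation statistics quoted above are commonly expressed relative to the mean observed irradiance; the exact normalization used in the paper may differ. A minimal sketch with invented values:

```python
# Minimal sketch of relative bias / standard deviation statistics of the kind
# used to validate clear-sky irradiance models; the numbers below are invented.
import numpy as np

def validation_stats(modelled, observed):
    """Return (relative bias %, relative standard deviation %) of model errors."""
    err = np.asarray(modelled) - np.asarray(observed)
    mean_obs = np.mean(observed)
    bias_pct = 100.0 * np.mean(err) / mean_obs
    std_pct = 100.0 * np.std(err) / mean_obs
    return bias_pct, std_pct

obs = np.array([820.0, 640.0, 710.0, 530.0])       # W/m2, toy observations
mod = np.array([810.0, 655.0, 700.0, 545.0])       # W/m2, toy model output
bias, std = validation_stats(mod, obs)
```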

  1. A validated physical model of greenhouse climate.

    NARCIS (Netherlands)

    Bot, G.P.A.

    1989-01-01

    In the greenhouse model the instantaneous environmental crop growth factors are calculated as output, together with the physical behaviour of the crop. The boundary conditions for this model are the outside weather conditions; other inputs are the physical characteristics of the crop, of the

  2. Extending Model Checking To Object Process Validation

    NARCIS (Netherlands)

    van Rein, H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent

  3. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  4. Validating modeled turbulent heat fluxes across large freshwater surfaces

    Science.gov (United States)

    Lofgren, B. M.; Fujisaki-Manome, A.; Gronewold, A.; Anderson, E. J.; Fitzpatrick, L.; Blanken, P.; Spence, C.; Lenters, J. D.; Xiao, C.; Charusambot, U.

    2017-12-01

    Turbulent fluxes of latent and sensible heat are important physical processes that influence the energy and water budgets of the Great Lakes. Validation and improvement of bulk flux algorithms to simulate these turbulent heat fluxes are critical for accurate prediction of hydrodynamics, water levels, weather, and climate over the region. Here we consider five heat flux algorithms from several model systems: the Finite-Volume Community Ocean Model (FVCOM), the Weather Research and Forecasting model, and the Large Lake Thermodynamics Model, which are used in research and operational environments and concentrate on different aspects of the Great Lakes' physical system, but interface at the lake surface. The heat flux algorithms were isolated from each model and driven by meteorological data from over-lake stations in the Great Lakes Evaporation Network. The simulation results were compared with eddy covariance flux measurements at the same stations. All models show the capacity to capture the seasonal cycle of the turbulent heat fluxes. Overall, the Coupled Ocean Atmosphere Response Experiment algorithm in FVCOM has the best agreement with the eddy covariance measurements. Simulations with the other four algorithms are overall improved by updating the parameterization of the roughness length scales of temperature and humidity. Agreement between modelled and observed fluxes varied notably with the geographical locations of the stations. For example, at the Long Point station in Lake Erie, observed fluxes are likely influenced by the upwind land surface; the simulations do not account for this land surface influence, and therefore the agreement there is generally worse.
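Bulk flux algorithms of the kind compared above share the standard aerodynamic form; a toy sketch follows, with transfer coefficients held constant for simplicity (real algorithms such as COARE make them depend on stability and the roughness lengths mentioned above).

```python
# Toy bulk formulas for sensible and latent heat flux over water.
# Constant transfer coefficients are assumed here for illustration; operational
# algorithms vary them with atmospheric stability and surface roughness.
RHO_AIR = 1.2       # kg/m3, air density
CP_AIR = 1004.0     # J/(kg K), specific heat of air
LV = 2.5e6          # J/kg, latent heat of vaporization
C_H = 1.3e-3        # sensible-heat transfer coefficient (assumed)
C_E = 1.3e-3        # latent-heat transfer coefficient (assumed)

def sensible_heat_flux(wind, t_surface, t_air):
    """Q_H = rho * cp * C_H * U * (Ts - Ta), in W/m2."""
    return RHO_AIR * CP_AIR * C_H * wind * (t_surface - t_air)

def latent_heat_flux(wind, q_surface, q_air):
    """Q_E = rho * Lv * C_E * U * (qs - qa), in W/m2."""
    return RHO_AIR * LV * C_E * wind * (q_surface - q_air)

# Example: 8 m/s wind, lake surface 4 K warmer than air, modest humidity gradient.
qh = sensible_heat_flux(8.0, 279.0, 275.0)
qe = latent_heat_flux(8.0, 0.0060, 0.0045)
```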

  5. A Report on the Validation of Beryllium Strength Models

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Derek Elswick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-05

    This report discusses work on validating beryllium strength models with flyer plate and Taylor rod experimental data. Strength models are calibrated with Hopkinson bar and quasi-static data. The Hopkinson bar data for beryllium provide strain rates up to about 4000 per second. A limitation of the Hopkinson bar data for beryllium is that they only provide information on strain up to about 0.15. The lack of high-strain data at high strain rates makes it difficult to distinguish between various strength model settings. The PTW model has been calibrated many different times over the last 12 years. The lack of high-strain data at high strain rates has resulted in these calibrated PTW models for beryllium exhibiting significantly different behavior when extrapolated to high strain. For beryllium, the α parameter of PTW has recently been calibrated to high-precision shear modulus data. In the past the α value for beryllium was set based on expert judgment. The new α value for beryllium was used in a calibration of the beryllium PTW model by Sky Sjue. The calibration by Sjue used EOS table information to model the temperature dependence of the heat capacity. Also, the calibration by Sjue used EOS table information to model the density changes of the beryllium sample during the Hopkinson bar and quasi-static experiments. In this paper, the calibrated PTW model by Sjue is compared against experimental data and other strength models. The other strength models considered are a PTW model calibrated by Shuh-Rong Chen and a Steinberg-Guinan type model by John Pedicini. The three strength models are used in a comparison against flyer plate and Taylor rod data. The results show that the Chen PTW model provides better agreement with these data. The Chen PTW model settings have been previously adjusted to provide a better fit to flyer plate data, whereas the Sjue PTW model has not been changed based on flyer plate data. However, the Sjue model provides a reasonable fit to

  6. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burger's equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented including error band based, multivariate, sum of squares of residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made.
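The Monte Carlo propagation of input uncertainty through the damped spring-mass example can be sketched as follows; the parameter distributions below are assumed for illustration, not taken from the tutorial.

```python
# Monte Carlo propagation of parameter uncertainty through a damped
# spring-mass model, in the spirit of the tutorial; distributions are assumed.
import math
import random

def damped_frequency(k, m, c):
    """Damped natural frequency (rad/s) of a spring-mass-damper system."""
    wn = math.sqrt(k / m)                       # undamped natural frequency
    zeta = c / (2.0 * math.sqrt(k * m))         # damping ratio
    return wn * math.sqrt(1.0 - zeta ** 2)

random.seed(0)
samples = []
for _ in range(10000):
    k = random.gauss(100.0, 5.0)    # N/m, stiffness (assumed distribution)
    m = random.gauss(1.0, 0.02)     # kg, mass (assumed distribution)
    c = random.gauss(0.5, 0.05)     # N s/m, damping (assumed distribution)
    samples.append(damped_frequency(k, m, c))

mean_wd = sum(samples) / len(samples)
std_wd = (sum((s - mean_wd) ** 2 for s in samples) / len(samples)) ** 0.5
```

The sample standard deviation of the output is the propagated prediction uncertainty; the sensitivity method mentioned in the abstract approximates the same quantity from local derivatives instead of sampling.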

  7. A validation study of a stochastic model of human interaction

    Science.gov (United States)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time, and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree

  8. Design-validation of a hand exoskeleton using musculoskeletal modeling.

    Science.gov (United States)

    Hansen, Clint; Gosselin, Florian; Ben Mansour, Khalil; Devos, Pierre; Marin, Frederic

    2018-04-01

    Exoskeletons are progressively reaching homes and workplaces, allowing interaction with virtual environments, remote control of robots, or assisting human operators in carrying heavy loads. Their design is however still a challenge as these robots, being mechanically linked to the operators who wear them, have to meet ergonomic constraints besides usual robotic requirements in terms of workspace, speed, or efforts. They have in particular to fit the anthropometry and mobility of their users. This traditionally results in numerous prototypes which are progressively fitted to each individual person. In this paper, we propose instead to validate the design of a hand exoskeleton in a fully digital environment, without the need for a physical prototype. The purpose of this study is thus to examine whether finger kinematics are altered when using a given hand exoskeleton. Therefore, user-specific musculoskeletal models were created and driven by a motion capture system to evaluate the fingers' joint kinematics when performing two industry-related tasks. The kinematic chain of the exoskeleton was added to the musculoskeletal models and its compliance with the hand movements was evaluated. Our results show that the proposed exoskeleton design does not influence fingers' joint angles, the coefficient of determination between the model with and without exoskeleton being consistently high (mean R² = 0.93) and the nRMSE consistently low (mean nRMSE = 5.42°). These results are promising and this approach combining musculoskeletal and robotic modeling driven by motion capture data could be a key factor in the ergonomic validation of the design of orthotic devices and exoskeletons prior to manufacturing. Copyright © 2017 Elsevier Ltd. All rights reserved.
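The agreement metrics reported above, coefficient of determination and RMSE between joint-angle trajectories with and without the exoskeleton, can be sketched as follows; the two short angle traces are invented for illustration.

```python
# Sketch of the R^2 and RMSE agreement metrics used to compare joint-angle
# trajectories with and without an exoskeleton; the toy traces are invented.
import math

def r_squared(reference, estimate):
    """Coefficient of determination of estimate against reference."""
    mean_ref = sum(reference) / len(reference)
    ss_res = sum((r - e) ** 2 for r, e in zip(reference, estimate))
    ss_tot = sum((r - mean_ref) ** 2 for r in reference)
    return 1.0 - ss_res / ss_tot

def rmse(reference, estimate):
    """Root-mean-square error between the two traces (same units as input)."""
    return math.sqrt(sum((r - e) ** 2 for r, e in zip(reference, estimate))
                     / len(reference))

# Toy flexion-angle traces (degrees) without and with the exoskeleton.
without_exo = [0.0, 15.0, 35.0, 55.0, 70.0, 60.0, 30.0, 5.0]
with_exo    = [1.0, 14.0, 37.0, 53.0, 71.0, 59.0, 31.0, 4.0]
```

In the paper the RMSE is additionally normalized (nRMSE); the normalization denominator (e.g. the range of the reference trace) is not specified in the abstract, so it is omitted here.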

  9. Parental modelling of eating behaviours: observational validation of the Parental Modelling of Eating Behaviours scale (PARM).

    Science.gov (United States)

    Palfreyman, Zoe; Haycraft, Emma; Meyer, Caroline

    2015-03-01

    Parents are important role models for their children's eating behaviours. This study aimed to further validate the recently developed Parental Modelling of Eating Behaviours Scale (PARM) by examining the relationships between maternal self-reports on the PARM with the modelling practices exhibited by these mothers during three family mealtime observations. Relationships between observed maternal modelling and maternal reports of children's eating behaviours were also explored. Seventeen mothers with children aged between 2 and 6 years were video recorded at home on three separate occasions whilst eating a meal with their child. Mothers also completed the PARM, the Children's Eating Behaviour Questionnaire and provided demographic information about themselves and their child. Findings provided validation for all three PARM subscales, which were positively associated with their observed counterparts on the observational coding scheme (PARM-O). The results also indicate that habituation to observations did not change the feeding behaviours displayed by mothers. In addition, observed maternal modelling was significantly related to children's food responsiveness (i.e., their interest in and desire for foods), enjoyment of food, and food fussiness. This study makes three important contributions to the literature. It provides construct validation for the PARM measure and provides further observational support for maternal modelling being related to lower levels of food fussiness and higher levels of food enjoyment in their children. These findings also suggest that maternal feeding behaviours remain consistent across repeated observations of family mealtimes, providing validation for previous research which has used single observations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    incomplete knowledge of the characteristics inherent to each model. During water immersion, the hydrostatic pressure lowers the peripheral vascular capacity and causes increased thoracic blood volume and high vascular perfusion. In turn, these changes lead to high urinary flow, low vasomotor tone, and a high...... a negative pressure around the body. The differences in renal function between space and experimental models appear to be explained by the physical forces affecting tissues and hemodynamics as well as by the changes secondary to these forces. These differences may help in selecting experimental models...

  11. VERIFICATION AND VALIDATION OF THE SPARC MODEL

    Science.gov (United States)

    Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

  12. Base Flow Model Validation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  13. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    USACE, Pittsburgh District (LRP) requested that the US Army Engineer Research and Development Center, Coastal and Hydraulics...approaching the lock and dam. The second set of experiments considered a design, referred to as the Plan B lock approach, which contained the weir field in...conditions and model parameters. A discharge of 1.35 cfs was set as the inflow boundary condition at the upstream end of the model. The outflow boundary was

  14. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    conservativeness level, the conservative probability of failure obtained from Section 4 must be maintained. The mathematical formulation of conservative model... CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982 "Development of a Conservative Model Validation Approach for Reliable"...PDF and a probability of failure are selected from these predicted output PDFs at a user-specified conservativeness level for validation. For

  15. Experimental validation of models for Plasma Focus devices

    International Nuclear Information System (INIS)

    Rodriguez Palomino, Luis; Gonzalez, Jose; Clausse, Alejandro

    2003-01-01

    Plasma Focus (PF) devices are thermonuclear pulsators that produce short pulses of radiation (X-rays, charged particles, and neutrons). Since the work of Filippov and Mather, these devices have been used to study plasma properties. Nowadays interest in PF devices is focused on technological applications, related to their use as pulsed neutron sources. On the numerical side, the inter-institutional PLADEMA (PLAsmas DEnsos MAgnetizados) network is developing three models, each useful at a different engineering stage of Plasma Focus design. One of the main objectives of this work is a comparative study of the influence of the different parameters involved in each model. To validate these results, several experimental measurements under different geometries and initial conditions were performed. (author)

  16. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  17. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been
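An attenuated Bloom filter, as used above for context discovery, is a stack of Bloom filters in which layer i summarizes the contexts reachable in i hops. A minimal sketch follows; the bit width, hash scheme, and item names are assumptions for illustration, not details from the report.

```python
# Minimal attenuated Bloom filter sketch: layer i holds the contexts
# reachable in i hops. Bit width and hash scheme are assumed for illustration.
import hashlib

M_BITS = 128       # bits per layer (assumed)
K_HASHES = 3       # hash functions per item (assumed)

def _bit_positions(item):
    """Derive K_HASHES deterministic bit positions for an item."""
    for i in range(K_HASHES):
        digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
        yield int.from_bytes(digest[:4], "big") % M_BITS

class AttenuatedBloomFilter:
    def __init__(self, depth):
        self.layers = [0] * depth          # each layer is an M_BITS-wide int

    def add(self, item, hops):
        """Record a context advertised `hops` hops away."""
        for pos in _bit_positions(item):
            self.layers[hops] |= 1 << pos

    def query(self, item):
        """Smallest hop count at which `item` may be reachable, or None."""
        for hops, layer in enumerate(self.layers):
            if all((layer >> pos) & 1 for pos in _bit_positions(item)):
                return hops
        return None

abf = AttenuatedBloomFilter(depth=3)
abf.add("printer", hops=0)
abf.add("projector", hops=2)
```

As with any Bloom filter, queries can return false positives (an item reported at too small a hop count) but never false negatives, which is what makes the structure attractive for bandwidth-limited ad-hoc discovery.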

  18. Testing the validity of the International Atomic Energy Agency (IAEA) safety culture model.

    Science.gov (United States)

    López de Castro, Borja; Gracia, Francisco J; Peiró, José M; Pietrantoni, Luca; Hernández, Ana

    2013-11-01

    This paper takes the first steps to empirically validate the widely used model of safety culture of the International Atomic Energy Agency (IAEA), composed of five dimensions, further specified by 37 attributes. To do so, three independent and complementary studies are presented. First, 290 students serve to collect evidence about the face validity of the model. Second, 48 experts in organizational behavior judge its content validity. And third, 468 workers in a Spanish nuclear power plant help to reveal how closely the theoretical five-dimensional model can be replicated. Our findings suggest that several attributes of the model may not be related to their corresponding dimensions. According to our results, a one-dimensional structure fits the data better than the five dimensions proposed by the IAEA. Moreover, the IAEA model, as it stands, seems to have rather moderate content validity and low face validity. Practical implications for researchers and practitioners are included. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Validation of kinetic modeling of progesterone release from polymeric membranes

    Directory of Open Access Journals (Sweden)

    Analia Irma Romero

    2018-01-01

    Full Text Available Mathematical modeling in drug release systems is fundamental to the development and optimization of these systems, since it allows drug release rates to be predicted and the physical transport mechanisms involved to be elucidated. In this paper we validate a novel mathematical model that describes progesterone (Prg) controlled release from poly-3-hydroxybutyric acid (PHB) membranes. A statistical analysis was conducted to compare the fit of our model with six different models, and the Akaike information criterion (AIC) was used to find the equation with the best fit. A simple relation between mass and drug release rate was found, which allows the effect of Prg loads on the release behavior to be predicted. Our proposed model was the one with the minimum AIC value, and therefore the one that statistically best fitted the experimental data obtained for all the Prg loads tested. Furthermore, the initial release rate was calculated; from it, the interface mass transfer coefficient was estimated and the equilibrium distribution constant of Prg between the PHB and the release medium was determined. The results lead us to conclude that our proposed model best fits the experimental data and can be successfully used to describe Prg drug release from PHB membranes.
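The AIC-based model selection used above can be sketched for least-squares fits, where AIC = n·ln(SSR/n) + 2k with n data points, residual sum of squares SSR, and k parameters. The release data and the two candidate models below (classic Higuchi and first-order kinetics, not the paper's novel model) are invented for illustration.

```python
# Sketch of Akaike-information-criterion model selection for drug release
# kinetics; the release data and candidate models below are invented.
import math

def aic_least_squares(n, ssr_value, n_params):
    """AIC for a least-squares fit: n * ln(SSR / n) + 2k."""
    return n * math.log(ssr_value / n) + 2 * n_params

def ssr(model, params, t, y):
    """Residual sum of squares of a model over the data."""
    return sum((model(ti, *params) - yi) ** 2 for ti, yi in zip(t, y))

# Toy cumulative-release data (%) over time (h).
t = [1.0, 2.0, 4.0, 8.0, 16.0]
y = [9.8, 14.3, 20.1, 28.5, 40.2]

def higuchi(ti, k):                 # Q = k * sqrt(t)
    return k * math.sqrt(ti)

def first_order(ti, q0, k):         # Q = Q0 * (1 - exp(-k t))
    return q0 * (1.0 - math.exp(-k * ti))

candidates = {
    "Higuchi": (higuchi, (10.0,)),
    "first-order": (first_order, (60.0, 0.06)),
}
aics = {name: aic_least_squares(len(t), ssr(m, p, t, y), len(p))
        for name, (m, p) in candidates.items()}
best = min(aics, key=aics.get)
```

AIC penalizes extra parameters (the +2k term), so a model only wins by adding parameters if they buy a proportionate reduction in residual error.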

  20. Validation of a Simplified Model to Generate Multispectral Synthetic Images

    Directory of Open Access Journals (Sweden)

    Ion Sola

    2015-03-01

    Full Text Available A new procedure to assess the quality of topographic correction (TOC algorithms applied to remote sensing imagery was previously proposed by the authors. This procedure was based on a model that simulated synthetic scenes, representing the radiance an optical sensor would receive from an area under some specific conditions. TOC algorithms were then applied to synthetic scenes and the resulting corrected scenes were compared with a horizontal synthetic scene free of topographic effect. This comparison enabled an objective and quantitative evaluation of TOC algorithms. This approach showed promising results but had some shortcomings that are addressed herein. First, the model, originally built to simulate only broadband panchromatic scenes, is extended to multispectral scenes in the visible, near infrared (NIR, and short wave infrared (SWIR bands. Next, the model is validated by comparing synthetic scenes with four Satellite pour l'Observation de la Terre 5 (SPOT5 real scenes acquired on different dates and different test areas along the Pyrenees mountain range (Spain. The results obtained show a successful simulation of all the spectral bands. Therefore, the model is deemed accurate enough for its purpose of evaluating TOC algorithms.
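TOC algorithms of the kind evaluated with these synthetic scenes typically rescale radiance by the local solar incidence angle. A sketch of the classic cosine correction follows; this is one standard TOC method, not necessarily one of those assessed in the paper, and the angles and radiance value are invented.

```python
# Sketch of the classic cosine topographic correction that TOC-evaluation
# frameworks assess; the angles and radiance below are invented.
import math

def illumination(slope_deg, aspect_deg, sun_zenith_deg, sun_azimuth_deg):
    """cos(i): cosine of the local solar incidence angle on a tilted pixel."""
    s = math.radians(slope_deg)
    a = math.radians(aspect_deg)
    z = math.radians(sun_zenith_deg)
    phi = math.radians(sun_azimuth_deg)
    return (math.cos(z) * math.cos(s)
            + math.sin(z) * math.sin(s) * math.cos(phi - a))

def cosine_correction(radiance, slope_deg, aspect_deg,
                      sun_zenith_deg, sun_azimuth_deg):
    """Rescale radiance as if the pixel were horizontal: L * cos(z) / cos(i)."""
    cos_i = illumination(slope_deg, aspect_deg, sun_zenith_deg, sun_azimuth_deg)
    return radiance * math.cos(math.radians(sun_zenith_deg)) / cos_i

# Sun-facing 20-degree slope, sun at 40-degree zenith in the south (azimuth 180).
corrected = cosine_correction(120.0, 20.0, 180.0, 40.0, 180.0)
```

A synthetic-scene framework like the one above can compare such corrected output directly against the horizontal reference scene, which is exactly what makes the evaluation objective.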

  1. Experimental validation of solid rocket motor damping models

    Science.gov (United States)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio

    2017-12-01

    In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second aim of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient (complex Young's modulus) to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired by the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe
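The structural-to-viscous conversion discussed above rests on matching the energy dissipated per cycle: for a single mode excited at frequency ω, hysteretic damping η is equivalent to a viscous coefficient c_eq = η·k/ω, which at resonance gives ζ = η/2. A sketch with assumed modal values (the paper compares several such conversion methods for full condensed models):

```python
# Sketch of the single-mode structural-to-equivalent-viscous damping
# conversion: matching energy per cycle gives c_eq = eta * k / omega.
# Modal stiffness, mass, and damping coefficient below are assumed.
import math

def equivalent_viscous_damping(stiffness, eta, omega):
    """Viscous coefficient dissipating the same energy per cycle as
    structural (hysteretic) damping eta at excitation frequency omega."""
    return eta * stiffness / omega

def equivalent_damping_ratio(eta):
    """At resonance the conversion reduces to zeta = eta / 2."""
    return eta / 2.0

k = 2.0e6            # N/m, modal stiffness (assumed)
m = 50.0             # kg, modal mass (assumed)
eta = 0.04           # structural damping coefficient (assumed)
omega_n = math.sqrt(k / m)                       # natural frequency, rad/s
c_eq = equivalent_viscous_damping(k, eta, omega_n)
```

Because c_eq depends on ω, the conversion is exact only at one frequency per mode; the different equivalent viscous damping methods validated in the paper differ essentially in how they handle this frequency dependence across modes.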

  2. PIV validation of blood-heart valve leaflet interaction modelling.

    Science.gov (United States)

    Kaminsky, R; Dumont, K; Weber, H; Schroll, M; Verdonck, P

    2007-07-01

    The aim of this study was to validate the 2D computational fluid dynamics (CFD) results of a moving heart valve based on a fluid-structure interaction (FSI) algorithm with experimental measurements. Firstly, a pulsatile laminar flow through a monoleaflet valve model with a stiff leaflet was visualized by means of Particle Image Velocimetry (PIV). The inflow data sets were applied to a CFD simulation including blood-leaflet interaction. The measurement section with a fixed leaflet was enclosed in a standard mock loop in series with a Harvard Apparatus Pulsatile Blood Pump, a compliance chamber and a reservoir. Standard 2D PIV measurements were made at a frequency of 60 bpm. Average velocity magnitude results of 36 phase-locked measurements were evaluated at every 10 degrees of the pump cycle. For the CFD flow simulation, a commercially available package from Fluent Inc. was used in combination with in-house developed FSI code based on the Arbitrary Lagrangian-Eulerian (ALE) method. The CFD code was then applied to the leaflet to quantify the shear stress on it. Generally, the CFD results are in agreement with the PIV evaluated data in major flow regions, thereby validating the FSI simulation of a monoleaflet valve with a flexible leaflet. The applicability of the new CFD code for quantifying the shear stress on a flexible leaflet is thus demonstrated.
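
    The phase-locked averaging step (36 measurements evaluated at every 10 degrees of the pump cycle) can be sketched as below; the array shapes, the uniform 1 m/s flow, and the noise level are invented stand-ins for the actual PIV vector fields:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for 36 phase-locked PIV vector fields at one
# pump-cycle phase: shape (n_runs, ny, nx, 2) for the (u, v) components.
n_runs, ny, nx = 36, 32, 48
true_field = np.zeros((ny, nx, 2))
true_field[..., 0] = 1.0              # uniform 1 m/s axial flow
fields = true_field + 0.1 * rng.standard_normal((n_runs, ny, nx, 2))

# Phase-locked ensemble average and the resulting velocity magnitude map
mean_field = fields.mean(axis=0)
magnitude = np.hypot(mean_field[..., 0], mean_field[..., 1])
print(magnitude.mean())
```

    Averaging over the 36 repeats suppresses cycle-to-cycle measurement noise by roughly a factor of six, which is what makes a point-by-point comparison with the CFD velocity field meaningful.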

  3. Development and Validation of Methodology to Model Flow in Ventilation Systems Commonly Found in Nuclear Facilities. Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Strons, Philip [Argonne National Lab. (ANL), Argonne, IL (United States); Bailey, James L. [Argonne National Lab. (ANL), Argonne, IL (United States); Davis, John [Argonne National Lab. (ANL), Argonne, IL (United States); Grudzinski, James [Argonne National Lab. (ANL), Argonne, IL (United States); Hlotke, John [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-03-01

    In this work, we apply CFD to model airflow and particulate transport. The modeling results are then compared to field validation studies to both inform and validate the modeling assumptions. Based on the results of the field tests, the modeling assumptions and boundary conditions are refined and the process is repeated until the results are found to be reliable with a high level of confidence.

  4. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made...

  5. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code
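
    The header-driven script generation mentioned above might look like the following sketch; the header syntax, the keys (explosive, density, impact_velocity), and the input-deck keywords are illustrative assumptions, not the actual HED or hydro-code formats:

```python
import io

# Hypothetical HED-style data file: key/value meta-data in a header,
# followed by columns of data (format assumed for illustration).
raw = io.StringIO("""# explosive = PBX-9502
# density   = 1.890
# impact_velocity = 1.2
0.0  0.00
1.0  0.35
2.0  0.71
""")

meta, rows = {}, []
for line in raw:
    line = line.strip()
    if line.startswith("#"):
        # strip the leading '#' and spaces, then split on '='
        key, _, value = line.lstrip("# ").partition("=")
        meta[key.strip()] = value.strip()
    elif line:
        rows.append([float(x) for x in line.split()])

# Generate a (hypothetical) hydro-code input deck from the meta-data
deck = "\n".join([
    f"material {meta['explosive']}",
    f"rho0 {meta['density']}",
    f"piston_velocity {meta['impact_velocity']}",
])
print(deck)
```

    With the experimental parameters machine-readable in the header, the same script can set up simulations for every experiment of a given type, which is the automation the authors argue for.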

  6. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

    Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, a thermal hydraulic code for the safety analysis of the advanced integral reactor. The applicability of the thermal hydraulic models within the code needs to be validated. Among these models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, the results calculated to validate the gap conductance model are demonstrated by comparison with the results of the MARS code for the test case
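
    As a minimal sketch of what a gap conductance model computes (a generic textbook form, not the actual TASS/SMR-S or MARS correlations): the conductive part of the fuel-cladding gap conductance is roughly h_gap = k_gas/(t_gap + g_jump), so a narrower gap, e.g. from thermal expansion at higher power, gives a larger conductance and a smaller fuel-to-cladding temperature drop.

```python
def gap_conductance(k_gas, gap_width, jump_distance=1.0e-6):
    """Conductive gap conductance [W/m^2-K] (illustrative form only)."""
    return k_gas / (gap_width + jump_distance)

k_helium = 0.30          # W/m-K, rough helium conductivity at operating T
q_flux = 5.0e5           # W/m^2, assumed heat flux across the gap

for gap in (80e-6, 40e-6, 20e-6):   # gap narrows as power increases
    h = gap_conductance(k_helium, gap)
    dT = q_flux / h                  # fuel-to-cladding temperature drop
    print(f"gap {gap*1e6:5.1f} um  h = {h:9.1f} W/m2K  dT = {dT:6.1f} K")
```

    This gap-width dependence is why the validation above tracks gap temperature and gap width as functions of the power fraction.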

  7. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation

    Science.gov (United States)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana

    2017-06-01

    A simulation model of a parabolic-trough solar collector developed in the Modelica® language is calibrated and validated. The calibration is performed to bring the behavior of the solar collector model closer to that of a real collector, given the uncertainty in some of the system parameters; measured data are used during the calibration process. Afterwards, the calibrated model is validated: the results obtained from the model are compared to those obtained during real operation of a collector at the Plataforma Solar de Almería (PSA).

  8. Checklist for the qualitative evaluation of clinical studies with particular focus on external validity and model validity

    Directory of Open Access Journals (Sweden)

    Vollmar Horst C

    2006-12-01

    Background: It is often stated that external validity is not sufficiently considered in the assessment of clinical studies. Although tools for its evaluation have been established, there is a lack of awareness of their significance and application. In this article, a comprehensive checklist addressing these relevant criteria is presented. Methods: The checklist was developed by listing the most commonly used assessment criteria for clinical studies. Additionally, specific lists for individual applications were included. The categories of biases of internal validity (selection, performance, attrition and detection bias) correspond to structural, treatment-related and observational differences between the test and control groups. Analogously, we have extended these categories to address external validity and model validity, regarding similarity between the study population/conditions and the general population/conditions related to structure, treatment and observation. Results: A checklist is presented in which the evaluation criteria concerning external validity and model validity are systemised and transformed into a questionnaire format. Conclusion: The checklist presented in this article can be applied to both the planning and the evaluation of clinical studies. We encourage the prospective user to modify the checklists according to the respective application and research question. The higher expenditure needed for the evaluation of clinical studies in systematic reviews is justified, particularly in light of the influential nature of their conclusions on therapeutic decisions and the creation of clinical guidelines.

  9. Modelling and validation of Proton exchange membrane fuel cell (PEMFC)

    Science.gov (United States)

    Mohiuddin, A. K. M.; Basran, N.; Khan, A. A.

    2018-01-01

    This paper is the outcome of a small scale fuel cell project. A fuel cell is an electrochemical device that converts energy from a chemical reaction into electrical work. The Proton Exchange Membrane Fuel Cell (PEMFC) is one type of fuel cell; it is comparatively efficient, operates at low temperature, and its fast start-up capability results in high energy density. In this study, a mathematical model of a 1.2 W PEMFC is developed and simulated using MATLAB software. The model describes PEMFC behaviour under steady-state conditions and determines the polarization curve, the power generated, and the efficiency of the fuel cell. Simulation results were validated by comparison with experimental results obtained from the test of a single PEMFC with a 3 V motor. The performance of the experimental PEMFC is slightly lower than that of the simulated PEMFC, but the two sets of results were found to be in good agreement. Experiments on hydrogen flow rate were also conducted to obtain the amount of hydrogen consumed to produce electrical work in the PEMFC.
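
    The polarization curve such a steady-state model produces can be sketched with generic textbook loss terms; the coefficients below are assumed round numbers for illustration, not the parameters of the cited MATLAB model:

```python
import numpy as np

# Steady-state cell voltage: open-circuit voltage minus activation,
# ohmic, and concentration losses (generic loss-term forms).
def cell_voltage(i, E_oc=1.0, A=0.06, i0=1e-4, R=0.2, m=3e-5, n=8.0):
    """i: current density [A/cm^2]; returns cell voltage [V]."""
    v_act = A * np.log(i / i0)        # Tafel activation loss
    v_ohm = R * i                     # ohmic loss (area-specific resistance)
    v_conc = m * np.exp(n * i)        # empirical concentration loss
    return E_oc - v_act - v_ohm - v_conc

i = np.linspace(0.01, 1.0, 100)       # sweep of current densities
v = cell_voltage(i)
p = v * i                             # power density [W/cm^2]
print(v[0], v[-1], p.max())
```

    Plotting v against i gives the polarization curve, and p = v * i gives the power-density curve that would be compared against the experimental measurements.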

  10. Dental models made with an intraoral scanner: a validation study.

    Science.gov (United States)

    Cuperus, Anne Margreet R; Harms, Marit C; Rangel, Frits A; Bronkhorst, Ewald M; Schols, Jan G J H; Breuning, K Hero

    2012-09-01

    Our objectives were to determine the validity and reproducibility of measurements on stereolithographic models and 3-dimensional digital dental models made with an intraoral scanner. Ten dry human skulls were scanned; from the scans, stereolithographic models and digital models were made. Two observers measured transversal distances, mesiodistal tooth widths, and arch segments on the skulls and the stereolithographic and digital models. All measurements were repeated 4 times. Arch length discrepancy and tooth size discrepancy were calculated. Statistical analysis was performed by using paired t tests. For the measurements on the stereolithographic and digital models, statistically significant differences were found. However, these differences were considered to be clinically insignificant. Digital models had fewer statistically significant differences and generally the smallest duplicate measurement errors compared with the stereolithographic models. Stereolithographic and digital models made with an intraoral scanner are a valid and reproducible method for measuring distances in a dentition. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
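
    The statistical comparison described above (paired t tests plus a separate judgment of clinical significance) can be sketched as follows, with invented measurements standing in for the study data and 0.2 mm as an assumed clinical-relevance threshold:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in data: the same 10 mesiodistal widths [mm] measured
# on skulls (reference) and on digital models, with a small random error.
skull = np.array([8.1, 7.9, 10.2, 6.5, 7.0, 8.4, 9.1, 6.8, 7.7, 10.0])
digital = skull + rng.normal(0.02, 0.05, skull.size)  # tiny systematic offset

# Paired t statistic on the differences (the comparison used in the study)
d = digital - skull
t = d.mean() / (d.std(ddof=1) / np.sqrt(d.size))

# A statistically significant difference can still be clinically
# insignificant if its magnitude stays below the assumed 0.2 mm threshold.
clinically_relevant = abs(d.mean()) > 0.2
print(round(t, 2), clinically_relevant)
```

    This separation between statistical and clinical significance is exactly the distinction the abstract draws: the model differences were detectable but too small to matter for treatment planning.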

  11. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation…

  12. Validación de un modelo de medida de los resultados percibidos por los mandos relacionados con el uso de las prácticas de alta implicación – HIWP (Validation of a measurement model for the results perceived by managers related to the use of high involvement work practices – HIWP)

    Directory of Open Access Journals (Sweden)

    Graziela Conci

    2011-12-01

    High involvement work practices (HIWP) stand out among human resources practices. Lawler's model (1991; 1998) groups these practices into four constructs: communication, training, empowerment and rewards. Our work focuses on validating the measurement model of the results scale proposed by Lawler and colleagues (2001), together with alternative measurement models, in order to measure the results as subjectively perceived by managers. The measurement model is composed of performance and service scales, human resources, and organizational results. After carrying out confirmatory factor analyses on data from 98 Spanish companies, we validate a model that presents very good convergent validity statistics.

  13. Hydraulic Hybrid Excavator—Mathematical Model Validation and Energy Analysis

    Directory of Open Access Journals (Sweden)

    Paolo Casoli

    2016-11-01

    Recent demands to reduce pollutant emissions and improve energy efficiency have driven the implementation of hybrid solutions in mobile machinery. This paper presents the results of a numerical and experimental analysis conducted on a hydraulic hybrid excavator (HHE). The machinery under study is a mid-size excavator, whose standard version was modified with the introduction of an energy recovery system (ERS). The proposed ERS layout was designed to recover the potential energy of the boom, using a hydraulic accumulator as a storage device. The recovered energy is utilized through the pilot pump of the machinery, which operates as a motor, thus reducing the torque required from the internal combustion engine (ICE). The analysis reported in this paper validates the HHE model by comparing numerical and experimental data in terms of hydraulic and mechanical variables and fuel consumption. The mathematical model shows its capability to reproduce the realistic operating conditions of the realized prototype, tested in the field. A detailed energy analysis comparing the standard and the hybrid excavator models was carried out to evaluate the energy flows along the system, showing advantages, weaknesses and possibilities to further improve the machinery efficiency. Finally, the fuel consumption estimated by the model and that measured during the experiments are presented to highlight the fuel saving percentages. The HHE model is an important starting point for the development of other energy saving solutions.

  14. Validation of Storm Water Management Model Storm Control Measures Modules

    Science.gov (United States)

    Simon, M. A.; Platz, M. C.

    2017-12-01

    EPA's Storm Water Management Model (SWMM) is a computational code heavily relied upon by industry for the simulation of wastewater and stormwater infrastructure performance. Many municipalities are relying on SWMM results to design multi-billion-dollar, multi-decade infrastructure upgrades. Since the 1970s, EPA and others have developed five major releases, the most recent ones containing storm control measure modules for green infrastructure. The main objective of this study was to quantify the accuracy with which SWMM v5.1.10 simulates the hydrologic activity of previously monitored low impact developments. Model performance was evaluated with a mathematical comparison of outflow hydrographs and total outflow volumes, using empirical data and a multi-event, multi-objective calibration method. The calibration methodology utilized PEST++ Version 3, a parameter estimation tool, which aided in the selection of unmeasured hydrologic parameters. From the validation study and sensitivity analysis, several model improvements were identified to advance SWMM LID Module performance for permeable pavements, infiltration units and green roofs, and these were performed and reported herein. Overall, it was determined that SWMM can successfully simulate low impact development controls given accurate model confirmation, parameter measurement, and model calibration.
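
    A common way to make the "mathematical comparison of outflow hydrographs and total outflow volumes" concrete is the Nash-Sutcliffe efficiency plus a volume-error check, sketched here on a synthetic hydrograph rather than SWMM output:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <= 0 is no better
    than predicting the observed mean."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

t = np.linspace(0, 6, 61)                # hours, uniform time step
obs = 5.0 * t * np.exp(1 - t)            # synthetic observed storm hydrograph
sim = 0.95 * obs                         # model that under-predicts by 5%

efficiency = nse(obs, sim)
# With a uniform time step, the total-volume error reduces to a ratio of sums
volume_error = (sim.sum() - obs.sum()) / obs.sum()
print(round(efficiency, 3), round(volume_error, 3))
```

    A high NSE with a small volume error indicates that both the hydrograph shape and the total outflow are reproduced, which is the dual criterion described in the abstract.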

  15. Experiments for foam model development and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

    2008-09-01

    A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

  16. Finite element model validation of bridge based on structural health monitoring—Part II: Uncertainty propagation and model validation

    Directory of Open Access Journals (Sweden)

    Xiaosong Lin

    2015-08-01

    Because of the uncertainties involved in modeling, construction, and measurement systems, the assessment of FE model validity must be conducted based on stochastic measurements to provide designers with confidence for further applications. In this study, based on a model updated using response surface methodology, a practical model validation methodology via uncertainty propagation is presented. Several criteria of testing/analysis correlation are introduced, and the sources of model and testing uncertainties are also discussed. After that, the Monte Carlo stochastic finite element (FE) method is employed to perform the uncertainty quantification and propagation. The proposed methodology is illustrated by examining the validity of a large-span prestressed concrete continuous rigid frame bridge monitored under operational conditions. It can be concluded that the calculated frequencies and vibration modes of the updated FE model of Xiabaishi Bridge are consistent with the measured ones. The relative errors of each frequency are all less than 3.7%, the overlap ratio indexes of each frequency are all more than 75%, and the MAC values of each calculated vibration mode are all more than 90%. The model of Xiabaishi Bridge is valid in the whole operation space, including the experimental design space, and its confidence level is higher than 95%. The validated FE model can reflect the current condition of Xiabaishi Bridge, and can also be used as a basis for bridge health monitoring, damage identification and safety assessment.
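
    The MAC values quoted above come from the Modal Assurance Criterion, which is easy to sketch; the mode shapes below are invented for illustration, not the Xiabaishi Bridge data:

```python
import numpy as np

def mac(phi_a, phi_x):
    """Modal Assurance Criterion between two mode shape vectors:
    1 means perfectly correlated shapes, 0 means orthogonal shapes."""
    num = np.abs(phi_a.conj() @ phi_x) ** 2
    return num / ((phi_a.conj() @ phi_a).real * (phi_x.conj() @ phi_x).real)

measured = np.array([0.10, 0.45, 0.80, 1.00, 0.85])   # illustrative shape
fe_model = 1.02 * measured + 0.01                     # near-identical FE shape
other = np.array([0.10, 0.80, 0.20, -0.90, 0.40])     # unrelated shape

print(round(mac(measured, fe_model), 4), round(mac(measured, other), 4))
```

    Because MAC is insensitive to shape scaling, it isolates how well the FE and measured mode shapes agree in form, which is why it is paired with frequency errors in the correlation criteria above.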

  17. [Validation of abdominal wound dehiscence's risk model].

    Science.gov (United States)

    Gómez Díaz, Carlos Javier; Rebasa Cladera, Pere; Navarro Soto, Salvador; Hidalgo Rosas, José Manuel; Luna Aufroy, Alexis; Montmany Vioque, Sandra; Corredera Cantarín, Constanza

    2014-02-01

    The aim of this study is to determine the usefulness of the risk model developed by van Ramshorst et al., and a modification of the same, to predict the risk of abdominal wound dehiscence in patients who underwent midline laparotomy incisions. Observational longitudinal retrospective study. Patients who underwent midline laparotomy incisions in the General and Digestive Surgery Department of the Sabadell's Hospital-Parc Taulí's Health and University Corporation-Barcelona, between January 1, 2010 and June 30, 2010. Dependent variable: abdominal wound dehiscence. Global risk score, preoperative risk score (postoperative variables were excluded), and global and preoperative probabilities of developing abdominal wound dehiscence. 176 patients. Patients with abdominal wound dehiscence: 15 (8.5%). The global risk score of the abdominal wound dehiscence group (mean: 4.97; 95% CI: 4.15-5.79) was higher than that of the no abdominal wound dehiscence group (mean: 3.41; 95% CI: 3.20-3.62), a statistically significant difference (P<.001). The preoperative risk score of the abdominal wound dehiscence group (mean: 3.27; 95% CI: 2.69-3.84) was higher than that of the no abdominal wound dehiscence group (mean: 2.77; 95% CI: 2.64-2.89), also a statistically significant difference (P<.05). The global risk score (area under the ROC curve: 0.79) has better accuracy than the preoperative risk score (area under the ROC curve: 0.64). The risk model developed by van Ramshorst et al. to predict the risk of abdominal wound dehiscence in the preoperative phase has limited usefulness. Additional refinements to the preoperative risk score are needed to improve its accuracy. Copyright © 2012 AEC. Published by Elsevier España. All rights reserved.
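
    The area under the ROC curve reported above can be computed with the rank (Mann-Whitney) identity; the risk scores below are made up for illustration, not the patient data:

```python
def auc(scores_pos, scores_neg):
    """AUC = probability that a randomly chosen event case scores higher
    than a randomly chosen non-event case (ties count one half)."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

dehiscence = [5.1, 4.6, 4.0, 5.8]          # risk scores, event group
no_dehiscence = [3.2, 3.6, 4.2, 2.9, 3.8]  # risk scores, no-event group
print(auc(dehiscence, no_dehiscence))
```

    On this scale, 0.5 is no discrimination and 1.0 is perfect separation, which puts the study's global (0.79) and preoperative (0.64) scores in context.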

  18. NRPB models for calculating the transfer of radionuclides through the environment. Verification and validation

    International Nuclear Information System (INIS)

    Attwood, C.; Barraclough, I.; Brown, J.

    1998-06-01

    There is a wide range of models available at NRPB to predict the transfer of radionuclides through the environment. Such models form an essential part of assessments of the radiological impact of releases of radionuclides into the environment. These models cover: the atmosphere; the aquatic environment; the geosphere; the terrestrial environment including foodchains. It is important that the models used for radiological impact assessments are robust, reliable and suitable for the assessment being undertaken. During model development it is, therefore, important that the model is both verified and validated. Verification of a model involves ensuring that it has been implemented correctly, while validation consists of demonstrating that the model is an adequate representation of the real environment. The extent to which a model can be verified depends on its complexity and whether similar models exist. For relatively simple models verification is straightforward, but for more complex models verification has to form part of the development, coding and testing of the model within quality assurance procedures. Validation of models should ideally consist of comparisons between the results of the models and experimental or environmental measurement data that were not used to develop the model. This is more straightforward for some models than for others depending on the quantity and type of data available. Validation becomes increasingly difficult for models which are intended to predict environmental transfer at long times or at great distances. It is, therefore, necessary to adopt qualitative validation techniques to ensure that the model is an adequate representation of the real environment. This report summarises the models used at NRPB to predict the transfer of radionuclides through the environment as part of a radiological impact assessment. It outlines the work carried out to verify and validate the models. 
The majority of these models are not currently available

  19. Validating neural-network refinements of nuclear mass models

    Science.gov (United States)

    Utama, R.; Piekarewicz, J.

    2018-01-01

    Background: Nuclear astrophysics centers on the role of nuclear physics in the cosmos. In particular, nuclear masses at the limits of stability are critical in the development of stellar structure and the origin of the elements. Purpose: We aim to test and validate the predictions of recently refined nuclear mass models against the newly published AME2016 compilation. Methods: The basic paradigm underlining the recently refined nuclear mass models is based on existing state-of-the-art models that are subsequently refined through the training of an artificial neural network. Bayesian inference is used to determine the parameters of the neural network so that statistical uncertainties are provided for all model predictions. Results: We observe a significant improvement in the Bayesian neural network (BNN) predictions relative to the corresponding "bare" models when compared to the nearly 50 new masses reported in the AME2016 compilation. Further, AME2016 estimates for the handful of impactful isotopes in the determination of r -process abundances are found to be in fairly good agreement with our theoretical predictions. Indeed, the BNN-improved Duflo-Zuker model predicts a root-mean-square deviation relative to experiment of σrms≃400 keV. Conclusions: Given the excellent performance of the BNN refinement in confronting the recently published AME2016 compilation, we are confident of its critical role in our quest for mass models of the highest quality. Moreover, as uncertainty quantification is at the core of the BNN approach, the improved mass models are in a unique position to identify those nuclei that will have the strongest impact in resolving some of the outstanding questions in nuclear astrophysics.
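
    The rms deviation quoted above is straightforward to sketch; the mass excesses and model errors below are invented stand-ins, not AME2016 or Duflo-Zuker values:

```python
import numpy as np

def rms_deviation(predicted, experimental):
    """Root-mean-square deviation between model and experiment."""
    predicted = np.asarray(predicted)
    experimental = np.asarray(experimental)
    return np.sqrt(np.mean((predicted - experimental) ** 2))

# Invented mass excesses [MeV] for four nuclei, plus invented residuals
exp_mev = np.array([-60.605, -64.221, -66.743, -70.563])
bare    = exp_mev + np.array([1.10, -0.90, 1.30, -0.80])   # bare model error
refined = exp_mev + np.array([0.35, -0.30, 0.45, -0.25])   # after NN refit

print(rms_deviation(bare, exp_mev), rms_deviation(refined, exp_mev))
```

    The refinement strategy in the paper amounts to training a neural network on the residuals of the bare model, so the quantity that shrinks is exactly this rms deviation.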

  20. Empirical validation of an agent-based model of wood markets in Switzerland

    Science.gov (United States)

    Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver

    2018-01-01

    We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300

  1. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  3. Landslide Tsunami Generation Models: Validation and Case Studies

    Science.gov (United States)

    Watts, P.; Grilli, S. T.; Kirby, J. T.; Fryer, G. J.; Tappin, D. R.

    2002-12-01

    There has been a proliferation of landslide tsunami generation and propagation models in recent years, spurred largely by the 1998 Papua New Guinea event. However, few of these models or techniques have been carefully validated, and few have proven capable of integrating the best available geological data and interpretations into convincing case studies. The Tsunami Open and Progressive Initial Conditions System (TOPICS) rapidly provides approximate landslide tsunami sources for tsunami propagation models. We present 3D laboratory experiments and 3D Boundary Element Method simulations that validate the tsunami sources given by TOPICS. Geowave is a combination of TOPICS with the fully nonlinear and dispersive Boussinesq model FUNWAVE, which has been the subject of extensive testing and validation over the course of the last decade. Geowave is currently a tsunami community model made available to all tsunami researchers on the web site www.tsunamicommunity.org. We validate Geowave with case studies of the 1946 Unimak, Alaska, the 1994 Skagway, Alaska, and the 1998 Papua New Guinea events. The benefits of Boussinesq wave propagation over traditional shallow water wave models are very apparent for these relatively steep and nonlinear waves. For the first time, a tsunami community model appears sufficiently powerful to reproduce all observations and records with the first numerical simulation. This can only be accomplished by first assembling geological data and interpretations into a reasonable tsunami source.

  4. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  5. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

    This paper provides a methodology for Validation and Verification (V&V) of a Bayesian Network (BN) model for aircraft vulnerability against Infrared (IR) missile threats. The model considers that the aircraft vulnerability depends both on a missile...

  6. On the development and validation of QSAR models.

    Science.gov (United States)

    Gramatica, Paola

    2013-01-01

    The fundamental and more critical steps that are necessary for the development and validation of QSAR models are presented in this chapter as best practices in the field. These procedures are discussed in the context of predictive QSAR modelling that is focused on achieving models of the highest statistical quality and with external predictive power. The most important and most used statistical parameters needed to verify the real performances of QSAR models (of both linear regression and classification) are presented. Special emphasis is placed on the validation of models, both internally and externally, as well as on the need to define model applicability domains, which should be done when models are employed for the prediction of new external compounds.
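The external-validation statistics such chapters refer to can be illustrated with a small sketch. The metric shown is one common formulation of the external predictive squared correlation (often written Q²ext); the activity data are invented for illustration:

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error of external predictions."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def q2_ext(y_ext_true, y_ext_pred, y_train_mean):
    """External predictive squared correlation:
    1 - PRESS / (sum of squares of the external data about the
    *training-set* mean)."""
    press = sum((t - p) ** 2 for t, p in zip(y_ext_true, y_ext_pred))
    ss = sum((t - y_train_mean) ** 2 for t in y_ext_true)
    return 1.0 - press / ss

# Invented activity values: training-set mean plus an external test set
y_train_mean = 5.0
y_ext = [4.2, 5.8, 6.1, 3.9]
y_pred = [4.0, 5.5, 6.4, 4.1]
print(round(q2_ext(y_ext, y_pred, y_train_mean), 3), round(rmse(y_ext, y_pred), 3))
```

Referencing the prediction errors to the training-set mean is what makes the statistic *external*: a model that merely predicts the training mean for every new compound scores zero.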

  7. Validation of Vehicle Model Response with an Instrumented Experimental Vehicle

    Directory of Open Access Journals (Sweden)

    Harun Mohamad Hafiz

    2017-01-01

    Full Text Available A steering aid system called active steering is evaluated by simulating different kinds of driving events. The main purpose of the steering system is to allow the driver to control the vehicle independently. A full car vehicle model with 14 degrees of freedom, comprising both the ride model and the handling model, is simulated in Matlab/Simulink. The steering angle is the input to the vehicle model, and the angle of the steering system between the tires when turning the vehicle is taken into consideration. Simulations are made for different road conditions and for side wind disturbances, and different parameter values are applied in the simulation to reduce the effect of these driving events. These simulation results therefore provide a basis for improving the steering system. The aim of this work is to validate the vehicle response against an instrumented experimental vehicle. The specific driving events in these simulations are road adhesion changes and lateral side wind disturbances.

  8. ADOPT: A Historically Validated Light Duty Vehicle Consumer Choice Model

    Energy Technology Data Exchange (ETDEWEB)

    Brooker, A.; Gonder, J.; Lopp, S.; Ward, J.

    2015-05-04

    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy’s Vehicle Technologies Office. It estimates technology improvement impacts on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit method and the mixed logit method to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range and usable volume. The average importance of several attributes changes nonlinearly across its range and changes with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market. The Corporate Average Fuel Economy regulations are enforced. The sales feed into the ADOPT stock model, which captures the key aspects for summing petroleum use and greenhouse gas emissions. This includes capturing the change in vehicle miles traveled by vehicle age, the creation of new model options based on the success of existing vehicles, new vehicle option introduction rate limits, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data. It matches historical data in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use. It manages the inputs, simulation, and results.
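The multinomial logit mechanic behind such sales estimates can be sketched generically. The attribute weights and vehicle values below are invented for illustration; they are not ADOPT's calibrated parameters:

```python
import math

def logit_shares(utilities):
    """Multinomial logit: the choice probability of option i is
    exp(U_i) / sum_j exp(U_j), where U_i is a (here linear) utility."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical vehicles scored on price, fuel cost and acceleration
# (weights and attribute values are invented for illustration).
weights = {"price": -0.4, "fuel_cost": -0.8, "accel": 0.3}
vehicles = [
    {"price": 2.5, "fuel_cost": 1.2, "accel": 6.0},
    {"price": 3.0, "fuel_cost": 0.6, "accel": 7.5},
]
utils = [sum(weights[k] * v[k] for k in weights) for v in vehicles]
shares = logit_shares(utils)
print([round(s, 3) for s in shares])
```

In the full model the weights themselves would vary with income and, for mixed logit, be drawn from a distribution to represent consumer heterogeneity.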

  9. Site characterization and validation - Tracer migration experiment in the validation drift, report 2, part 1: performed experiments, results and evaluation

    International Nuclear Information System (INIS)

    Birgersson, L.; Widen, H.; Aagren, T.; Neretnieks, I.; Moreno, L.

    1992-01-01

    This report is the second of the two reports describing the tracer migration experiment where water and tracer flow have been monitored in a drift at the 385 m level in the Stripa experimental mine. The tracer migration experiment is one of a large number of experiments performed within the Site Characterization and Validation (SCV) project. The upper part of the 50 m long validation drift was covered with approximately 150 plastic sheets, in which the emerging water was collected. The water emerging into the lower part of the drift was collected in short boreholes, sumpholes. Six different tracer mixtures were injected at distances between 10 and 25 m from the drift. The flowrate and tracer monitoring continued for ten months. Tracer breakthrough curves and flowrate distributions were used to study flow paths, velocities, hydraulic conductivities, dispersivities, interaction with the rock matrix and channelling effects within the rock. The present report describes the structure of the observations, the flowrate measurements and estimated hydraulic conductivities. The main part of this report addresses the interpretation of the tracer movement in fractured rock. The tracer movement as measured by the more than 150 individual tracer curves has been analysed with the traditional advection-dispersion model and a subset of the curves with the advection-dispersion-diffusion model. The tracer experiments have permitted the flow porosity, dispersion and interaction with the rock matrix to be studied. (57 refs.)
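For the simplest case of a continuous injection in one dimension, the advection-dispersion model used to analyse such breakthrough curves has a well-known closed-form solution; a minimal sketch follows, with illustrative parameter values rather than the Stripa data:

```python
import math

def breakthrough(x, t, v, DL):
    """Normalized concentration C/C0 for a continuous (step) tracer
    injection under 1D advective-dispersive transport:
        C/C0 = 0.5 * erfc((x - v*t) / (2*sqrt(DL*t)))
    x: distance [m], t: time [s], v: advection velocity [m/s],
    DL: longitudinal dispersion coefficient [m^2/s]."""
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(DL * t)))

# Illustrative values (not the Stripa data): after one mean travel
# time t = x/v the front reaches half its inlet concentration.
x, v, DL = 15.0, 1e-5, 1e-6
print(round(breakthrough(x, x / v, v, DL), 3))  # ≈ 0.5
```

Fitting v and DL to a measured curve (e.g. by least squares) yields the velocity and dispersivity estimates discussed in the report; the diffusion variant adds a matrix-interaction term.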

  10. First approximations in avalanche model validations using seismic information

    Science.gov (United States)

    Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty

    2017-04-01

    Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of input (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated against data from real events to improve performance and reliability. The avalanche group at the University of Barcelona (RISKNAT - UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, at additional avalanche tracks, to gain insight and validate numerical avalanche dynamics models in different terrain conditions. In this study, we aim to add the seismic data as an external record of the phenomena, able to validate RAMMS models. Seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data in all atmospheric conditions (e.g. bad weather, low light or freezing temperatures, which render photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position

  11. Empirical Validation of Building Simulation Software : Modeling of Double Facades

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    The work described in this report is the result of a collaborative effort of members of the International Energy Agency (IEA), Task 34/43: Testing and validation of building energy simulation tools experts group.

  12. Validation of heat transfer models for gap cooling

    International Nuclear Information System (INIS)

    Okano, Yukimitsu; Nagae, Takashi; Murase, Michio

    2004-01-01

    For severe accident assessment of a light water reactor, models of heat transfer in a narrow annular gap between overheated core debris and a reactor pressure vessel are important for evaluating vessel integrity and accident management. The authors developed and improved the models of heat transfer. However, validation was not sufficient for applicability of the gap heat flux correlation to the debris cooling in the vessel lower head and applicability of the local boiling heat flux correlations to the high-pressure conditions. Therefore, in this paper, we evaluated the validity of the heat transfer models and correlations by analyses for ALPHA and LAVA experiments where molten aluminum oxide (Al 2 O 3 ) at about 2700 K was poured into the high pressure water pool in a small-scale simulated vessel lower head. In the heating process of the vessel wall, the calculated heating rate and peak temperature agreed well with the measured values, and the validity of the heat transfer models and gap heat flux correlation was confirmed. In the cooling process of the vessel wall, the calculated cooling rate was compared with the measured value, and the validity of the nucleate boiling heat flux correlation was confirmed. The peak temperatures of the vessel wall in ALPHA and LAVA experiments were lower than the temperature at the minimum heat flux point between film boiling and transition boiling, so the minimum heat flux correlation could not be validated. (author)

  13. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and on their experimental validation was organized by the 'convection' section of the French society of thermal engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machineries or in other energy-related applications, and have been selected for ETDE. (J.S.)

  14. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years the application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of DSF performance, and this in turn requires access to high quality experimental data. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within the International Energy Agency (IEA) SHC Task 34 / ECBCS Annex 43. This paper describes the full-scale outdoor experimental test facility, the experimental set-up and the measurement procedure...

  15. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust.

  16. Model Identification and Validation for a Heating System using MATLAB System Identification Toolbox

    International Nuclear Information System (INIS)

    Rabbani, Muhammad Junaid; Hussain, Kashan; Khan, Asim-ur-Rehman; Ali, Abdullah

    2013-01-01

    This paper proposes a systematic approach to selecting a mathematical model for an industrial heating system by adopting system identification techniques, with the aim of fulfilling the design requirements for the controller. The model identification process begins by collecting real measurement data samples with the aid of the MATLAB System Identification Toolbox. The criteria for selecting the model that best validates the model output against actual data are based upon: parametric identification techniques, picking the best low-order model structure among ARX, ARMAX and BJ, and then applying model estimation and validation tests. Simulation results show that the BJ model was best at providing good estimation and validation based upon performance criteria such as final prediction error, loss function, best percentage of model fit, and correlation analysis of the output residuals.
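The ARX structure mentioned above is the simplest of the candidate models; a minimal least-squares fit of a first-order ARX model on synthetic data sketches what the MATLAB toolbox does internally (the data are invented, not the paper's heating-system measurements):

```python
# Least-squares fit of a first-order ARX model
#   y[k] = a*y[k-1] + b*u[k-1] + e[k]
# from input/output data -- the same structure MATLAB's arx() estimates.

def fit_arx11(u, y):
    """Solve the 2x2 normal equations for (a, b) by hand."""
    # Regressors: phi[k] = (y[k-1], u[k-1]); target: y[k]
    s_yy = s_yu = s_uu = s_ty = s_tu = 0.0
    for k in range(1, len(y)):
        py, pu, t = y[k - 1], u[k - 1], y[k]
        s_yy += py * py; s_yu += py * pu; s_uu += pu * pu
        s_ty += t * py;  s_tu += t * pu
    det = s_yy * s_uu - s_yu * s_yu
    a = (s_ty * s_uu - s_tu * s_yu) / det
    b = (s_tu * s_yy - s_ty * s_yu) / det
    return a, b

# Synthetic noise-free data generated by y[k] = 0.7*y[k-1] + 0.2*u[k-1]
u = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0]
y = [0.0]
for k in range(1, len(u)):
    y.append(0.7 * y[k - 1] + 0.2 * u[k - 1])

a, b = fit_arx11(u, y)
print(round(a, 3), round(b, 3))  # recovers 0.7 and 0.2
```

ARMAX and BJ extend this structure with noise models; the fit-percentage and final-prediction-error criteria the paper uses then rank the candidates.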

  17. Validation of an Acoustic Impedance Prediction Model for Skewed Resonators

    Science.gov (United States)

    Howerton, Brian M.; Parrott, Tony L.

    2009-01-01

    An impedance prediction model was validated experimentally to determine the composite impedance of a series of high-aspect ratio slot resonators incorporating channel skew and sharp bends. Such structures are useful for packaging acoustic liners into constrained spaces for turbofan noise control applications. A formulation of the Zwikker-Kosten Transmission Line (ZKTL) model, incorporating the Richards correction for rectangular channels, is used to calculate the composite normalized impedance of a series of six multi-slot resonator arrays with constant channel length. Experimentally, acoustic data were acquired in the NASA Langley Normal Incidence Tube over the frequency range of 500 to 3500 Hz at 120 and 140 dB OASPL. Normalized impedance was reduced using the Two-Microphone Method for the various combinations of channel skew and sharp 90° and 180° bends. Results show that the presence of skew and/or sharp bends does not significantly alter the impedance of a slot resonator as compared to a straight resonator of the same total channel length. ZKTL predicts the impedance of such resonators very well over the frequency range of interest. The model can be used to design arrays of slot resonators that can be packaged into complex geometries heretofore unsuitable for effective acoustic treatment.

  18. Neuroinflammatory targets and treatments for epilepsy validated in experimental models.

    Science.gov (United States)

    Aronica, Eleonora; Bauer, Sebastian; Bozzi, Yuri; Caleo, Matteo; Dingledine, Raymond; Gorter, Jan A; Henshall, David C; Kaufer, Daniela; Koh, Sookyong; Löscher, Wolfgang; Louboutin, Jean-Pierre; Mishto, Michele; Norwood, Braxton A; Palma, Eleonora; Poulter, Michael O; Terrone, Gaetano; Vezzani, Annamaria; Kaminski, Rafal M

    2017-07-01

    A large body of evidence that has accumulated over the past decade strongly supports the role of inflammation in the pathophysiology of human epilepsy. Specific inflammatory molecules and pathways have been identified that influence various pathologic outcomes in different experimental models of epilepsy. Most importantly, the same inflammatory pathways have also been found in surgically resected brain tissue from patients with treatment-resistant epilepsy. New antiseizure therapies may be derived from these novel potential targets. An essential and crucial question is whether targeting these molecules and pathways may result in anti-ictogenesis, antiepileptogenesis, and/or disease-modification effects. Therefore, preclinical testing in models mimicking relevant aspects of epileptogenesis is needed to guide integrated experimental and clinical trial designs. We discuss the most recent preclinical proof-of-concept studies validating a number of therapeutic approaches against inflammatory mechanisms in animal models that could represent novel avenues for drug development in epilepsy. Finally, we suggest future directions to accelerate preclinical to clinical translation of these recent discoveries. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  19. Army Synthetic Validity Project. Report of Phase 3 Results. Volume 1.

    Science.gov (United States)

    1991-02-01

    Joyce Shields, and Robert Guion. EDGAR M. JOHNSON, Technical Director. ARMY SYNTHETIC VALIDITY PROJECT: REPORT OF PHASE III RESULTS, Volume 1, EXECUTIVE... Guion (1976) provides a review of several approaches to conducting synthetic validation. The approach most relevant to the problem at hand involves... included: 12B (Combat Engineer), 13B (Cannon Crewman), 27E (TOW/Dragon Repairer), 29E (Radio Repairer), 31C (Single Channel Radio Operator), 51B (Carpentry

  20. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research, as presented in this paper, is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of a given feature for a specific model.

  1. Validation of Hydrodynamic Numerical Model of a Pitching Wave Energy Converter

    DEFF Research Database (Denmark)

    López, Maria del Pilar Heras; Thomas, Sarah; Kramer, Morten Mejlhede

    2017-01-01

    Validation of numerical models is essential in the development of new technologies. Commercial software and codes available for simulating wave energy converters (WECs) have not yet been proven to work for all available and upcoming technologies. The present paper presents the first stages of the validation process of a hydrodynamic numerical model for a pitching wave energy converter. The development of dry tests, wave flume and wave basin experiments is explained, lessons learned are shared and results are presented.

  2. Determination of clouds in MSG data for the validation of clouds in a regional climate model

    OpenAIRE

    Huckle, Roger

    2009-01-01

    Regional climate models (e.g. CLM) can help to assess the influence of anthropogenic climate change on the different regions of the earth. Validation of these models is very important. Satellite data are of great benefit, as data on a global scale and at high temporal resolution are available. In this thesis a cloud detection and object-based cloud classification for Meteosat Second Generation (MSG) was developed and used to validate CLM clouds. Results sometimes show too many clouds in the CLM.

  3. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge of ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after a review of the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  4. Thermal hydraulic model validation for HOR mixed core fuel management

    International Nuclear Information System (INIS)

    Gibcus, H.P.M.; Vries, J.W. de; Leege, P.F.A. de

    1997-01-01

    A thermal-hydraulic core management model has been developed for the Hoger Onderwijsreactor (HOR), a 2 MW pool-type university research reactor. The model was adopted for safety analysis purposes in the framework of HEU/LEU core conversion studies. It is applied in the thermal-hydraulic computer code SHORT (Steady-state HOR Thermal-hydraulics) which is presently in use in designing core configurations and for in-core fuel management. An elaborate measurement program was performed to establish the core hydraulic characteristics for a variety of conditions. The hydraulic data were obtained with a dummy fuel element with special equipment allowing, among other things, direct measurement of the true core flow rate. Using these data the thermal-hydraulic model was validated experimentally. The model, experimental tests, and model validation are discussed. (author)

  5. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources

  6. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  7. Modeling and validation of microwave ablations with internal vaporization.

    Science.gov (United States)

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L

    2015-02-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this paper, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10, and 20 mm away from the heating zone of the microwave antenna in a homogenized ex vivo bovine liver setup. The cross-sectional area of water vapor transport was validated through intraprocedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard index of 0.27, 0.49, 0.61, 0.67, and 0.69 at 1, 2, 3, 4, and 5 min, respectively. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally.
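The Jaccard index used here to compare CT iso-density contours with simulated vapor contours is simply intersection over union; on binary masks (toy data, not the study's CT images) it reduces to:

```python
def jaccard(mask_a, mask_b):
    """Jaccard index |A ∩ B| / |A ∪ B| of two same-sized binary masks,
    given as flat sequences of 0/1 (e.g. thresholded image pixels)."""
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    union = sum(1 for a, b in zip(mask_a, mask_b) if a or b)
    return inter / union if union else 1.0

# Two toy 4x4 "ablation zone" masks, flattened row by row
a = [0, 1, 1, 0,  1, 1, 1, 1,  0, 1, 1, 0,  0, 0, 0, 0]
b = [0, 0, 1, 1,  0, 1, 1, 1,  0, 1, 1, 0,  0, 0, 0, 0]
print(round(jaccard(a, b), 2))
```

A value of 1 means the measured and simulated zones coincide exactly; the study's rising values (0.27 to 0.69) indicate increasing overlap as the ablation grows.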

  8. Predicting third molar surgery operative time: a validated model.

    Science.gov (United States)

    Susarla, Srinivas M; Dodson, Thomas B

    2013-01-01

    The purpose of the present study was to develop and validate a statistical model to predict third molar (M3) operative time. This was a prospective cohort study consisting of a sample of subjects presenting for M3 removal. The demographic, anatomic, and operative variables were recorded for each subject. Using an index sample of randomly selected subjects, a multiple linear regression model was generated to predict the operating time. A nonoverlapping group of randomly selected subjects (validation sample) was used to assess model accuracy. P≤.05 was considered significant. The sample was composed of 150 subjects (n) who had 450 (k) M3s removed. The index sample (n=100 subjects, k=313 M3s extracted) had a mean age of 25.4±10.0 years. The mean extraction time was 6.4±7.0 minutes. The multiple linear regression model included M3 location, Winter's classification, tooth morphology, number of teeth extracted, procedure type, and surgical experience (R2=0.58). No statistically significant differences were seen between the index sample and the validation sample (n=50, k=137) for any of the study variables. Compared with the index model, the β-coefficients of the validation model were similar in direction and magnitude for most variables. Compared with the observed extraction time for all teeth in the sample, the predicted extraction time was not significantly different (P=.16). Fair agreement was seen between the β-coefficients for our multiple models in the index and validation populations, with no significant difference in the predicted and observed operating times. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
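The index/validation-sample design can be sketched generically: fit a least-squares model on the index sample, then compare predicted and observed times on the held-out validation sample. The sketch below uses a single invented predictor rather than the study's six variables:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = b0 + b1*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
         sum((x - mx) ** 2 for x in xs)
    return my - b1 * mx, b1

# Index sample: operative time (min) vs. number of teeth extracted (toy data)
idx_x = [1, 2, 3, 4, 2, 3]
idx_y = [3.1, 5.9, 9.2, 12.1, 6.1, 8.8]
b0, b1 = fit_line(idx_x, idx_y)

# Validation sample: compare predicted vs. observed times on held-out cases
val_x, val_y = [1, 4], [2.9, 12.3]
pred = [b0 + b1 * x for x in val_x]
print([round(p, 1) for p in pred])
```

Close agreement between the coefficients refit on the validation sample and those of the index model, and between predicted and observed times, is the evidence of validity the study reports.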

  9. Validation of Hydrodynamic Models of Three Topologies of Secondary Facultative Ponds

    Directory of Open Access Journals (Sweden)

    Aponte-Reyes Alxander

    2014-10-01

    Full Text Available A methodology was developed to analyze the boundary conditions, mesh size and turbulence model of a CFD mathematical model able to explain the hydrodynamic behavior of facultative stabilization ponds, FSP, built at pilot scale: a conventional pond, CP, a baffled pond, BP, and a baffled-mesh pond, BMP. Dispersion studies were performed in the field for model validation, taking samples at the inlet and outlet of the FSP; this information was used to carry out CFD simulations of the three topologies. Evaluated mesh sizes ranged from 500,000 to 2,000,000 elements. The free-slip wall boundary condition showed good qualitative behavior, and the low-Reynolds κ–ε turbulence model yielded good results. The biomass contained in the FSP interferes with dispersion studies and should be taken into account in assessing the CFD modeling; the tracer injection times, the tracer concentration at the inlet, the effect of wind, and the flow models adopted as a basis for modeling are parameters to be taken into account for CFD model validation and calibration.
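Field tracer (dispersion) studies such as these are commonly reduced to residence-time-distribution moments before comparison with CFD output; a minimal sketch with an invented outlet tracer curve:

```python
def rtd_moments(times, conc):
    """Mean residence time and variance from an outlet tracer curve,
    using discrete moments of the residence-time distribution E(t):
        t_mean = sum(t*C) / sum(C)
        var    = sum((t - t_mean)^2 * C) / sum(C)
    Uniform time steps assumed; trapezoidal refinements omitted."""
    total = sum(conc)
    t_mean = sum(t * c for t, c in zip(times, conc)) / total
    var = sum((t - t_mean) ** 2 * c for t, c in zip(times, conc)) / total
    return t_mean, var

# Hypothetical outlet tracer concentrations sampled hourly
times = [0, 1, 2, 3, 4, 5, 6]
conc = [0.0, 0.2, 0.9, 1.0, 0.6, 0.2, 0.1]
t_mean, var = rtd_moments(times, conc)
print(round(t_mean, 2), round(var, 2))
```

The same moments computed from a simulated tracer pulse in the CFD model can then be compared against the field values as one quantitative validation check.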

  10. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: The paper discusses the validation of creep rupture models derived from statistical analysis. It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  11. A practical guide for operational validation of discrete simulation models

    Directory of Open Access Journals (Sweden)

    Fabiano Leal

    2011-04-01

    Full Text Available As the number of simulation experiments increases, the necessity for validation and verification of these models demands special attention on the part of simulation practitioners. Analysis of the current scientific literature shows that the descriptions of operational validation presented in many papers do not agree on the importance assigned to this process or on the techniques applied, whether subjective or objective. With the aim of guiding professionals, researchers and students in simulation, this article elaborates a practical guide through the compilation of statistical techniques for the operational validation of discrete simulation models. Finally, the guide's applicability was evaluated using two study objects, which represent two manufacturing cells, one from the automobile industry and the other from a Brazilian tech company. For each application, the guide identified distinct steps, due to the different aspects that characterize the analyzed distributions.
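
    As a minimal illustration of one objective technique such a guide typically compiles, the sketch below compares real and simulated output with Welch's t-test and a confidence interval on the mean difference. All data are synthetic assumptions, not the manufacturing-cell data from the study.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    real = rng.normal(loc=12.0, scale=1.5, size=30)   # measured cycle times (min), invented
    sim  = rng.normal(loc=12.2, scale=1.4, size=200)  # simulated cycle times (min), invented

    # Welch's t-test: does the simulated mean differ from the real-system mean?
    t_stat, p_value = stats.ttest_ind(real, sim, equal_var=False)

    # Approximate 95% confidence interval for the difference of means
    diff = real.mean() - sim.mean()
    se = np.sqrt(real.var(ddof=1) / real.size + sim.var(ddof=1) / sim.size)
    ci = (diff - 1.96 * se, diff + 1.96 * se)

    print(f"p = {p_value:.3f}, 95% CI for mean difference: ({ci[0]:.2f}, {ci[1]:.2f})")
    ```

    Failing to reject at the chosen significance level is then taken as (provisional) evidence of operational validity for that output measure.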

  12. QUANTIFYING FOREST ABOVEGROUND CARBON POOLS AND FLUXES USING MULTI-TEMPORAL LIDAR A report on field monitoring, remote sensing MMV, GIS integration, and modeling results for forestry field validation test to quantify aboveground tree biomass and carbon

    Energy Technology Data Exchange (ETDEWEB)

    Lee Spangler; Lee A. Vierling; Eva K. Stand; Andrew T. Hudak; Jan U.H. Eitel; Sebastian Martinuzzi

    2012-04-01

    Sound policy recommendations relating to the role of forest management in mitigating atmospheric carbon dioxide (CO2) depend upon establishing accurate methodologies for quantifying forest carbon pools for large tracts of land that can be dynamically updated over time. Light Detection and Ranging (LiDAR) remote sensing is a promising technology for achieving accurate estimates of aboveground biomass, and thereby carbon pools; however, little is known about the accuracy of estimating biomass change and carbon flux from repeat LiDAR acquisitions with different data sampling characteristics. In this study, discrete-return airborne LiDAR data were collected in 2003 and 2009 across ~20,000 hectares (ha) of an actively managed, mixed conifer forest landscape in northern Idaho, USA. Forest inventory plots were established via a stratified random sampling design and sampled in 2003 and 2009. The Random Forest machine learning algorithm was used to establish statistical relationships between inventory data and forest structural metrics derived from the LiDAR acquisitions. Aboveground biomass maps were created for the study area based on statistical relationships developed at the plot level. Over this 6-year period, we found that the mean increase in biomass due to forest growth across the non-harvested portions of the study area was 4.8 metric tons/hectare (Mg/ha). In these non-harvested areas, we found a significant difference in biomass increase among forest successional stages, with a higher biomass increase in mature and old forest compared to stand initiation and young forest. Approximately 20% of the landscape had been disturbed by harvest activities during the six-year time period, representing a biomass loss of >70 Mg/ha in these areas. During the study period, these harvest activities outweighed growth at the landscape scale, resulting in an overall loss in aboveground carbon at this site. The 30-fold increase in sampling density
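
    The plot-level modeling step described above can be sketched as follows. The LiDAR metric names, the synthetic biomass relationship and all values are illustrative assumptions, not the study's data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_plots = 120

    # Invented plot-level LiDAR structural metrics
    mean_height = rng.uniform(2, 30, n_plots)            # mean canopy height (m)
    cover = rng.uniform(0.1, 0.95, n_plots)              # canopy cover fraction
    p95 = mean_height * rng.uniform(1.1, 1.6, n_plots)   # 95th percentile height (m)
    X = np.column_stack([mean_height, cover, p95])

    # Invented field-measured aboveground biomass (Mg/ha)
    biomass = 8.0 * mean_height * cover + rng.normal(0, 10, n_plots)

    # Regress biomass on LiDAR metrics with Random Forest, report cross-validated R^2
    rf = RandomForestRegressor(n_estimators=300, random_state=0)
    r2 = cross_val_score(rf, X, biomass, cv=5, scoring="r2").mean()
    print(f"cross-validated R^2: {r2:.2f}")

    # A biomass map is then produced by applying the fitted model to the same
    # metrics computed on a grid across the LiDAR coverage.
    rf.fit(X, biomass)
    predictions = rf.predict(X)
    ```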

  13. Validation of the dermal exposure model in ECETOC TRA.

    Science.gov (United States)

    Marquart, Hans; Franken, Remy; Goede, Henk; Fransman, Wouter; Schinkel, Jody

    2017-08-01

    The ECETOC TRA model (presently version 3.1) is often used to estimate worker inhalation and dermal exposure in regulatory risk assessment. The dermal model in ECETOC TRA has not yet been validated by comparison with independent measured exposure levels. This was the goal of the present study. Measured exposure levels and relevant contextual information were gathered via literature search, websites of relevant occupational health institutes and direct requests for data to industry. Exposure data were clustered in so-called exposure cases, which are sets of data from one data source that are expected to have the same values for input parameters in the ECETOC TRA dermal exposure model. For each exposure case, the 75th percentile of measured values was calculated, because the model intends to estimate these values. The input values for the parameters in ECETOC TRA were assigned by an expert elicitation and consensus building process, based on descriptions of relevant contextual information. From more than 35 data sources, 106 useful exposure cases were derived, that were used for direct comparison with the model estimates. The exposure cases covered a large part of the ECETOC TRA dermal exposure model. The model explained 37% of the variance in the 75th percentiles of measured values. In around 80% of the exposure cases, the model estimate was higher than the 75th percentile of measured values. In the remaining exposure cases, the model estimate may not be sufficiently conservative. The model was shown to have a clear bias towards (severe) overestimation of dermal exposure at low measured exposure values, while all cases of apparent underestimation by the ECETOC TRA dermal exposure model occurred at high measured exposure values. This can be partly explained by a built-in bias in the effect of concentration of substance in product used, duration of exposure and the use of protective gloves in the model. 
The effect of protective gloves was calculated to be on average a
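
    A hedged sketch of the comparison logic described above: per exposure case, compute the 75th percentile of measured values and check whether the model estimate is at least as high. The data and the "model estimates" are invented for illustration; they are not TRA outputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Invented exposure cases: each holds measured dermal exposure values
    cases = {f"case_{i}": rng.lognormal(mean=rng.normal(0, 1), sigma=0.8, size=20)
             for i in range(10)}

    # Stand-in "model estimate" per case (NOT real TRA output; assumption)
    model_estimate = {k: float(np.median(v) * 3.0) for k, v in cases.items()}

    # 75th percentile of measurements per case, and conservativeness check
    p75 = {k: float(np.percentile(v, 75)) for k, v in cases.items()}
    conservative = [model_estimate[k] >= p75[k] for k in cases]
    frac = sum(conservative) / len(conservative)
    print(f"model >= P75 of measurements in {frac:.0%} of cases")
    ```

    The study's "around 80%" figure is exactly this fraction computed over the 106 real exposure cases.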

  14. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  15. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  16. A Comparison and Validation of Two Surface Ship Readiness Models

    Science.gov (United States)

    1994-09-01

    they cannot be considered validated. Any application of these programs without additional verification is at the risk of the user. … contains the SAS code that was used to perform the full model run for the SIM.

  17. Validating Work Discrimination and Coping Strategy Models for Sexual Minorities

    Science.gov (United States)

    Chung, Y. Barry; Williams, Wendi; Dispenza, Franco

    2009-01-01

    The purpose of this study was to validate and expand on Y. B. Chung's (2001) models of work discrimination and coping strategies among lesbian, gay, and bisexual persons. In semistructured individual interviews, 17 lesbians and gay men reported 35 discrimination incidents and their related coping strategies. Responses were coded based on Chung's…

  18. Validity of the Bersohn–Zewail model beyond justification

    DEFF Research Database (Denmark)

    Petersen, Jakob; Henriksen, Niels Engholm; Møller, Klaus Braagaard

    2012-01-01

    excellent agreement between the classical trajectory and the average position of the excited state wave packet. By investigating the approximations connecting the nuclear dynamics described by quantum mechanics and the BZ model, we conclude that this agreement goes far beyond the validity of the individual...

  19. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of

  20. Validating soil phosphorus routines in the SWAT model

    Science.gov (United States)

    Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routin...

  1. Development and validation of models for bubble coalescence and breakup

    International Nuclear Information System (INIS)

    Liao, Yiaxiang

    2013-01-01

    A generalized model for bubble coalescence and breakup has been developed, which is based on a comprehensive survey of existing theories and models. One important feature of the model is that all important mechanisms leading to bubble coalescence and breakup in a turbulent gas-liquid flow are considered. The new model is tested extensively in a 1D test solver and in the 3D CFD code ANSYS CFX for the case of vertical gas-liquid pipe flow under adiabatic conditions. Two kinds of extensions of the standard multi-fluid model, i.e. the discrete population model and the inhomogeneous MUSIG (multiple-size group) model, are available in the two solvers, respectively. These extensions, with suitable closure models such as those for coalescence and breakup, are able to predict the evolution of bubble size distribution in dispersed flows and to overcome the mono-dispersed flow limitation of the standard multi-fluid model. For the validation of the model the high-quality database of the TOPFLOW L12 experiments for air-water flow in a vertical pipe was employed. A wide range of test points, which cover the bubbly flow, turbulent-churn flow as well as the transition regime, is involved in the simulations. The comparison between the simulated results, such as bubble size distribution, gas velocity and volume fraction, and the measured ones indicates a generally good agreement for all selected test points. As the superficial gas velocity increases, the bubble size distribution evolves via coalescence-dominant regimes first, then breakup-dominant regimes, and finally turns into a bimodal distribution. The tendency of the evolution is well reproduced by the model. However, the tendency is almost always overestimated, i.e. too much coalescence in the coalescence-dominant cases and too much breakup in the breakup-dominant ones. The reason for this problem is discussed by studying the contribution of each coalescence and breakup mechanism at different test points. 
The redistribution of the

  2. Validating a perceptual distraction model in a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren

    2017-01-01

    using a simple loudspeaker setup, consisting of only two loudspeakers, one for the target sound source and the other for the interfering sound source. Recently, the model was successfully validated in a complex personal sound-zone system with speech-on-music stimuli. Second round of validations were...... conducted by physically altering the sound-zone system and running a set of new listening experiments utilizing two sound zones within the sound-zone system. Thus, validating the model using a different sound-zone system with both speech-on-music and music-on-speech stimuli sets. Preliminary results show...... the performance of personal sound-zone systems....

  3. Validating a perceptual distraction model in a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren

    2017-01-01

    This paper focuses on validating a perceptual distraction model, which aims to predict user’s perceived distraction caused by audio-on-audio interference, e.g., two competing audio sources within the same listening space. Originally, the distraction model was trained with music-on-music stimuli...... using a simple loudspeaker setup, consisting of only two loudspeakers, one for the target sound source and the other for the interfering sound source. Recently, the model was successfully validated in a complex personal sound-zone system with speech-on-music stimuli. Second round of validations were...... conducted by physically altering the sound-zone system and running a set of new listening experiments utilizing two sound zones within the sound-zone system. Thus, validating the model using a different sound-zone system with both speech-on-music and music-on-speech stimuli sets. Preliminary results show...

  4. Data Set for Empirical Validation of Double Skin Facade Model

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During the recent years the attention to the double skin facade (DSF) concept has greatly increased. Nevertheless, the application of the concept depends on whether a reliable model for simulation of the DSF performance will be developed or pointed out. This is, however, not possible to do, until...... the model is empirically validated and its limitations for the DSF modeling are identified. Correspondingly, the existence and availability of the experimental data are essential. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within...... of a double skin facade: 1. External air curtain mode, it is the naturally ventilated DSF cavity with the top and bottom openings open to the outdoor; 2. Thermal insulation mode, when all of the DSF openings closed; 3. Preheating mode, with the bottom DSF openings open to the outdoor and top openings open...

  5. Characterization Report on Fuels for NEAMS Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Gofryk, Krzysztof [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Nearly 20% of the world’s electricity today is generated by nuclear energy from uranium dioxide (UO2) fuel. The thermal conductivity of UO2 governs the conversion of heat produced from fission events into electricity and it is an important parameter in reactor design and safety. While nuclear fuel operates at high to very high temperatures, thermal conductivity and other materials properties lack sensitivity to temperature variations and to material variations at reactor temperatures. As a result, both the uncertainties in laboratory measurements at high temperatures and the small differences in properties of different materials inevitably lead to large uncertainties in models and little predictive power. Conversely, properties measured at low to moderate temperatures have more sensitivity, less uncertainty, and have larger differences in properties for different materials. These variations need to be characterized as they will afford the highest predictive capability in modeling and offer best assurances for validation and verification at all temperatures. This is well emphasized in the temperature variation of the thermal conductivity of UO2.

  6. Estimation of precipitable water vapour from GPS measurements in Argentina: Validation and qualitative analysis of results

    Science.gov (United States)

    Fernández, L. I.; Salio, P.; Natali, M. P.; Meza, A. M.

    2010-10-01

    This paper presents PWV estimates from GPS data computed at four continuously operated GPS stations in Argentina, established at Buenos Aires, Córdoba, Rosario and Salta, over a 1-year period (2006-2007). The objective is to analyze the behaviour of the GPS PWV estimation using mean tropospheric temperature (Tm) values from the Bevis model, from the Sapucci model, and obtained by a numerical integration of variables provided by the operational analysis of the National Centers for Environmental Prediction (NCEP). The results are validated using PWV values from the nearest radio soundings. Moreover, a comparison between PWV values determined from microwave sensors deployed on the NOAA-18 satellite and PWV from GPS observations is also presented. From the analysis we can see that the computation of GPS PWV using the Tm from the Bevis model, originally deduced for the northern hemisphere, shows similar behaviour to the respective computation using the Sapucci model, to within 0.5 mm. The differences between the Tm values computed from the Sapucci model and the numerical integration of NCEP variables are of the order of 15 K, although this does not represent a significant error in PWV. Nevertheless, differences in bias are imperceptible during the dry period and as large as 3 mm during the moist, high-precipitation period. This behaviour may not represent an improvement when comparing radio soundings with the GPS PWV values obtained using different estimations of Tm. Thus, we conclude that the usage of Tm estimated from the Bevis model is the best choice for regional studies, considering the simplicity and dissemination of the method, unless further studies taking into account the geographical and climatological characteristics of the region are performed. As expected, GPS PWV values show very good agreement with radio sounding determinations; small differences can be observed, especially during extreme precipitation periods. In general the NOAA PWV values denote an over
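
    For reference, the widely quoted Bevis (1992) Tm model and the standard conversion from zenith wet delay (ZWD) to PWV can be sketched as below. The constants are the commonly cited Bevis values, and the surface temperature is an example input, not data from this study.

    ```python
    # Physical constants (commonly quoted Bevis values; assumptions for this sketch)
    RHO_W = 1000.0      # density of liquid water, kg/m^3
    RV = 461.5          # specific gas constant of water vapour, J/(kg K)
    K3 = 3.739e5        # refractivity constant, K^2/hPa
    K2_PRIME = 22.1     # refractivity constant, K/hPa

    def tm_bevis(ts_kelvin):
        """Mean tropospheric temperature from surface temperature (Bevis 1992)."""
        return 70.2 + 0.72 * ts_kelvin

    def pwv_from_zwd(zwd_mm, ts_kelvin):
        """Convert GPS zenith wet delay (mm) to precipitable water vapour (mm)."""
        tm = tm_bevis(ts_kelvin)
        # (K3/Tm + K2') is in K/hPa; divide by 100 to get K/Pa for SI consistency
        pi = 1e6 / (RHO_W * RV * (K3 / tm + K2_PRIME) / 100.0)
        return pi * zwd_mm

    ts = 288.0  # 15 degC surface temperature (example input)
    print(f"Tm = {tm_bevis(ts):.1f} K")
    print(f"PWV for 150 mm ZWD: {pwv_from_zwd(150.0, ts):.1f} mm")
    ```

    The resulting conversion factor is close to the familiar rule of thumb PWV ≈ 0.15 × ZWD; the study's point is that the choice of Tm model perturbs this factor only slightly.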

  7. ALTWAVE: Toolbox for use of satellite L2P altimeter data for wave model validation

    Science.gov (United States)

    Appendini, Christian M.; Camacho-Magaña, Víctor; Breña-Naranjo, José Agustín

    2016-03-01

    To characterize some of the world's ocean physical processes such as wave height, wind speed and sea surface elevation is a major need for coastal and marine infrastructure planning and design, tourism activities, wave power and storm surge risk assessment, among others. Over the last decades, satellite remote sensing tools have provided quasi-global measurements of ocean altimetry by merging data from different satellite missions. While altimeter data are widely used for model validation, practical tools for this purpose remain scarce. Our purpose is to fill this gap by introducing ALTWAVE, a user-oriented MATLAB toolbox for oceanographers and coastal engineers developed to validate wave model results against satellite-derived altimetry, based on visual features and statistical estimates. Our toolbox uses altimetry information from the GlobWave initiative, and provides a sample application to validate a one-year wave hindcast for the Gulf of Mexico. ALTWAVE offers an effective toolbox to validate wave model results using altimeter data, as well as guidance for non-experienced satellite data users. This article is intended for wave modelers with no experience using altimeter data to validate their results.
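
    The validation statistics such a toolbox typically reports (bias, RMSE, scatter index, correlation) can be sketched as follows, assuming model and altimeter significant wave heights (Hs) have already been collocated along track. The arrays here are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Invented collocated pairs: altimeter Hs and model Hs (m)
    hs_alt = rng.uniform(0.5, 4.0, 500)
    hs_mod = hs_alt * 0.95 + rng.normal(0, 0.2, 500)

    bias = np.mean(hs_mod - hs_alt)
    rmse = np.sqrt(np.mean((hs_mod - hs_alt) ** 2))
    # Scatter index: RMSE of the debiased error, normalised by the mean observation
    si = np.sqrt(np.mean(((hs_mod - bias) - hs_alt) ** 2)) / np.mean(hs_alt)
    r = np.corrcoef(hs_mod, hs_alt)[0, 1]

    print(f"bias={bias:+.3f} m  RMSE={rmse:.3f} m  SI={si:.3f}  r={r:.3f}")
    ```

    These scalar summaries complement the visual (scatter and along-track) comparisons the toolbox produces.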

  8. Results of steel containment vessel model test

    International Nuclear Information System (INIS)

    Luk, V.K.; Ludwigsen, J.S.; Hessheimer, M.F.; Komine, Kuniaki; Matsumoto, Tomoyuki; Costello, J.F.

    1998-05-01

    A series of static overpressurization tests of scale models of nuclear containment structures is being conducted by Sandia National Laboratories for the Nuclear Power Engineering Corporation of Japan and the US Nuclear Regulatory Commission. Two tests are being conducted: (1) a test of a model of a steel containment vessel (SCV) and (2) a test of a model of a prestressed concrete containment vessel (PCCV). This paper summarizes the conduct of the high pressure pneumatic test of the SCV model and the results of that test. Results of this test are summarized and are compared with pretest predictions performed by the sponsoring organizations and others who participated in a blind pretest prediction effort. Questions raised by this comparison are identified and plans for posttest analysis are discussed

  9. HELOKA-HP thermal-hydraulic model validation and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Xue Zhou; Ghidersa, Bradut-Eugen; Badea, Aurelian Florin

    2016-11-01

    Highlights: • The electrical heater in HELOKA-HP has been modeled with RELAP5-3D using experimental data as input. • The model has been validated using novel techniques for assimilating experimental data and the representative model parameters with BEST-EST. • The methodology is successfully used for reducing the model uncertainties and provides a quantitative measure of the consistency between the experimental data and the model. - Abstract: The Helium Loop Karlsruhe High Pressure (HELOKA-HP) is an experimental facility for the testing of various helium-cooled components at high temperature (500 °C) and high pressure (8 MPa) for nuclear fusion applications. For modeling the loop thermal dynamics, a thermal-hydraulic model has been created using the system code RELAP5-3D. Recently, new experimental data covering the behavior of the loop components under relevant operational conditions have been made available giving the possibility of validating and calibrating the existing models in order to reduce the uncertainties of the simulated responses. This paper presents an example where such process has been applied for the HELOKA electrical heater model. Using novel techniques for assimilating experimental data, implemented in the computational module BEST-EST, the representative parameters of the model have been calibrated.
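
    As a hedged scalar analogue of the calibration idea (not the actual BEST-EST algorithm), a Gaussian conjugate update shows how assimilating measurements shrinks a model parameter's uncertainty. The heater-efficiency parameter and all numbers are invented.

    ```python
    import numpy as np

    # Prior belief about a model parameter (invented: a heater efficiency)
    prior_mean, prior_var = 0.90, 0.02 ** 2

    rng = np.random.default_rng(5)
    obs = rng.normal(0.935, 0.01, size=12)   # invented "measurements" of the parameter
    obs_var = 0.01 ** 2

    # Gaussian conjugate update: posterior precision = prior precision + n * data precision
    post_var = 1.0 / (1.0 / prior_var + obs.size / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)

    print(f"prior:     {prior_mean:.3f} +/- {prior_var ** 0.5:.3f}")
    print(f"posterior: {post_mean:.3f} +/- {post_var ** 0.5:.3f}")
    ```

    The posterior standard deviation is much smaller than the prior's, which is the quantitative sense in which data assimilation "reduces model uncertainties" in the abstract.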

  10. Hierarchical Multi-Scale Approach To Validation and Uncertainty Quantification of Hyper-Spectral Image Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David; Thompson, Sandra E.

    2016-09-17

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  11. Validation Test For The Result Of Neutron Activation Analysis With Standard Reference Material

    International Nuclear Information System (INIS)

    Rina M, Th; Wardani, Sri

    2001-01-01

    Validity testing with standard reference materials is necessary to ensure the quality of sample analysis results. The analysis of CRM No. 8 (Vehicle Exhaust Particulates) and SRM 1646a (Estuary Sediment) has been carried out at P2TRR. These analyses are intended to validate the analysis results produced in the NAA laboratory of P2TRR. For CRM No. 8, a subset of the 30 certified elements was successfully analyzed; meanwhile, for SRM 1646a, 21 of the 39 certified elements were completely investigated. The quantitative analysis showed relative differences of 2%-15% compared to the certified values.
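
    The acceptance check described, a relative difference against the certified value, can be sketched as below. The element concentrations and the 15% threshold are invented for illustration.

    ```python
    def relative_difference(measured, certified):
        """Relative difference in percent against the certified value."""
        return abs(measured - certified) / certified * 100.0

    # Hypothetical (measured, certified) concentrations in mg/kg
    results = {"Fe": (20100.0, 20080.0), "Zn": (48.9, 51.0), "Cr": (40.1, 40.9)}

    for elem, (meas, cert) in results.items():
        rd = relative_difference(meas, cert)
        status = "OK" if rd <= 15.0 else "re-check"
        print(f"{elem}: {rd:.1f}% -> {status}")
    ```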

  12. Alaska North Slope Tundra Travel Model and Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Harry R. Bader; Jacynthe Guimond

    2006-03-01

    lack of variability in snow depth cover throughout the period of field experimentation. The amount of change in disturbance indicators was greater in the tundra communities of the Foothills than in those of the Coastal Plain. However, the overall level of change in both community types was less than expected. In Coastal Plain communities, ground hardness and snow slab thickness were found to play an important role in change in active layer depth and soil moisture as a result of treatment. In the Foothills communities, snow cover had the most influence on active layer depth and soil moisture as a result of treatment. Once certain minimum thresholds for ground hardness, snow slab thickness, and snow depth were attained, it appeared that little or no additive effect was realized regarding increased resistance to disturbance in the tundra communities studied. DNR used the results of this modeling project to set a standard for maximum permissible disturbance of cross-country tundra travel, with the threshold set below the widely accepted standard of Low Disturbance levels (as determined by the U.S. Fish and Wildlife Service). DNR followed the modeling project with a validation study, which seemed to support the field trial conclusions and indicated that the standard set for maximum permissible disturbance exhibits a conservative bias in favor of environmental protection. Finally, DNR established a quick and efficient tool for visual estimations of disturbance to determine when investment in field measurements is warranted. This Visual Assessment System (VAS) seemed to support the plot disturbance measurements taken during the modeling and validation phases of this project.

  13. Infrared ship signature prediction, model validation and sky radiance

    NARCIS (Netherlands)

    Neele, F.P.

    2005-01-01

    The increased interest during the last decade in the infrared signature of (new) ships results in a clear need for validated infrared signature prediction codes. This paper presents the results of comparing an in-house developed signature prediction code with measurements made in the 3-5 μm band in

  14. FDA 2011 process validation guidance: lifecycle compliance model.

    Science.gov (United States)

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  15. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  16. Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.

    Science.gov (United States)

    Woodward, S J

    2001-09-01

    The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.

  17. Validation of Occupants’ Behaviour Models for Indoor Quality Parameter and Energy Consumption Prediction

    DEFF Research Database (Denmark)

    Fabi, Valentina; Sugliano, Martina; Andersen, Rune Korsholm

    2015-01-01

For this reason, the validation of occupants' behavioral models is an issue that is gaining importance. In this paper validation was carried out through dynamic Building Energy Performance simulation (BEPS); behavioral models of windows opening and thermostat set-points published in the literature were implemented...... in a dynamic BEPS software and the obtained results in terms of temperature, relative humidity and CO2 concentration were compared to real measurements. Through this comparison it will be possible to verify the accuracy of the implemented behavioral models. The models were able to reproduce the general...

  18. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  19. Validation of a Wave-Body Interaction Model by Experimental Tests

    DEFF Research Database (Denmark)

    Ferri, Francesco; Kramer, Morten; Pecher, Arthur

    2013-01-01

Within the wave energy field, numerical simulation has recently gained worldwide acceptance as a useful tool alongside physical model testing. The main goal of this work is the validation of a numerical model against experimental results. The numerical model is based on a linear wave-body interaction theory, applied for a point absorber wave energy converter. The results show that the ratio floater size/wave amplitude is a key parameter for the validity of the applied theory....

  20. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data.

    Science.gov (United States)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L; Al, Maiwenn J

    2017-09-01

    The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of graphs and point estimates. To discuss the applicability of existing validation techniques and to present a new method for quantifying the degrees of validity statistically, which is useful for decision makers. A new Bayesian method is proposed to determine how well HE model outcomes compare with empirical data. Validity is based on a pre-established accuracy interval in which the model outcomes should fall. The method uses the outcomes of a probabilistic sensitivity analysis and results in a posterior distribution around the probability that HE model outcomes can be regarded as valid. We use a published diabetes model (Modelling Integrated Care for Diabetes based on Observational data) to validate the outcome "number of patients who are on dialysis or with end-stage renal disease." Results indicate that a high probability of a valid outcome is associated with relatively wide accuracy intervals. In particular, 25% deviation from the observed outcome implied approximately 60% expected validity. Current practice in HE model validation can be improved by using an alternative method based on assessing whether the model outcomes fit to empirical data at a predefined level of accuracy. This method has the advantage of assessing both model bias and parameter uncertainty and resulting in a quantitative measure of the degree of validity that penalizes models predicting the mean of an outcome correctly but with overly wide credible intervals. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
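The core computation in the record above — checking whether probabilistic sensitivity analysis (PSA) outcomes fall inside a pre-established accuracy interval — can be sketched briefly. This is a simplified frequency estimate with invented numbers, not the paper's full Bayesian posterior or its diabetes-model data:

```python
import numpy as np

def validity_probability(psa_outcomes, observed, deviation=0.25):
    """Fraction of PSA samples inside a +/- `deviation` accuracy interval
    around the observed (empirical) outcome.  A simple frequency estimate;
    the paper wraps this quantity in a full Bayesian posterior."""
    psa_outcomes = np.asarray(psa_outcomes, dtype=float)
    lower = observed * (1.0 - deviation)
    upper = observed * (1.0 + deviation)
    return float(((psa_outcomes >= lower) & (psa_outcomes <= upper)).mean())

# Hypothetical PSA draws for an outcome such as "patients on dialysis or with ESRD"
rng = np.random.default_rng(7)
draws = rng.normal(loc=100.0, scale=20.0, size=10_000)
p_valid = validity_probability(draws, observed=100.0)
```

With a ±25% interval, as in the abstract's example, the fraction of draws inside the interval approximates the probability that the model outcome can be regarded as valid.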

  1. Cultural adaptation and validation of an instrument on barriers for the use of research results

    Directory of Open Access Journals (Sweden)

    Maria Beatriz Guimarães Ferreira

Full Text Available ABSTRACT Objective: to culturally adapt The Barriers to Research Utilization Scale and to analyze the metric validity and reliability properties of its Brazilian Portuguese version. Method: methodological research conducted by means of the cultural adaptation process (translation and back-translation), face and content validity, construct validity (dimensionality and known groups), and reliability analysis (internal consistency and test-retest). The sample consisted of 335 nurses, of whom 43 participated in the retest phase. Results: the validity of the adapted version of the instrument was confirmed. The scale investigates the barriers to the use of research results in clinical practice. Confirmatory factor analysis demonstrated that the Brazilian Portuguese version of the instrument is adequately adjusted to the dimensional structure the scale authors originally proposed. Statistically significant differences were observed among the nurses holding a Master's or Doctoral degree, with characteristics favorable to Evidence-Based Practice, and working at an institution with an organizational culture that targets this approach. The reliability showed a strong correlation (r ranging between 0.77 and 0.84, p<0.001) and the internal consistency was adequate (Cronbach's alpha ranging between 0.77 and 0.82). Conclusion: the Brazilian Portuguese version of The Barriers Scale was valid and reliable in the group studied.
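The internal-consistency statistic reported above (Cronbach's alpha) follows a standard formula over an item-score matrix. A minimal sketch on hypothetical Likert responses (the data below are invented for illustration, not the study's nurse sample):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical Likert responses: 6 respondents x 4 scale items
scores = np.array([[4, 4, 5, 4],
                   [2, 3, 2, 2],
                   [5, 5, 4, 5],
                   [3, 3, 3, 4],
                   [1, 2, 1, 2],
                   [4, 5, 4, 4]])
alpha = cronbach_alpha(scores)
```

Values in the 0.77-0.82 range reported by the study indicate adequate internal consistency; perfectly correlated items drive alpha toward 1.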

  2. Interpreting Results from the Multinomial Logit Model

    DEFF Research Database (Denmark)

    Wulff, Jesper

    2015-01-01

This article provides guidelines and illustrates practical steps necessary for an analysis of results from the multinomial logit model (MLM). The MLM is a popular model in the strategy literature because it allows researchers to examine strategic choices with multiple outcomes. However, there seem...... to be systematic issues with regard to how researchers interpret their results when using the MLM. In this study, I present a set of guidelines critical to analyzing and interpreting results from the MLM. The procedure involves intuitive graphical representations of predicted probabilities and marginal effects...... suitable for both interpretation and communication of results. The practical steps are illustrated through an application of the MLM to the choice of foreign market entry mode....
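The predicted probabilities and finite-difference marginal effects the article recommends can be computed directly from multinomial logit coefficients. A sketch with hypothetical entry-mode coefficients (all numbers and covariate names below are illustrative assumptions, not the article's estimates):

```python
import numpy as np

def mnl_probabilities(X, B):
    """Choice probabilities for a multinomial logit model.
    X: (n, k) covariate matrix; B: (k, m) coefficients, with the base
    category's column fixed at zero."""
    u = X @ B                              # linear predictors, one column per outcome
    u = u - u.max(axis=1, keepdims=True)   # guard against overflow
    e = np.exp(u)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical coefficients for 3 entry modes (export / JV / wholly owned)
B = np.array([[0.0,  0.5, -0.2],   # intercepts (base category fixed at 0)
              [0.0,  1.2,  0.4],   # firm size
              [0.0, -0.8,  0.9]])  # host-country risk
X = np.array([[1.0, 0.3, 0.1]])    # one firm: constant, size, risk
probs = mnl_probabilities(X, B)

# Discrete-change effect of firm size, a finite-difference "marginal effect"
X_hi = X.copy()
X_hi[0, 1] += 0.1
me_size = (mnl_probabilities(X_hi, B) - probs) / 0.1
```

Because the probabilities sum to one, the marginal effects across outcomes sum to zero — the graphical displays the article advocates plot exactly these quantities over a grid of covariate values.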

  3. Validation of the WIMSD4M cross-section generation code with benchmark results

    Energy Technology Data Exchange (ETDEWEB)

    Deen, J.R.; Woodruff, W.L. [Argonne National Lab., IL (United States); Leal, L.E. [Oak Ridge National Lab., TN (United States)

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D{sub 2}O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  4. Validation of the WIMSD4M cross-section generation code with benchmark results

    International Nuclear Information System (INIS)

    Deen, J.R.; Woodruff, W.L.; Leal, L.E.

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D 2 O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented

  5. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    Science.gov (United States)

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criterion developed following the threshold approach is not only a function of the Comparison Error, E (which is the difference between experiments and simulations) but also takes into account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be
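The threshold-based comparison described above — a Student's t-test on |S-D| against |Threshold-S| — can be sketched as follows. The shear-stress values and the 600 Pa threshold are invented for illustration, and the one-sided conversion is my reading of the test, not code from the paper:

```python
import numpy as np
from scipy import stats

def threshold_validation_p(sim, exp, threshold):
    """One-sided Welch t-test for H1: the comparison error |S - D| is
    smaller on average than the margin to the safety threshold |T - S|."""
    sim, exp = np.asarray(sim, float), np.asarray(exp, float)
    error = np.abs(sim - exp)          # |S - D|
    margin = np.abs(threshold - sim)   # |Threshold - S|
    t, p_two = stats.ttest_ind(error, margin, equal_var=False)
    # convert the two-sided p-value to one-sided (error < margin)
    return p_two / 2.0 if t < 0 else 1.0 - p_two / 2.0

rng = np.random.default_rng(0)
sim = rng.normal(300.0, 10.0, 50)      # simulated viscous shear stress (Pa), hypothetical
exp = sim + rng.normal(0.0, 5.0, 50)   # experiments scattered around the simulations
p = threshold_validation_p(sim, exp, threshold=600.0)  # assumed hemolysis threshold
```

A small p-value here says the simulation-experiment disagreement is negligible relative to the distance from the safety threshold, which is the sense in which the paper calls the Re = 3500 datasets "statistically similar".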

  6. AEROTAXI ground static test and finite element model validation

    Directory of Open Access Journals (Sweden)

    Radu BISCA

    2011-06-01

Full Text Available In this presentation, we will concentrate on typical Ground Static Test (GST) and Finite Element (FE) software comparisons. It is necessary to note that standard GSTs are obligatory for any new aircraft configuration. We can mention here the investigations of the AeroTAXITM, a small aircraft configuration, using PRODERA® equipment. A Finite Element Model (FEM) of the AeroTAXITM has been developed in PATRAN/NASTRAN®, partly from a previous ANSYS® model. The FEM can be used to investigate potential structural modifications or changes with realistic component corrections. Model validation should be part of every modern engineering analysis and quality assurance procedure.

  7. Circumplex Model VII: validation studies and FACES III.

    Science.gov (United States)

    Olson, D H

    1986-09-01

    This paper reviews some of the recent empirical studies validating the Circumplex Model and describes the newly developed self-report measure, FACES III. Studies testing hypotheses derived from the Circumplex Model regarding the three dimensions of cohesion, change, and communication are reviewed. Case illustrations using FACES III and the Clinical Rating Scale are presented. These two assessment tools can be used for making a diagnosis of family functioning and for assessing changes over the course of treatment. This paper reflects the continuing attempt to develop further the Circumplex Model and to bridge more adequately research, theory, and practice.

  8. Modeling and Validation across Scales: Parametrizing the effect of the forested landscape

    DEFF Research Database (Denmark)

    Dellwik, Ebba; Badger, Merete; Angelou, Nikolas

    be transferred into a parametrization of forests in wind models. The presentation covers three scales: the single tree, the forest edges and clearings, and the large-scale forested landscape in which the forest effects are parameterized with a roughness length. Flow modeling results and validation against...

  9. Development and Validation of a Path Analytic Model of Students' Performance in Chemistry.

    Science.gov (United States)

    Anamuah-Mensah, Jophus; And Others

    1987-01-01

Reported the development and validation of an integrated model of performance on chemical concept-volumetric analysis. The model was tested on 265 chemistry students in eight schools. Results indicated that for subjects using algorithms without understanding, performance on volumetric analysis problems was not influenced by proportional reasoning…

  10. Development and Validation of the Controller Acceptance Rating Scale (CARS): Results of Empirical Research

    Science.gov (United States)

    Lee, Katharine K.; Kerns, Karol; Bone, Randall

    2001-01-01

    The measurement of operational acceptability is important for the development, implementation, and evolution of air traffic management decision support tools. The Controller Acceptance Rating Scale was developed at NASA Ames Research Center for the development and evaluation of the Passive Final Approach Spacing Tool. CARS was modeled after a well-known pilot evaluation rating instrument, the Cooper-Harper Scale, and has since been used in the evaluation of the User Request Evaluation Tool, developed by MITRE's Center for Advanced Aviation System Development. In this paper, we provide a discussion of the development of CARS and an analysis of the empirical data collected with CARS to examine construct validity. Results of intraclass correlations indicated statistically significant reliability for the CARS. From the subjective workload data that were collected in conjunction with the CARS, it appears that the expected set of workload attributes was correlated with the CARS. As expected, the analysis also showed that CARS was a sensitive indicator of the impact of decision support tools on controller operations. Suggestions for future CARS development and its improvement are also provided.

  11. Modeling and experimental validation of the desiccant wheel in a hybrid desiccant air conditioning system

    International Nuclear Information System (INIS)

    Wrobel, Jan; Morgenstern, Paula; Schmitz, Gerhard

    2013-01-01

Modeling can be a strong asset to the operation of air conditioning plants, taking into account, e.g., the strong dependency of HVAC system operation on local climate conditions. This paper presents a validated physical model of a desiccant wheel, the central part of a hybrid air conditioning system, together with a simplified model based on the results of the physical model. The two models offer different advantages: the physical model is complex and can be adapted flexibly to different wheel dimensions, desiccant materials or climatic conditions, while the simplified model requires no knowledge of the underlying equations or modeling language and can be used for a first assessment of the potential of a desiccant cooling system in a certain location or within online control systems. The coexistence of both models ensures that information tailored to the users' needs is made available. The validity of the physical model, and therewith the simplified model, is ensured through comparison with measurements obtained from a hybrid air conditioning system situated in northern Europe. The demonstration plant combines the advantages of a dedicated outdoor air system (DOAS) with those of the common hybrid desiccant system to allow for energy-efficient air conditioning in one installation. The availability of primary measurement data is extremely valuable to the process of model validation, because knowledge about uncertainties and bias in measurement data, unlikely to be known for secondary data, can be used to understand and validate model results. A comparison of simulation results from the physical model to measurement data from the demonstration plant shows good agreement for a typical day of wheel operation after adjusting relevant model parameters. -- Highlights: ► The desiccant wheel as core component of a highly efficient HVAC pilot installation based on renewable energies. ► Modeling and experimental validation of a desiccant wheel

  12. Experimental validation of the twins prediction program for rolling noise. Pt.2: results

    NARCIS (Netherlands)

    Thompson, D.J.; Fodiman, P.; Mahé, H.

    1996-01-01

    Two extensive measurement campaigns have been carried out to validate the TWINS prediction program for rolling noise, as described in part 1 of this paper. This second part presents the experimental results of vibration and noise during train pass-bys and compares them with predictions from the

  13. Planck intermediate results I. Further validation of new Planck clusters with XMM-Newton

    DEFF Research Database (Denmark)

    Aghanim, N.; Collaboration, Planck; Arnaud, M.

    2012-01-01

    We present further results from the ongoing XMM-Newton validation follow-up of Planck cluster candidates, detailing X-ray observations of eleven candidates detected at a signal-to-noise ratio of 4.5

  14. Validation and intercomparison of Persistent Scatterers Interferometry: PSIC4 project results

    NARCIS (Netherlands)

    Raucoules, D.; Bourgine, B.; Michele, M. de; Le Cozannet, G.; Closset, L.; Bremmer, C.; Veldkamp, H.; Tragheim, D.; Bateson, L.; Crosetto, M.; Agudo, M.; Engdahl, M.

    2009-01-01

    This article presents the main results of the Persistent Scatterer Interferometry Codes Cross Comparison and Certification for long term differential interferometry (PSIC4) project. The project was based on the validation of the PSI (Persistent Scatterer Interferometry) data with respect to

  15. Development and validation of a building design waste reduction model.

    Science.gov (United States)

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden); Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)

    2001-12-01

The present document addresses validation of the stochastic continuum model HYDRASTAR, designed for Monte Carlo simulations of groundwater flow in fractured rocks. Here, validation is defined as a process to demonstrate that a model concept is fit for its purpose. Preferably, the validation is carried out by comparison of model predictions with independent field observations and experimental measurements. In addition, other sources can also be used to confirm that the model concept gives acceptable results. One method is to compare results with those achieved using other model concepts for the same set of input data. Another method is to compare model results with analytical solutions. The model concept HYDRASTAR has been used in several studies, including performance assessments of hypothetical repositories for spent nuclear fuel. In the performance assessments, the main tasks for HYDRASTAR have been to calculate groundwater travel time distributions, repository flux distributions, path lines and their exit locations. The results have then been used by other model concepts to calculate the near field release and far field transport. The aim and framework of the validation process include describing the applicability of the model concept for its purpose in order to build confidence in the concept. Preferably, this is done by comparison of simulation results with the corresponding field experiments or field measurements. Here, two comparisons with experimental results are reported. In both cases the agreement was reasonably fair. In the broader and more general context of the validation process, HYDRASTAR results have been compared with other models and analytical solutions. Commonly, the approximation calculations agree well with the medians of model ensemble results. Additional indications that HYDRASTAR is suitable for its purpose were obtained from the comparisons with results from other model concepts. Several verification studies have been made for

  17. Prediction and Validation of Heat Release Direct Injection Diesel Engine Using Multi-Zone Model

    Science.gov (United States)

    Anang Nugroho, Bagus; Sugiarto, Bambang; Prawoto; Shalahuddin, Lukman

    2014-04-01

The objective of this study is to develop a simulation model capable of predicting the heat release of diesel combustion accurately and in efficient computation time. A multi-zone packet model has been applied to solve the combustion phenomena inside the diesel cylinder. The model formulations are presented first, and then the numerical results are validated on a single cylinder direct injection diesel engine at various engine speeds and injection timings. The model was found to be promising in fulfilling the objective above.

  18. Integral Reactor Containment Condensation Model and Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Qiao [Oregon State Univ., Corvallis, OR (United States); Corradini, Michael [Univ. of Wisconsin, Madison, WI (United States)

    2016-05-02

This NEUP-funded project, NEUP 12-3630, comprises experimental, numerical and analytical studies on high-pressure steam condensation phenomena in a steel containment vessel connected to a water cooling tank, carried out at Oregon State University (OrSU) and the University of Wisconsin at Madison (UW-Madison). Over the three-year investigation, the tasks planned in the original proposal were completed: (1) Performed a scaling study for the full pressure test facility applicable to the reference design for the condensation heat transfer process during design basis accidents (DBAs), modified the existing test facility to route the steady-state secondary steam flow into the high pressure containment for controllable condensation tests, and extended the operations at negative gage pressure conditions (OrSU). (2) Conducted a series of DBA and quasi-steady experiments using the full pressure test facility to provide a reliable high pressure condensation database (OrSU). (3) Analyzed experimental data and evaluated the condensation model for the experimental conditions, and predicted the prototypic containment performance under accident conditions (UW-Madison). A film flow model was developed for the scaling analysis, and the results suggest that the 1/3-scaled test facility covers a large portion of the laminar film flow regime, leading to a lower average heat transfer coefficient compared to the prototypic value. Although this is conservative in reactor safety analysis, the significant reduction of the heat transfer coefficient (50%) could underestimate the prototypic condensation heat transfer rate, resulting in inaccurate prediction of the decay heat removal capability. Further investigation is thus needed to quantify the scaling distortion for safety analysis code validation. Experimental investigations were performed in the existing MASLWR test facility at OrSU with minor modifications. A total of 13 containment condensation tests were conducted for pressure

  19. Validation of Pressure Drop Models for PHWR-type Fuel Elements

    International Nuclear Information System (INIS)

    Brasnarof Daniel; Daverio, H.

    2003-01-01

In the present work a one-dimensional pressure drop analytical model and the COBRA code are validated with experimental data of CANDU and Atucha fuel bundles in low and high pressure experimental test loops. The models show very good agreement with the experimental data, with less than 5% discrepancy. The analytical model results were compared with COBRA code results, showing small differences between them in a wide range of pressure, temperature and mass flow

  20. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    Directory of Open Access Journals (Sweden)

    Anna Gomes

Full Text Available Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to
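The MDPE and MDAPE figures quoted in this record are medians of percentage prediction errors. A minimal sketch (the serum-level numbers below are invented for illustration, not the study's cohort data):

```python
import numpy as np

def mdpe_mdape(predicted, observed):
    """Median Prediction Error (bias, %) and Median Absolute Prediction
    Error (precision, %), relative to the observed values."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    pe = (predicted - observed) / observed * 100.0
    return float(np.median(pe)), float(np.median(np.abs(pe)))

# Hypothetical gentamicin serum levels (mg/L): model predictions vs. measurements
observed = np.array([8.0, 6.5, 10.2, 4.1, 7.3])
predicted = np.array([7.8, 6.8, 9.9, 4.3, 7.1])
mdpe, mdape = mdpe_mdape(predicted, observed)
```

MDPE near zero indicates little systematic bias, while MDAPE summarizes typical prediction error magnitude — the sense in which the study compares the endocarditis, intensive-care and standard models.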

  1. CFD Modeling and Experimental Validation of a Solar Still

    Directory of Open Access Journals (Sweden)

    Mahmood Tahir

    2017-01-01

Full Text Available Earth is the densest planet of the solar system, with a total area of 510.072 million square km. Over 71.68% of this area is covered with water, leaving a scant 28.32% for humans to inhabit. Fresh water accounts for only 2.5% of the total volume and the rest is brackish water. Presently, the world faces the critical problem of a lack of potable water. This issue can be addressed by converting brackish water into potable water through a solar distillation process, and the solar still is designed specifically for this purpose. The efficiency of a solar still depends explicitly on its design parameters, such as wall material, chamber depth, width and slope of the condensing surface. This study was aimed at investigating the solar still parameters using CFD modeling and experimental validation. The simulation data of ANSYS-FLUENT was compared with actual experimental data. A close agreement between the simulated and experimental results was seen in the presented work. It reveals that ANSYS-FLUENT is a potent tool to analyse the efficiency of new designs of solar distillation systems.

  2. Cross validation for the classical model of structured expert judgment

    International Nuclear Information System (INIS)

    Colson, Abigail R.; Cooke, Roger M.

    2017-01-01

    We update the 2008 TU Delft structured expert judgment database with data from 33 professionally contracted Classical Model studies conducted between 2006 and March 2015 to evaluate its performance relative to other expert aggregation models. We briefly review alternative mathematical aggregation schemes, including harmonic weighting, before focusing on linear pooling of expert judgments with equal weights and performance-based weights. Performance weighting outperforms equal weighting in all but 1 of the 33 studies in-sample. True out-of-sample validation is rarely possible for Classical Model studies, and cross validation techniques that split calibration questions into a training and test set are used instead. Performance weighting incurs an “out-of-sample penalty” and its statistical accuracy out-of-sample is lower than that of equal weighting. However, as a function of training set size, the statistical accuracy of performance-based combinations reaches 75% of the equal weight value when the training set includes 80% of calibration variables. At this point the training set is sufficiently powerful to resolve differences in individual expert performance. The information of performance-based combinations is double that of equal weighting when the training set is at least 50% of the set of calibration variables. Previous out-of-sample validation work used a Total Out-of-Sample Validity Index based on all splits of the calibration questions into training and test subsets, which is expensive to compute and includes small training sets of dubious value. As an alternative, we propose an Out-of-Sample Validity Index based on averaging the product of statistical accuracy and information over all training sets sized at 80% of the calibration set. Performance weighting outperforms equal weighting on this Out-of-Sample Validity Index in 26 of the 33 post-2006 studies; the probability of 26 or more successes on 33 trials if there were no difference between performance
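
    The two aggregation schemes compared above, equal weighting and performance weighting, are both instances of linear pooling. A minimal sketch (the expert distributions and performance weights below are illustrative inputs; in the Classical Model the weights come from experts' calibration and information scores on seed questions):

```python
# Linear opinion pool: a weighted average of expert probability
# distributions defined over the same bins.

def linear_pool(expert_probs, weights):
    """Combine expert distributions with (unnormalized) weights."""
    total = sum(weights)
    norm = [w / total for w in weights]
    n_bins = len(expert_probs[0])
    return [sum(w * probs[i] for w, probs in zip(norm, expert_probs))
            for i in range(n_bins)]

experts = [[0.2, 0.5, 0.3],   # expert A's distribution over three bins
           [0.1, 0.7, 0.2]]   # expert B's
equal = linear_pool(experts, [1.0, 1.0])   # equal-weight combination
perf = linear_pool(experts, [0.8, 0.2])    # hypothetical performance weights
```

    With equal weights the pooled distribution is the plain average; performance weighting pulls the combination toward the better-scoring expert.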

  3. Seine estuary modelling and AirSWOT measurements validation

    Science.gov (United States)

    Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

    2013-04-01

    In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future satellite SWOT (Surface Water and Ocean Topography) mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: KaSPAR, the Ka-band SWOT Phenomenology Airborne Radar. Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream river main channel (to quantify river-aquifer exchange) and some wetlands. The objective of the present work is to validate the ability of AirSWOT and SWOT using a hydrodynamic model of the Seine estuary. In this context, field measurements will be collected by different teams, such as the GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), ADES (Data Access Groundwater), and others. These datasets will be used first to validate AirSWOT measurements locally, and then to improve hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows, etc.) for 2D validation of AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. For this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by first modelling and then combining two different regions: first the Seine River and its estuarine area, and secondly the English Channel. These two simulations are currently being

  4. Experimental validation of Swy-2 clay standard's PHREEQC model

    Science.gov (United States)

    Szabó, Zsuzsanna; Hegyfalvi, Csaba; Freiler, Ágnes; Udvardi, Beatrix; Kónya, Péter; Székely, Edit; Falus, György

    2017-04-01

    One of the challenges of the present century is to limit greenhouse gas emissions for the mitigation of climate change, which is possible, for example, through a transitional technology, CCS (Carbon Capture and Storage), and, among others, through an increase in the proportion of nuclear power in the energy mix. Clay minerals are considered to be responsible for the low permeability and sealing capacity of caprocks sealing off stored CO2, and they are also the main constituents of bentonite in high-level radioactive waste disposal facilities. The understanding of clay behaviour in these deep geological environments is possible through laboratory batch experiments on well-known standards and coupled geochemical models. Such experimentally validated models are scarce, even though they allow deriving more precise long-term predictions of mineral reactions and of rock and bentonite degradation underground, thereby ensuring the safety of the above technologies and increasing their public acceptance. This ongoing work aims to create a kinetic geochemical model of the Na-montmorillonite standard Swy-2 in the widely used PHREEQC code, supported by solution and mineral composition results from batch experiments. Several four-day experiments have been carried out at a 1:35 rock:water ratio at atmospheric conditions, and with an inert or supercritical CO2 phase at 100 bar and 80 °C, conditions relevant for the potential Hungarian CO2 reservoir complex. Solution samples were taken during and after the experiments and their compositions were measured by ICP-OES. The treated solid phase was analysed by XRD and ATR-FTIR and compared to references measured in parallel (dried Swy-2). Kinetic geochemical modelling of the experimental conditions has been performed with PHREEQC version 3 using equations and kinetic rate parameters from the USGS report of Palandri and Kharaka (2004). The visualization of experimental and numerous modelling results has been automated in R. Experiments and models show very fast
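
    The Palandri and Kharaka (2004) compilation cited above parameterizes mineral dissolution rates with an Arrhenius temperature correction and a saturation-state (affinity) term. A hedged sketch of that general form (the parameter values below are placeholders for illustration, not the ones fitted in this study):

```python
# Simplified mineral dissolution rate law: Arrhenius-corrected rate
# constant times a (1 - omega) affinity term, where omega = Q/K is the
# saturation ratio of the solution with respect to the mineral.
from math import exp

R = 8.314  # gas constant, J/mol/K

def dissolution_rate(k25, Ea, T, omega):
    """Rate in mol/m2/s; k25 is the rate constant at 298.15 K, Ea in J/mol."""
    k = k25 * exp(-Ea / R * (1.0 / T - 1.0 / 298.15))
    return k * (1.0 - omega)
```

    At 25 °C and far from equilibrium (omega = 0) the rate equals k25; at the 80 °C of the experiments the Arrhenius term accelerates it, and at equilibrium (omega = 1) the net rate vanishes.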

  5. Validating unit commitment models: A case for benchmark test systems

    OpenAIRE

    Melhorn, Alexander C.; Li, Mingsong; Carroll, Paula; Flynn, Damian

    2016-01-01

    Due to the increasing penetration of non-traditional power system resources (e.g. renewable generation, electric vehicles, demand response, etc.) and increasing computational power, there has been renewed research interest in unit commitment. It may therefore be important to take another look at how unit commitment models and algorithms are validated, especially as improvements in solution quality and algorithmic performance are desired to combat the added complexity of additional constraints. This paper expl...

  6. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  7. Experimental Validation of a Permeability Model for Enrichment Membranes

    International Nuclear Information System (INIS)

    Orellano, Pablo; Brasnarof, Daniel; Florido, Pablo

    2003-01-01

    An experimental loop with a real-scale diffuser, in a single enrichment-stage configuration, was operated with air at different process conditions in order to characterize the membrane permeability. Using these experimental data, an analytical geometric- and morphology-based model was validated. It is concluded that a new set of independent measurements, i.e. enrichment, is necessary in order to fully characterize diffusers, because their internal parameters are not uniquely determined by permeability data alone

  8. Evaluation model and experimental validation of tritium in agricultural plant

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hee Suk; Keum, Dong Kwon; Lee, Han Soo; Jun, In; Choi, Yong Ho; Lee, Chang Woo [KAERI, Daejon (Korea, Republic of)

    2005-12-15

    This paper describes a dynamic compartment model for evaluating the contamination level of tritium in agricultural plants exposed to accidentally released tritium. The present model uses a time-dependent plant growth equation so that it can predict the effect of the plant's growth stage during the exposure time. The model, comprising atmosphere, soil and plant compartments, is described by a set of nonlinear ordinary differential equations, and is able to predict time-dependent concentrations of tritium in the compartments. To validate the model, a series of exposure experiments of HTO vapor on Chinese cabbage and radish was carried out at different growth stages of each plant. At the end of exposure, the tissue free water tritium (TFWT) and the organically bound tritium (OBT) were measured. The measured concentrations agreed well with model predictions.
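
    A compartment model of this kind reduces to coupled first-order ordinary differential equations. A hedged sketch of the simplest such exchange, first-order HTO transfer between an atmosphere compartment and a plant compartment, integrated with an Euler step (the rate constants and exposure scenario below are invented for illustration, not taken from the paper):

```python
# Two-compartment first-order exchange: the plant takes up HTO from the
# air at rate k_uptake * c_air and loses it back at rate k_loss * c_plant.

def simulate_hto(c_air, k_uptake, k_loss, t_end, dt=0.01):
    """Return the plant HTO concentration after t_end hours of exposure."""
    c_plant = 0.0
    t = 0.0
    while t < t_end:
        # net transfer: uptake from air minus loss to the atmosphere
        c_plant += dt * (k_uptake * c_air - k_loss * c_plant)
        t += dt
    return c_plant

# for long exposures the plant approaches equilibrium at
# k_uptake * c_air / k_loss
level = simulate_hto(c_air=100.0, k_uptake=0.5, k_loss=0.5, t_end=50.0)
```

    In the real model the plant growth equation makes the compartment volumes, and hence the effective rate constants, time-dependent.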

  9. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aims to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we face the problem of comparing the results obtained from simulations with what the experiments actually observed. One way to solve this problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal

  10. Wave Tank Testing and Model Validation of an Autonomous Wave Energy Converter

    Directory of Open Access Journals (Sweden)

    Bret Bosma

    2015-08-01

    Full Text Available A key component in bringing ocean wave energy converters from concept to commercialization is the building and testing of scaled prototypes to provide model validation. A one quarter scale prototype of an autonomous two body heaving point absorber was modeled, built, and tested for this work. Wave tank testing results are compared with two hydrodynamic and system models—implemented in both ANSYS AQWA and MATLAB/Simulink—and show model validation over certain regions of operation. This work will serve as a guide for future developers of wave energy converter devices, providing insight in taking their design from concept to prototype stage.

  11. Validation of advanced NSSS simulator model for loss-of-coolant accidents

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.P.; Chang, S.K.; Huang, H.C. [Nuclear Training Branch, Northeast Utilities, Waterford, CT (United States)

    1995-09-01

    The replacement of the NSSS (Nuclear Steam Supply System) model on the Millstone 2 full-scope simulator has significantly increased its fidelity in simulating adverse conditions in the RCS. The new simulator NSSS model is a real-time derivative of the Nuclear Plant Analyzer by ABB. The thermal-hydraulic model is a five-equation, non-homogeneous model for water, steam, and non-condensible gases. The neutronic model is a three-dimensional nodal diffusion model. In order to certify the new NSSS model for operator training, an extensive validation effort has been performed by benchmarking the model performance against RELAP5/MOD2. This paper presents the validation results for the cases of small- and large-break loss-of-coolant accidents (LOCA). Detailed comparisons in the phenomena of reflux condensation, phase separation, and two-phase natural circulation are discussed.

  12. Validation of a Business Model for Cultural Heritage Institutions

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2015-01-01

    Full Text Available The paper proposes a business model for optimizing the efficiency of the interaction between all actors involved in the cultural heritage sector, such as galleries, libraries, archives and museums (GLAM). The validation of the business model is the subject of analyses and implementations in a real environment by different cultural institutions. The implementation of virtual exhibitions on mobile devices is described and analyzed as a key factor for increasing cultural heritage visibility. New perspectives on the development of virtual exhibitions for mobile devices are considered. A study on the number of visitors of cultural institutions is carried out, and ways to increase the number of visitors are described.

  13. Experimental Validation of a Mathematical Model for Seabed Liquefaction Under Waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen

    2012-01-01

    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and the resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d(50) = 0.070 mm) in a wave flume with a soil pit. Waves with wave heights in the range of 7.7-18 cm, a 55-cm water depth and a 1.6-s wave period enabled us to study pore water pressure buildup in both the liquefaction and no-liquefaction regimes. The experimental data were used to validate the model. A numerical example...

  14. Cross-validation of an employee safety climate model in Malaysia.

    Science.gov (United States)

    Bahari, Siti Fatimah; Clarke, Sharon

    2013-06-01

    Whilst substantial research has investigated the nature of safety climate, and its importance as a leading indicator of organisational safety, much of this research has been conducted with Western industrial samples. The current study focuses on the cross-validation of a safety climate model in the non-Western industrial context of Malaysian manufacturing. The first-order factorial validity of Cheyne et al.'s (1998) [Cheyne, A., Cox, S., Oliver, A., Tomas, J.M., 1998. Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255-271] model was tested, using confirmatory factor analysis, in a Malaysian sample. Results showed that the model fit indices were below accepted levels, indicating that the original Cheyne et al. (1998) safety climate model was not supported. An alternative three-factor model was developed using exploratory factor analysis. Although these findings are not consistent with previously reported cross-validation studies, we argue that previous studies have focused on validation across Western samples, and that the current study demonstrates the need to take account of cultural factors in the development of safety climate models intended for use in non-Western contexts. The results have important implications for the transferability of existing safety climate models across cultures (for example, in global organisations) and highlight the need for future research to examine cross-cultural issues in relation to safety climate. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  15. SHERMAN, a shape-based thermophysical model. I. Model description and validation

    Science.gov (United States)

    Magri, Christopher; Howell, Ellen S.; Vervack, Ronald J.; Nolan, Michael C.; Fernández, Yanga R.; Marshall, Sean E.; Crowell, Jenna L.

    2018-03-01

    SHERMAN, a new thermophysical modeling package designed for analyzing near-infrared spectra of asteroids and other solid bodies, is presented. The model's features, the methods it uses to solve for surface and subsurface temperatures, and the synthetic data it outputs are described. A set of validation tests demonstrates that SHERMAN produces accurate output in a variety of special cases for which correct results can be derived from theory. These cases include a family of solutions to the heat equation for which thermal inertia can have any value and thermophysical properties can vary with depth and with temperature. An appendix describes a new approximation method for estimating surface temperatures within spherical-section craters, more suitable for modeling infrared beaming at short wavelengths than the standard method.
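
    The subsurface temperatures such thermophysical models solve for obey the one-dimensional heat equation. An illustrative sketch (not SHERMAN's actual solver) using explicit finite differences for dT/dt = kappa * d2T/dz2, with a fixed surface temperature and an insulated lower boundary; all numbers are arbitrary demonstration values:

```python
# Explicit finite-difference heat conduction on a small vertical grid.
# Node 0 is the surface (held fixed); the last node is zero-flux.

def diffuse(temps, kappa, dz, dt, steps):
    """Advance a temperature profile (K) by `steps` time steps."""
    t = list(temps)
    r = kappa * dt / dz ** 2        # must be <= 0.5 for stability
    assert r <= 0.5, "explicit scheme unstable"
    for _ in range(steps):
        new = t[:]
        for i in range(1, len(t) - 1):
            new[i] = t[i] + r * (t[i + 1] - 2.0 * t[i] + t[i - 1])
        new[-1] = new[-2]           # zero-flux bottom boundary
        t = new
    return t

# a hot surface node gradually warming an initially cold column
profile = diffuse([300.0] + [100.0] * 9, kappa=1e-6, dz=0.01, dt=10.0, steps=500)
```

    Thermal inertia enters such models through kappa and the surface boundary condition; SHERMAN additionally allows the thermophysical properties to vary with depth and temperature, which this constant-kappa sketch does not.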

  16. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony give rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's levels of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments, where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately-sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and to circadian phase. We will calculate the sensitivity and specificity of each model's predictions under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.

  17. Validation of a mathematical model of the bovine estrous cycle for cows with different estrous cycle characteristics

    NARCIS (Netherlands)

    Boer, H.M.T.; Butler, S.T.; Stötzel, C.; Pas, te M.F.W.; Veerkamp, R.F.; Woelders, H.

    2017-01-01

    A recently developed mechanistic mathematical model of the bovine estrous cycle was parameterized to fit empirical data sets collected during one estrous cycle of 31 individual cows, with the main objective to further validate the model. The a priori criteria for validation were (1) the resulting

  18. Calibration and validation of a model describing complete autotrophic nitrogen removal in a granular SBR system

    DEFF Research Database (Denmark)

    Vangsgaard, Anna Katrine; Mutlu, Ayten Gizem; Gernaey, Krist

    2013-01-01

    BACKGROUND: A validated model describing the nitritation-anammox process in a granular sequencing batch reactor (SBR) system is an important tool for: a) design of future experiments and b) prediction of process performance during optimization, while applying process control, or during system scale-up. RESULTS: A model was calibrated using a step-wise procedure customized for the specific needs of the system. The important steps in the procedure were initialization, steady-state and dynamic calibration, and validation. A fast and effective initialization approach was developed to approximate pseudo... screening of the parameter space proposed by Sin et al. (2008) - to find the best fit of the model to dynamic data. Finally, the calibrated model was validated with an independent data set. CONCLUSION: The presented calibration procedure is the first customized procedure for this type of system...

  19. Structural refinement and prediction of potential CCR2 antagonists through validated multi-QSAR modeling studies.

    Science.gov (United States)

    Amin, Sk Abdul; Adhikari, Nilanjan; Baidya, Sandip Kumar; Gayen, Shovanlal; Jha, Tarun

    2018-01-03

    Chemokines trigger numerous inflammatory responses and modulate the immune system. The interaction between monocyte chemoattractant protein-1 and chemokine receptor 2 (CCR2) may be the cause of atherosclerosis, obesity, and insulin resistance. However, CCR2 is also implicated in other inflammatory diseases such as rheumatoid arthritis, multiple sclerosis, asthma, and neuropathic pain. Therefore, it is of paramount importance to design potent and selective CCR2 antagonists, even though a number of drug candidates have failed in clinical trials. In this article, 83 CCR2 antagonists from Johnson and Johnson Pharmaceuticals have been considered for robust, validated multi-QSAR modeling studies to get an idea about the structural and pharmacophoric requirements for designing more potent CCR2 antagonists. All these QSAR models were validated and statistically reliable. Observations resulting from the different modeling studies correlated with and validated the results of one another. Finally, based on these QSAR observations, some new molecules were proposed that may exhibit higher activity against CCR2.

  20. Condensation of steam in horizontal pipes: model development and validation

    International Nuclear Information System (INIS)

    Szijarto, R.

    2015-01-01

    This thesis, submitted to the Swiss Federal Institute of Technology ETH in Zurich, presents the development and validation of a model for the condensation of steam in horizontal pipes. Condensation models were introduced and developed particularly for application in the emergency cooling system of a Gen-III+ boiling water reactor. Such an emergency cooling system consists of slightly inclined horizontal pipes, which are immersed in a cold water tank. The pipes are connected to the reactor pressure vessel. They are responsible for a fast depressurization of the reactor core in the case of an accident. Condensation in horizontal pipes was investigated with both a one-dimensional system code (RELAP5) and three-dimensional computational fluid dynamics software (ANSYS FLUENT). The performance of the RELAP5 code was not sufficient for transient condensation processes. Therefore, a mechanistic model was developed and implemented. Four models were tested on the LAOKOON facility, which analysed direct contact condensation in a horizontal duct

  1. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    Science.gov (United States)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
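
    The core idea of replacing visual checks with statistics can be sketched as follows: quantify the model-vs-measurement mismatch with RMSE and attach a normal-approximation 95% confidence interval to the mean error. This is a generic sketch of that kind of analysis, not the paper's specific procedure, and the statistic choices are our assumptions:

```python
# RMSE and a large-sample 95% confidence interval on the mean error
# between measured and simulated responses.
from math import sqrt

def error_stats(measured, simulated):
    """Return (rmse, (ci_low, ci_high)) for paired measured/simulated data."""
    errs = [m - s for m, s in zip(measured, simulated)]
    n = len(errs)
    mean = sum(errs) / n
    var = sum((e - mean) ** 2 for e in errs) / (n - 1)   # sample variance
    rmse = sqrt(sum(e * e for e in errs) / n)
    half = 1.96 * sqrt(var / n)     # 95% CI half-width (normal approx.)
    return rmse, (mean - half, mean + half)
```

    A confidence interval that excludes zero indicates systematic bias in the load model, which is the kind of quantitative evidence the framework uses for calibration.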

  2. Validation of thermal models for a prototypical MEMS thermal actuator.

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

    2008-09-01

    This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the

  3. Validation of the DeLone and McLean Information Systems Success Model

    OpenAIRE

    Ojo, Adebowale I.

    2017-01-01

    Objectives This study is an adaptation of the widely used DeLone and McLean information system success model in the context of hospital information systems in a developing country. Methods A survey research design was adopted in the study. A structured questionnaire was used to collect data from 442 health information management personnel in five Nigerian teaching hospitals. A structural equation modeling technique was used to validate the model's constructs. Results It was revealed that syst...

  4. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    Science.gov (United States)

    2014-01-01

    Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in data that were not used to develop the model (referred to as external validation). We critically appraised the methodological conduct and reporting of external validation studies of multivariable prediction models. Methods We conducted a systematic review of articles describing some form of external validation of one or more multivariable prediction models indexed in PubMed core clinical journals published in 2010. Study data were extracted in duplicate on design, sample size, handling of missing data, reference to the original study developing the prediction models and predictive performance measures. Results 11,826 articles were identified and 78 were included for full review, which described the evaluation of 120 prediction models in participant data that were not used to develop the model. Thirty-three articles described both the development of a prediction model and an evaluation of its performance on a separate dataset, and 45 articles described only the evaluation of an existing published prediction model on another dataset. Fifty-seven percent of the prediction models were presented and evaluated as simplified scoring systems. Sixteen percent of articles failed to report the number of outcome events in the validation datasets. Fifty-four percent of studies made no explicit mention of missing data. Sixty-seven percent did not report evaluating model calibration, whilst most studies evaluated model discrimination. It was often unclear whether the reported performance measures were for the full regression model or for the simplified models. Conclusions The vast majority of studies describing some form of external validation of a multivariable prediction model were poorly reported, with key details frequently not presented. The validation studies were characterised by poor design, inappropriate handling
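
    Discrimination, the performance measure the review found most studies did report, is usually summarised by the c-statistic (area under the ROC curve): the probability that a randomly chosen event case receives a higher predicted risk than a randomly chosen non-event case. A pure-Python sketch with invented data:

```python
# Rank-based c-statistic (AUC) for binary outcomes and predicted risks.

def auc(outcomes, risks):
    """Concordance statistic: P(risk of random case > risk of random control)."""
    pos = [r for y, r in zip(outcomes, risks) if y == 1]
    neg = [r for y, r in zip(outcomes, risks) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    Calibration, which two-thirds of the reviewed studies did not evaluate, asks a different question (whether predicted risks match observed event rates) and needs its own assessment, e.g. a calibration plot or intercept/slope.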

  5. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    Science.gov (United States)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2015-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and are also compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations to be completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and will in the future support NASA mission planning for Stirling-based power systems.

  6. First experiments results about the engineering model of Rapsodie

    International Nuclear Information System (INIS)

    Chalot, A.; Ginier, R.; Sauvage, M.

    1964-01-01

    This report deals with the first series of experiments carried out on the engineering model of Rapsodie and on an associated sodium facility set up in a laboratory hall at Cadarache. More precisely, it covers: 1/ The difficulties encountered during the erection and assembly of the engineering model, and a compilation of the results of the first series of experiments and tests carried out on this installation (loading of the subassemblies, preheating, thermal shocks, ...). 2/ The experiments and tests carried out on the two prototype control rod drive mechanisms, which led to the choice of the design for the definitive drive mechanism. As a whole, the results proved the validity of the general design principles adopted for Rapsodie. (authors) [fr

  7. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens

    2016-01-01

This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for usage with systems composed of reconfigurable underwater modules. The purpose of the model is to enable design of control strategies for cooperative reconfigurable underwater systems.

  8. The Danish national passenger modelModel specification and results

    DEFF Research Database (Denmark)

    Rich, Jeppe; Hansen, Christian Overgaard

    2016-01-01

Firstly, the paper provides a description of a large-scale forecast model with a discussion of the linkage between population synthesis, demand and assignment. Secondly, the paper gives specific attention to model specification and in particular the choice of functional form and cost-damping. Specifically, we suggest a family of logarithmic spline functions and illustrate how it is applied in the model. Thirdly and finally, we evaluate model sensitivity and performance by evaluating the distance distribution and elasticities. In the paper we present results where the spline function is compared with more traditional function types, and it is indicated that the spline function provides a better description of the data. Results are also provided in the form of a back-casting exercise, where the model is tested by back-casting to 2002.
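A logarithmic spline of the kind mentioned can be sketched as a piecewise-linear function of log(cost), where decreasing segment slopes produce the cost-damping effect. The knot positions and slopes below are invented for illustration; the paper's actual specification is not given in the abstract:

```python
import math

def log_spline_disutility(cost: float, knots: list, slopes: list) -> float:
    """Piecewise-linear function of log(cost).

    Segments on the log-cost axis are [0, knots[0]], [knots[0], knots[1]],
    ..., [knots[-1], inf); len(slopes) == len(knots) + 1. Decreasing slopes
    damp the marginal disutility of cost for long trips.
    """
    x = math.log(cost)
    total, lo = 0.0, 0.0
    for hi, s in zip(knots + [float("inf")], slopes):
        if x <= lo:
            break
        total += s * (min(x, hi) - lo)  # slope times overlap with this segment
        lo = hi
    return total

# Illustrative spline: the slope halves at each knot, so each extra unit of
# log-cost beyond a knot contributes less disutility.
u = log_spline_disutility(math.e ** 3, knots=[1.0, 2.0], slopes=[1.0, 0.5, 0.25])
```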

  9. Recent Progress Validating the HADES Model of LLNL's HEAF MicroCT Measurements

    Energy Technology Data Exchange (ETDEWEB)

    White, W. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bond, K. C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lennox, K. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Aufderheide, M. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Seetho, I. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Roberson, G. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-07-17

This report compares recent HADES calculations of x-ray linear attenuation coefficients to previous MicroCT measurements made at Lawrence Livermore National Laboratory’s High Explosives Applications Facility (HEAF). The chief objective is to investigate what impact recent changes in HADES modeling have on validation results. We find that these changes have no obvious effect on the overall accuracy of the model. Detailed comparisons between recent and previous results are presented.

  10. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    KAUST Repository

    Miki, Kenji

    2015-10-01

The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data are involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted for scenarios that are not necessarily experimentally accessible. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and the two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed at the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the QoI, the radiative heat flux, and to examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data, while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.
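The Bayesian updating step can be sketched with a toy one-parameter grid posterior under a Gaussian likelihood. This is a deliberately minimal stand-in: the paper estimates up to 16 parameters against EAST radiance data, and the model, observations and noise level below are invented:

```python
import math

def gaussian_loglik(datum: float, pred: float, sigma: float) -> float:
    """Log-density of one observation under N(pred, sigma^2)."""
    return -0.5 * ((datum - pred) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))

def grid_posterior(data, model, thetas, sigma):
    """Bayesian updating on a parameter grid with a flat prior:
    posterior(theta) proportional to prod_i N(d_i | model(theta), sigma)."""
    logp = [sum(gaussian_loglik(d, model(th), sigma) for d in data) for th in thetas]
    m = max(logp)                        # subtract max for numerical stability
    w = [math.exp(l - m) for l in logp]
    z = sum(w)
    return [wi / z for wi in w]

# Toy inverse problem: the model output equals the parameter itself, and the
# observations scatter around 2.0, so the posterior should peak at theta = 2.0
thetas = [1.0, 2.0, 3.0]
post = grid_posterior([1.9, 2.1, 2.0], lambda th: th, thetas, sigma=0.5)
```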

  11. SWAT Model Configuration, Calibration and Validation for Lake Champlain Basin

    Science.gov (United States)

    The Soil and Water Assessment Tool (SWAT) model was used to develop phosphorus loading estimates for sources in the Lake Champlain Basin. This document describes the model setup and parameterization, and presents calibration results.

  12. Experimental Validation of a Dynamic Model for Lightweight Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Gasparetto

    2013-03-01

Nowadays, one of the main topics in robotics research is dynamic performance improvement by means of a lightening of the overall system structure. Effective motion and control of these lightweight robotic systems requires suitable motion planning and control processes. To this end, model-based approaches can be adopted by exploiting accurate dynamic models that take into account the inertial and elastic terms that are usually neglected in a heavy rigid-link configuration. In this paper, an effective method for modelling spatial lightweight industrial robots based on an Equivalent Rigid Link System approach is considered from an experimental validation perspective. A dynamic simulator implementing the formulation is used and an experimental test bench is set up. Experimental tests are carried out with a benchmark L-shape mechanism.

  13. Method validation in plasma source optical emission spectroscopy (ICP-OES) - From samples to results

    International Nuclear Information System (INIS)

    Pilon, Fabien; Vielle, Karine; Birolleau, Jean-Claude; Vigneau, Olivier; Labet, Alexandre; Arnal, Nadege; Adam, Christelle; Camilleri, Virginie; Amiel, Jeanine; Granier, Guy; Faure, Joel; Arnaud, Regine; Beres, Andre; Blanchard, Jean-Marc; Boyer-Deslys, Valerie; Broudic, Veronique; Marques, Caroline; Augeray, Celine; Bellefleur, Alexandre; Bienvenu, Philippe; Delteil, Nicole; Boulet, Beatrice; Bourgarit, David; Brennetot, Rene; Fichet, Pascal; Celier, Magali; Chevillotte, Rene; Klelifa, Aline; Fuchs, Gilbert; Le Coq, Gilles; Mermet, Jean-Michel

    2017-01-01

Even though ICP-OES (Inductively Coupled Plasma - Optical Emission Spectroscopy) is now a routine analysis technique, requirements for measuring processes impose complete control and mastery of the operating process and of the associated quality management system. The aim of this collective book is to guide the analyst through the entire measurement validation procedure and to help him guarantee the mastery of its different steps: administrative and physical management of samples in the laboratory, preparation and treatment of the samples before measuring, qualification and monitoring of the apparatus, instrument setting and calibration strategy, and exploitation of results in terms of accuracy, reliability and data covariance (with the practical determination of the accuracy profile). The most recent terminology is used in the book, and numerous examples and illustrations are given in order to aid understanding and to help with the elaboration of method validation documents.

  14. Assessment of the validity of inelastic design analysis methods by comparisons of predictions with test results

    International Nuclear Information System (INIS)

    Corum, J.M.; Clinard, J.A.; Sartory, W.K.

    1976-01-01

    A description is given of some of the work under way in the United States to provide the necessary information to evaluate inelastic analysis methods and computer programs used in fast reactor component design, and typical comparisons of analysis predictions with inelastic structural test results are presented. It is emphasized throughout that rather than asking how valid, or correct, are the analytical predictions, one might more properly question whether or not the combination of the predictions and the associated high-temperature design criteria leads to an acceptable level of structural integrity. It is believed that in this context the analysis predictions are generally valid, even though exact correlations between predictions and actual behavior are not obtained and cannot be expected. Final judgment, however, must be reserved for the design analyst in each specific case

  15. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Science.gov (United States)

    Krishnamoorthi, Shankarjee; Perotti, Luigi E; Borgstrom, Nils P; Ajijola, Olujimi A; Frid, Anna; Ponnaluri, Aditya V; Weiss, James N; Qu, Zhilin; Klug, William S; Ennis, Daniel B; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  16. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Directory of Open Access Journals (Sweden)

    Shankarjee Krishnamoorthi

We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  17. Lessons learned from recent geomagnetic disturbance model validation activities

    Science.gov (United States)

    Pulkkinen, A. A.; Welling, D. T.

    2017-12-01

Due to concerns pertaining to the impact of geomagnetically induced currents on ground-based infrastructure, there has been significantly elevated interest in applying models for local geomagnetic disturbance or "delta-B" predictions. Correspondingly, there has been an elevated need for testing the quality of the delta-B predictions generated by modern empirical and physics-based models. To address this need, community-wide activities were launched under the GEM Challenge framework, and one culmination of the activities was the validation and selection of models that were transitioned into operations at NOAA SWPC. The community-wide delta-B effort continues under the CCMC-facilitated International Forum for Space Weather Capabilities Assessment and its "Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs" working group. The new delta-B working group builds on past experience and expands the collaborations to cover the entire international space weather community. In this paper, we discuss the key lessons learned from the past delta-B validation exercises and lay out the path forward for building on that experience under the new delta-B working group.

  18. Validation of mathematical models to describe fluid dynamics of a cold riser by gamma ray attenuation

    International Nuclear Information System (INIS)

    Melo, Ana Cristina Bezerra Azedo de

    2004-12-01

The fluid dynamic behavior of a riser in a cold-type FCC model was investigated by means of the catalyst concentration distribution, measured with gamma attenuation and simulated with a mathematical model. In the riser of the cold model (MEF), 0.032 m in diameter and 2.30 m in length, a fluidized bed whose components are air and FCC catalyst circulates. The MEF is operated by automatic control, with instruments for measuring fluid dynamic variables. The axial catalyst concentration distribution was measured using an Am-241 gamma source and a NaI detector coupled to a multichannel analyzer provided with software for data acquisition and evaluation. The MEF was adapted for the validation of a fluid dynamic model describing the flow in the riser, for example by introducing an injector for controlling the solid flow in circulation. Mathematical models were selected from the literature, analyzed and tested to simulate the fluid dynamics of the riser. A methodology for validating fluid dynamic models was studied and implemented. The stages of the work were developed according to the validation methodology: planning of experiments, study of the equations which describe the fluid dynamics, application of computational solvers, and comparison with experimental data. Operational sequences were carried out keeping the MEF conditions for measuring catalyst concentration while simultaneously measuring the fluid dynamic variables, the velocity of the components and the pressure drop in the riser. Following this, simulated and experimental values were compared and statistical treatment of the data was performed, aiming at the precision required to validate the fluid dynamic model. The comparison tests between experimental and simulated data were carried out under validation criteria. The fluid dynamic behavior of the riser was analyzed, and the results and their agreement with the literature were discussed.
The adopted model was validated under the MEF operational conditions, for a gas velocity in the riser of 3 to 6 m/s and a slip
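Extracting a solids concentration from gamma attenuation rests on the Beer–Lambert law; a minimal inversion looks like the sketch below. The attenuation coefficient, path length and count rates are hypothetical, not the MEF calibration:

```python
import math

def solids_density_from_attenuation(counts: float, counts_empty: float,
                                    mass_atten_cm2_g: float, path_cm: float) -> float:
    """Invert the Beer-Lambert law, I = I0 * exp(-mu_m * rho * L), for the
    average solids density rho (g/cm^3) along the gamma beam path."""
    return math.log(counts_empty / counts) / (mass_atten_cm2_g * path_cm)

# Hypothetical numbers: 20% attenuation over a 3.2 cm path, mu_m = 0.25 cm^2/g
rho = solids_density_from_attenuation(800.0, 1000.0, 0.25, 3.2)
```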

  19. Experimental validation of a thermodynamic boiler model under steady state and dynamic conditions

    International Nuclear Information System (INIS)

    Carlon, Elisa; Verma, Vijay Kumar; Schwarz, Markus; Golicza, Laszlo; Prada, Alessandro; Baratieri, Marco; Haslinger, Walter; Schmidl, Christoph

    2015-01-01

Highlights: • Laboratory tests on two commercially available pellet boilers. • Steady state and dynamic load cycle tests. • Pellet boiler model calibration based on data registered in stationary operation. • Boiler model validation with reference to both stationary and dynamic operation. • Validated model suitable for coupled simulation of building and heating system. - Abstract: Nowadays, dynamic building simulation is an essential tool for the design of heating systems for residential buildings. The simulation of buildings heated by biomass systems first of all requires detailed boiler models, capable of simulating the boiler both as a stand-alone appliance and as a system component. This paper presents the calibration and validation of a boiler model by means of laboratory tests. The chosen model, i.e. TRNSYS “Type 869”, has been validated for two commercially available pellet boilers of 6 and 12 kW nominal capacity. Two test methods have been applied: the first is a steady state test at nominal load, and the second is a load cycle test including stationary operation at different loads as well as transient operation. The load cycle test is representative of boiler operation in the field and characterises the boiler’s stationary and dynamic behaviour. The model was calibrated based on laboratory data registered during stationary operation at different loads, and afterwards it was validated by simulating both the stationary and the dynamic tests. The parameters selected for the validation were the heat transfer rates to water and the water temperature profiles inside the boiler and at the boiler outlet. Modelling results showed better agreement with experimental data during stationary operation than during dynamic operation. Heat transfer rates to water were predicted with a maximum deviation of 10% during stationary operation, and a maximum deviation of 30% during the dynamic load cycle. However, for both operational regimes the

  20. Discrete fracture modelling for the Stripa tracer validation experiment predictions

    International Nuclear Information System (INIS)

    Dershowitz, W.; Wallmann, P.

    1992-02-01

Groundwater flow and transport through three-dimensional networks of discrete fractures was modeled to predict the recovery of tracer from tracer injection experiments conducted during phase 3 of the Stripa site characterization and validation project. Predictions were made on the basis of an updated version of the site-scale discrete fracture conceptual model used for flow predictions and preliminary transport modelling. In this model, individual fractures were treated as stochastic features described by probability distributions of geometric and hydrologic properties. Fractures were divided into three populations: fractures in fracture zones near the drift, non-fracture-zone fractures within 31 m of the drift, and fractures in fracture zones over 31 m from the drift axis. Fractures outside fracture zones are not modelled beyond 31 m from the drift axis. Transport predictions were produced using the FracMan discrete fracture modelling package for each of five tracer experiments. Output was produced in the seven formats specified by the Stripa task force on fracture flow modelling. (au)
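Treating fractures as stochastic features amounts to sampling each fracture's geometric and hydrologic properties from population distributions; a minimal sketch follows. The lognormal parameters are invented for illustration, not the Stripa site values:

```python
import random

def sample_fracture(rng: random.Random, pop: dict) -> dict:
    """Draw one stochastic fracture from a population described by
    probability distributions of geometric and hydrologic properties."""
    return {
        "radius_m": rng.lognormvariate(pop["log_radius_mu"], pop["log_radius_sigma"]),
        "transmissivity_m2_s": rng.lognormvariate(pop["log_T_mu"], pop["log_T_sigma"]),
        "orientation_deg": rng.uniform(0.0, 360.0),
    }

# One population, e.g. "fracture zones near the drift" (invented parameters)
rng = random.Random(42)
near_drift = {"log_radius_mu": 0.0, "log_radius_sigma": 0.5,
              "log_T_mu": -18.0, "log_T_sigma": 1.0}
network = [sample_fracture(rng, near_drift) for _ in range(100)]
```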

  1. ABA versus TEACCH: the case for defining and validating comprehensive treatment models in autism.

    Science.gov (United States)

    Callahan, Kevin; Shukla-Mehta, Smita; Magee, Sandy; Wie, Min

    2010-01-01

The authors analyzed the results of a social validation survey to determine whether autism service providers, including special education teachers, parents, and administrators, demonstrate a preference for the intervention components of Applied Behavior Analysis (ABA) or Training and Education of Autistic and other Communication Handicapped Children (TEACCH). They also investigated the comprehensiveness of these treatment models for use in public school programs. The findings indicate no clear preference for either model, but a significantly higher level of social validity for components inherent in both approaches. The authors discuss the need for research to define what is meant by comprehensive programming in autism.

  2. Experimental validation of a fuel additive assisted regeneration model in silicon carbide diesel filters

    Energy Technology Data Exchange (ETDEWEB)

    Stratakis, G.A.; Pontikakis, G.N.; Stamatelos, A.M. [University of Thessaly, Volos (Greece). Mechanical and Industrial Engineering Dept.

    2004-07-01

    In this paper, an experimental validation procedure is applied to an improved one-dimensional model of fuel additive assisted regeneration of a diesel particulate filter. Full-scale tests on an engine bench of the regeneration behaviour of a diesel filter fitted to a modern diesel engine run on catalyst-doped fuel are employed for this purpose. The main objectives of the validation procedure concern the ability of the model to predict the effects of exhaust mass flowrate, initial soot loading mass, volatile organic fraction of the soot and additive concentration in the fuel. The results of the validation procedure are intended to demonstrate the scope and extent of applicability of models of this type to real-world design and optimization studies with diesel filters. (author)

  3. A ferrofluid based energy harvester: Computational modeling, analysis, and experimental validation

    Science.gov (United States)

    Liu, Qi; Alazemi, Saad F.; Daqaq, Mohammed F.; Li, Gang

    2018-03-01

    A computational model is described and implemented in this work to analyze the performance of a ferrofluid based electromagnetic energy harvester. The energy harvester converts ambient vibratory energy into an electromotive force through a sloshing motion of a ferrofluid. The computational model solves the coupled Maxwell's equations and Navier-Stokes equations for the dynamic behavior of the magnetic field and fluid motion. The model is validated against experimental results for eight different configurations of the system. The validated model is then employed to study the underlying mechanisms that determine the electromotive force of the energy harvester. Furthermore, computational analysis is performed to test the effect of several modeling aspects, such as three-dimensional effect, surface tension, and type of the ferrofluid-magnetic field coupling on the accuracy of the model prediction.

  4. Ion thruster modeling: Particle simulations and experimental validations

    International Nuclear Information System (INIS)

    Wang, Joseph; Polk, James; Brinza, David

    2003-01-01

    This paper presents results from ion thruster modeling studies performed in support of NASA's Deep Space 1 mission and NSTAR project. Fully 3-dimensional computer particle simulation models are presented for ion optics plasma flow and ion thruster plume. Ion optics simulation results are compared with measurements obtained from ground tests of the NSTAR ion thruster. Plume simulation results are compared with in-flight measurements from the Deep Space 1 spacecraft. Both models show excellent agreement with experimental data

  5. Hydroclimatology of the Nile: results from a regional climate model

    Directory of Open Access Journals (Sweden)

    Y. A. Mohamed

    2005-01-01

This paper presents the results of the regional coupled climatic and hydrologic model of the Nile Basin. For the first time, the interaction between the climatic processes and the hydrological processes on the land surface has been fully coupled. The hydrological model is driven by the rainfall and the energy available for evaporation generated in the climate model, and the runoff generated in the catchment is in turn routed over the wetlands of the Nile to supply moisture for atmospheric feedback. The results obtained are quite satisfactory given the extremely low runoff coefficients in the catchment. The paper presents the validation results over the sub-basins: Blue Nile, White Nile, Atbara river, the Sudd swamps, and the Main Nile for the period 1995 to 2000. Observational datasets were used to evaluate the model results, including radiation, precipitation, runoff and evaporation data. The evaporation data were derived from satellite images over a major part of the Upper Nile. Limitations in both the observational data and the model are discussed. It is concluded that the model provides a sound representation of the regional water cycle over the Nile. The sources of atmospheric moisture to the basin, and the location of convergence/divergence fields, could be accurately illustrated. The model is used to describe the regional water cycle in the Nile basin in terms of atmospheric fluxes, land surface fluxes and land surface-climate feedbacks. The monthly moisture recycling ratio (i.e. locally generated/total precipitation) over the Nile varies between 8 and 14%, with an annual mean of 11%, which implies that 89% of the Nile water resources originate from outside the basin's physical boundaries. The monthly precipitation efficiency varies between 12 and 53%, with an annual mean of 28%. The mean annual result of the Nile regional water cycle is compared to that of the Amazon and the Mississippi basins.
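The recycling-ratio arithmetic quoted in the abstract is simple enough to verify directly:

```python
def recycling_ratio(local_precip: float, total_precip: float) -> float:
    """Moisture recycling ratio: locally generated / total precipitation."""
    return local_precip / total_precip

# Annual mean for the Nile reported above: a ratio of 11% implies 89% of
# the basin's precipitation originates outside its physical boundaries.
external_fraction = 1.0 - recycling_ratio(11.0, 100.0)
```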

  6. Scale Model Thruster Acoustic Measurement Results

    Science.gov (United States)

    Vargas, Magda; Kenny, R. Jeremy

    2013-01-01

The Space Launch System (SLS) Scale Model Acoustic Test (SMAT) is a 5% scale representation of the SLS vehicle, mobile launcher, tower, and launch pad trench. The SLS launch propulsion system will be comprised of Rocket Assisted Take-Off (RATO) motors representing the solid boosters and 4 gaseous hydrogen (GH2) thrusters representing the core engines. The GH2 thrusters were tested in a horizontal configuration in order to characterize their performance. In Phase 1, a single thruster was fired to determine the engine performance parameters necessary for scaling a single engine. A cluster configuration, consisting of the 4 thrusters, was tested in Phase 2 to integrate the system and determine their combined performance. Acoustic and overpressure data were collected during both test phases in order to characterize the system's acoustic performance. The results from the single thruster and the 4-thruster system are discussed and compared.

  7. CMS standard model Higgs boson results

    Directory of Open Access Journals (Sweden)

    Garcia-Abia Pablo

    2013-11-01

In July 2012, CMS announced the discovery of a new boson with properties resembling those of the long-sought Higgs boson. The analysis of the proton-proton collision data recorded by the CMS detector at the LHC, corresponding to integrated luminosities of 5.1 fb−1 at √s = 7 TeV and 19.6 fb−1 at √s = 8 TeV, confirms the Higgs-like nature of the new boson, with a signal strength associated with vector bosons and fermions consistent with the expectations for a standard model (SM) Higgs boson, and spin-parity clearly favouring the scalar nature of the new boson. In this note I review the updated results of the CMS experiment.

  8. Validation results of satellite mock-up capturing experiment using nets

    Science.gov (United States)

    Medina, Alberto; Cercós, Lorenzo; Stefanescu, Raluca M.; Benvenuto, Riccardo; Pesce, Vincenzo; Marcon, Marco; Lavagna, Michèle; González, Iván; Rodríguez López, Nuria; Wormnes, Kjetil

    2017-05-01

The PATENDER activity (Net parametric characterization and parabolic flight), funded by the European Space Agency (ESA) via its Clean Space initiative, aimed to validate a simulation tool for designing nets for capturing space debris. This validation has been performed through a set of different experiments under microgravity conditions in which a net was launched, capturing and wrapping a satellite mock-up. This paper presents the architecture of the thrown-net dynamics simulator, together with the set-up of the deployment experiment and its trajectory reconstruction results on a parabolic flight (Novespace A-310, June 2015). The simulator has been implemented within the Blender framework in order to provide a highly configurable tool able to reproduce different scenarios for Active Debris Removal missions. The experiment was performed over thirty parabolas, each offering around 22 s of zero-g conditions. A flexible meshed fabric structure (the net), ejected from a container and propelled by corner masses (the bullets) arranged around its circumference, was launched at different initial velocities and launch angles using a dedicated pneumatic mechanism (representing the chaser satellite) against a target mock-up (the target satellite). High-speed motion cameras recorded the experiment, allowing 3D reconstruction of the net motion. The net knots were coloured to allow post-processing of the images using colour segmentation, stereo matching and iterative closest point (ICP) for knot tracking. The final objective of the activity was the validation of the net deployment and wrapping simulator using the images recorded during the parabolic flight. The high-resolution images acquired have been post-processed to determine the initial conditions accurately and to generate the reference data (position and velocity of all knots of the net along its deployment and wrapping of the target mock-up) for the simulator validation. The simulator has been properly
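The ICP step used for knot tracking can be illustrated with a translation-only toy version (real ICP also estimates rotation, and PATENDER applies it after colour segmentation and stereo matching; the knot coordinates below are invented):

```python
def icp_translation(src, dst, iters=10):
    """Translation-only ICP sketch: repeatedly match each source knot to its
    nearest target knot, then shift the source set by the mean residual."""
    tx, ty = 0.0, 0.0
    for _ in range(iters):
        moved = [(x + tx, y + ty) for x, y in src]
        dx = dy = 0.0
        for mx, my in moved:
            # nearest-neighbour correspondence by squared distance
            nx, ny = min(dst, key=lambda p: (p[0] - mx) ** 2 + (p[1] - my) ** 2)
            dx += nx - mx
            dy += ny - my
        tx += dx / len(src)
        ty += dy / len(src)
    return tx, ty

# Four well-separated "knots" observed again after a small shift
knots = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0), (5.0, 5.0)]
shifted = [(x + 0.1, y + 0.2) for x, y in knots]
offset = icp_translation(knots, shifted)  # recovers the (0.1, 0.2) shift
```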

  9. Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression

    Science.gov (United States)

    2011-01-01

Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we trace the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example, anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity).
Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of the treatments

  10. Validation of symptom validity tests using a "child-model" of adult cognitive impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P.E.J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children’s cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  12. Validation of a Parametric Approach for 3d Fortification Modelling: Application to Scale Models

    Science.gov (United States)

    Jacquot, K.; Chevrier, C.; Halin, G.

    2013-02-01

    The parametric modelling approach applied to virtual representation of cultural heritage is a field of research that has been explored for years, since it can address many limitations of digitising tools. For example, essential historical sources for virtual reconstructions of fortifications, such as plans-reliefs, have several shortcomings when they are scanned. To overcome these problems, knowledge-based modelling can be used: knowledge models based on the analysis of the theoretical literature of a specific domain, such as bastioned fortification treatises, can be the cornerstone of a parametric library of fortification components. Implemented in Grasshopper, these components are manually adjusted to the available data (i.e. 3D surveys of plans-reliefs or scanned maps). Most of the fortification area is now modelled, and the question of accuracy assessment is raised. A specific method is used to evaluate the accuracy of the parametric components, and the results of the assessment process will allow us to validate the parametric approach. The automation of the adjustment process can then be planned. The virtual model of the fortification is part of a larger project aimed at promoting and disseminating a unique cultural heritage item: the collection of plans-reliefs. As such, knowledge models will be precious assets when automation and semantic enhancements are considered.

  13. Qualitative Validation of the IMM Model for ISS and STS Programs

    Science.gov (United States)

    Kerstman, E.; Walton, M.; Reyes, D.; Boley, L.; Saile, L.; Young, M.; Arellano, J.; Garcia, Y.; Myers, J. G.

    2016-01-01

    To validate and further improve the Integrated Medical Model (IMM), medical event data were obtained from 32 ISS and 122 STS person-missions. Using the crew characteristics from these observed missions, IMM v4.0 was used to forecast medical events and medical resource utilization. The IMM medical condition incidence values were compared to the actual observed medical event incidence values, and the IMM forecasted medical resource utilization was compared to actual observed medical resource utilization. Qualitative comparisons of these parameters were conducted for both the ISS and STS programs. The results of these analyses will provide validation of IMM v4.0 and reveal areas of the model requiring adjustments to improve the overall accuracy of IMM outputs. This validation effort should result in enhanced credibility of the IMM and improved confidence in the use of IMM as a decision support tool for human space flight.

  14. Ovarian volume throughout life: a validated normative model.

    Science.gov (United States)

    Kelsey, Thomas W; Dodwell, Sarah K; Wilkinson, A Graham; Greve, Tine; Andersen, Claus Y; Anderson, Richard A; Wallace, W Hamish B

    2013-01-01

    The measurement of ovarian volume has been shown to be a useful indirect indicator of the ovarian reserve in women of reproductive age, in the diagnosis and management of a number of disorders of puberty and adult reproductive function, and is under investigation as a screening tool for ovarian cancer. To date there is no normative model of ovarian volume throughout life. By searching the published literature for ovarian volume in healthy females, and using our own data from multiple sources (combined n=59,994) we have generated and robustly validated the first model of ovarian volume from conception to 82 years of age. This model shows that 69% of the variation in ovarian volume is due to age alone. We have shown that in the average case ovarian volume rises from 0.7 mL (95% CI 0.4-1.1 mL) at 2 years of age to a peak of 7.7 mL (95% CI 6.5-9.2 mL) at 20 years of age with a subsequent decline to about 2.8 mL (95% CI 2.7-2.9 mL) at the menopause and smaller volumes thereafter. Our model allows us to generate normal values and ranges for ovarian volume throughout life. This is the first validated normative model of ovarian volume from conception to old age; it will be of use in the diagnosis and management of a number of diverse gynaecological and reproductive conditions in females from birth to menopause and beyond.

  15. The impact of school leadership on school level factors: validation of a causal model

    NARCIS (Netherlands)

    Krüger, M.L.; Witziers, B.; Sleegers, P.

    2007-01-01

    This study aims to contribute to a better understanding of the antecedents and effects of educational leadership, and of the influence of the principal's leadership on intervening and outcome variables. A path analysis was conducted to test and validate a causal model. The results show no direct or

  16. Validation of spectral gas radiation models under oxyfuel conditions. Part A: Gas cell experiments

    DEFF Research Database (Denmark)

    Becher, Valentin; Clausen, Sønnik; Fateev, Alexander

    2011-01-01

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard CFD spectral gas radiation models for air combustion are therefore outside their validity range. This series of three articles provides a common spectral basis...

  17. 2013 CEF RUN - PHASE 1 DATA ANALYSIS AND MODEL VALIDATION

    Energy Technology Data Exchange (ETDEWEB)

    Choi, A.

    2014-05-08

    Phase 1 of the 2013 Cold cap Evaluation Furnace (CEF) test was completed on June 3, 2013 after a 5-day round-the-clock feeding and pouring operation. The main goal of the test was to characterize the CEF off-gas produced from a nitric-formic acid flowsheet feed and confirm whether the CEF platform is capable of producing scalable off-gas data necessary for the revision of the DWPF melter off-gas flammability model; the revised model will be used to define new safety controls on the key operating parameters for the nitric-glycolic acid flowsheet feeds including total organic carbon (TOC). Whether the CEF off-gas data were scalable for the purpose of predicting the potential flammability of the DWPF melter exhaust was determined by comparing the predicted H₂ and CO concentrations using the current DWPF melter off-gas flammability model to those measured during Phase 1; data were deemed scalable if the calculated fractional conversions of TOC-to-H₂ and TOC-to-CO at varying melter vapor space temperatures were found to trend and further bound the respective measured data with some margin of safety. Being scalable thus means that for a given feed chemistry the instantaneous flow rates of H₂ and CO in the DWPF melter exhaust can be estimated with some degree of conservatism by multiplying those of the respective gases from a pilot-scale melter by the feed rate ratio. This report documents the results of the Phase 1 data analysis and the necessary calculations performed to determine the scalability of the CEF off-gas data. A total of six steady state runs were made during Phase 1 under non-bubbled conditions by varying the CEF vapor space temperature from near 700 to below 300°C, as measured in a thermowell (T_tw). At each steady state temperature, the off-gas composition was monitored continuously for two hours using MS, GC, and FTIR in order to track mainly H₂, CO, CO₂, NOₓ, and organic gases such as CH₄. The standard
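
The scale-up rule stated in this abstract (full-scale H₂ and CO flow rates estimated by multiplying the pilot-scale rates by the feed rate ratio) reduces to a one-line calculation. A minimal sketch, with illustrative numbers rather than CEF or DWPF measurements:

```python
def scale_flammable_gas_rates(pilot_rates, pilot_feed_rate, full_feed_rate):
    """Estimate full-scale melter off-gas flow rates by multiplying the
    pilot-scale rates by the feed rate ratio (the scalability rule above)."""
    ratio = full_feed_rate / pilot_feed_rate
    return {gas: rate * ratio for gas, rate in pilot_rates.items()}

# Illustrative numbers only, not CEF or DWPF measurements:
pilot = {"H2": 0.004, "CO": 0.010}    # pilot-scale flow rates, arbitrary units
full_scale = scale_flammable_gas_rates(pilot, pilot_feed_rate=0.5,
                                       full_feed_rate=1.5)
print(full_scale)   # every rate scaled by the feed rate ratio of 3
```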

  18. Validation of coastal oceanographic models at Forsmark. Site descriptive modelling SDM-Site Forsmark

    Energy Technology Data Exchange (ETDEWEB)

    Engqvist, Anders (A och I Engqvist Konsult HB, Vaxholm (SE)); Andrejev, Oleg (Finnish Inst. of Marine Research, Helsinki (FI))

    2008-01-15

    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterisation at two different locations, the Forsmark and the Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. The characterisation work is divided into an initial site investigation phase and a complete site investigation phase. In this context, the water exchange of the coastal zone is one link of the chain of possible nuclide transport mechanisms that must be assessed in the site description of potential repository areas. For the purpose of validating the pair of nested 3D-models employed to simulate the water exchange in the near-shore coastal zone in the Forsmark area, an encompassing measurement program entailing six stations has been performed. The design of this program was to first assess to what degree the forcing of the fine resolution (FR) model of the Forsmark study area at its interfacial boundary to the coarse resolution (CR) model of the entire Baltic was reproduced. In addition to this scrutiny it is of particular interest how the time-varying density-determining properties, salinity and temperature, at the borders are propagated into the FR-domain, since this corresponds to the most efficient mode of water exchange. An important part of the validation process has been to carefully evaluate which measurement data that can be considered reliable. The result was that several periods of foremost near-surface salinity data had to be discarded due to growth of algae on the conductivity sensors. Lack of thorough absolute calibration of the salinity meters also necessitates dismissal of measurement data. 
Relative to the assessed data that can be accepted as adequate, the outcome of the validation can be summarized in five points: (i) The surface-most salinity of the CR-model drifts downward a little less than one practical salinity unit (psu) per year, requiring that the ensuing correlation analysis be subdivided into periods of a

  19. Validation of coastal oceanographic models at Forsmark. Site descriptive modelling SDM-Site Forsmark

    International Nuclear Information System (INIS)

    Engqvist, Anders; Andrejev, Oleg

    2008-01-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterisation at two different locations, the Forsmark and the Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. The characterisation work is divided into an initial site investigation phase and a complete site investigation phase. In this context, the water exchange of the coastal zone is one link of the chain of possible nuclide transport mechanisms that must be assessed in the site description of potential repository areas. For the purpose of validating the pair of nested 3D-models employed to simulate the water exchange in the near-shore coastal zone in the Forsmark area, an encompassing measurement program entailing six stations has been performed. The design of this program was to first assess to what degree the forcing of the fine resolution (FR) model of the Forsmark study area at its interfacial boundary to the coarse resolution (CR) model of the entire Baltic was reproduced. In addition to this scrutiny it is of particular interest how the time-varying density-determining properties, salinity and temperature, at the borders are propagated into the FR-domain, since this corresponds to the most efficient mode of water exchange. An important part of the validation process has been to carefully evaluate which measurement data that can be considered reliable. The result was that several periods of foremost near-surface salinity data had to be discarded due to growth of algae on the conductivity sensors. Lack of thorough absolute calibration of the salinity meters also necessitates dismissal of measurement data. 
Relative to the assessed data that can be accepted as adequate, the outcome of the validation can be summarized in five points: (i) The surface-most salinity of the CR-model drifts downward a little less than one practical salinity unit (psu) per year, requiring that the ensuing correlation analysis be subdivided into periods of a

  20. Modeling Clinically Validated Physical Activity Assessments Using Commodity Hardware.

    Science.gov (United States)

    Winfree, Kyle N; Dominick, Gregory

    2018-03-01

    Consumer-grade wearable activity devices such as Fitbits are increasingly being used in research settings to promote physical activity (PA) due to their low cost and widespread popularity. However, Fitbit-derived measures of activity intensity are consistently reported to be less accurate than intensity estimates obtained from research-grade accelerometers (i.e., ActiGraph). As such, the potential for using a Fitbit to measure PA intensity within research contexts remains limited. This study aims to model ActiGraph-based intensity estimates from the validated Freedson vector magnitude (VM3) algorithm using measures of steps, metabolic equivalents, and intensity levels obtained from Fitbit. Minute-level data collected from 19 subjects, who concurrently wore the ActiGraph GT3X and Fitbit Flex devices for an average of 1.8 weeks, were used to generate the model. After testing several modeling methods, a naïve Bayes classifier was chosen based on the lowest achieved error rate. Overall, the model reduced the Fitbit-to-ActiGraph error rate from 19.97% to 16.32%. Moreover, the model reduced misclassification of Fitbit-based estimates of moderate-to-vigorous physical activity (MVPA) by 40%, eliminating a statistically significant difference between MVPA estimates derived from ActiGraph and Fitbit. Study findings support the general utility of the model for measuring MVPA with the Fitbit Flex in place of the more costly ActiGraph GT3X accelerometer for young healthy adults.
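
A Gaussian naïve Bayes classifier of the kind selected in this study can be sketched in a few lines. The minute-level features and class labels below are hypothetical stand-ins for the Fitbit/ActiGraph data; this is not the published model:

```python
import math
from collections import defaultdict

def _mean_var(col):
    """Per-feature mean and (floored) variance for one class."""
    m = sum(col) / len(col)
    v = sum((x - m) ** 2 for x in col) / len(col)
    return m, max(v, 1e-9)

class TinyGaussianNB:
    """Minimal Gaussian naive Bayes classifier, sketching the model family
    the study selected (not the published model itself)."""

    def fit(self, X, y):
        rows = defaultdict(list)
        for features, label in zip(X, y):
            rows[label].append(features)
        self.priors = {lbl: len(r) / len(y) for lbl, r in rows.items()}
        self.stats = {lbl: [_mean_var(col) for col in zip(*r)]
                      for lbl, r in rows.items()}
        return self

    def predict(self, features):
        def log_gauss(x, m, v):
            return -0.5 * (math.log(2 * math.pi * v) + (x - m) ** 2 / v)
        def score(lbl):
            return math.log(self.priors[lbl]) + sum(
                log_gauss(x, m, v) for x, (m, v) in zip(features, self.stats[lbl]))
        return max(self.priors, key=score)

# Hypothetical minute-level features: (steps per minute, estimated METs)
X = [(0, 1.0), (5, 1.2), (60, 3.5), (70, 4.0), (110, 7.0), (120, 8.0)]
y = ["sedentary", "sedentary", "moderate", "moderate", "vigorous", "vigorous"]
model = TinyGaussianNB().fit(X, y)
print(model.predict((65, 3.8)))   # classified as "moderate"
```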

  1. Validation of a non-linear model of health.

    Science.gov (United States)

    Topolski, Stefan; Sturmberg, Joachim

    2014-12-01

    The purpose of this study was to evaluate the veracity of a theoretically derived model of health that describes a non-linear trajectory of health from birth to death with available population data sets. The distribution of mortality by age is directly related to health at that age, thus health approximates 1/mortality. The inverse of available all-cause mortality data from various time periods and populations was used as proxy data to compare with the theoretically derived non-linear health model predictions, using both qualitative approaches and quantitative one-sample Kolmogorov-Smirnov analysis with Monte Carlo simulation. The mortality data's inverse resembles a log-normal distribution as predicted by the proposed health model. The curves have identical slopes from birth and follow a logarithmic decline from peak health in young adulthood. A majority of the sampled populations had a good to excellent quantitative fit to a log-normal distribution, supporting the underlying model assumptions. Post hoc manipulation showed the model predictions to be stable. This is a first theory of health to be validated by proxy data, namely the inverse of all-cause mortality. This non-linear model, derived from the notion of the interaction of physical, environmental, mental, emotional, social and sense-making domains of health, gives physicians a more rigorous basis to direct health care services and resources away from disease-focused elder care towards broad-based biopsychosocial interventions earlier in life. © 2014 John Wiley & Sons, Ltd.
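
The validation idea described here (take health ≈ 1/mortality and check the resemblance to a log-normal shape) can be sketched as follows. The mortality series and the log-normal parameters are illustrative assumptions, not the data sets or fits used in the study:

```python
import math

def lognormal_pdf(x, mu, sigma):
    """Density of a log-normal distribution."""
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)) \
        / (x * sigma * math.sqrt(2 * math.pi))

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Hypothetical all-cause mortality per 1000 person-years by age (illustrative only)
ages      = [1, 5, 15, 25, 35, 45, 55, 65, 75, 85]
mortality = [5.0, 0.9, 0.4, 0.35, 0.6, 1.5, 4.0, 10.0, 30.0, 90.0]

health = [1.0 / m for m in mortality]   # the study's proxy: health ~ 1/mortality
curve  = [lognormal_pdf(a, mu=math.log(25), sigma=0.9) for a in ages]

print(ages[health.index(max(health))])  # proxy peaks in young adulthood
print(round(pearson(health, curve), 2)) # shape agreement with a log-normal curve
```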

  2. Nonlinear ultrasound modelling and validation of fatigue damage

    Science.gov (United States)

    Fierro, G. P. Malfense; Ciampa, F.; Ginzburg, D.; Onder, E.; Meo, M.

    2015-05-01

    Nonlinear ultrasound techniques have shown greater sensitivity to microcracks and they can be used to detect structural damages at their early stages. However, there is still a lack of numerical models available in commercial finite element analysis (FEA) tools that are able to simulate the interaction of elastic waves with the materials nonlinear behaviour. In this study, a nonlinear constitutive material model was developed to predict the structural response under continuous harmonic excitation of a fatigued isotropic sample that showed anharmonic effects. Particularly, by means of Landau's theory and Kelvin tensorial representation, this model provided an understanding of the elastic nonlinear phenomena such as the second harmonic generation in three-dimensional solid media. The numerical scheme was implemented and evaluated using a commercially available FEA software LS-DYNA, and it showed a good numerical characterisation of the second harmonic amplitude generated by the damaged region known as the nonlinear response area (NRA). Since this process requires only the experimental second-order nonlinear parameter and rough damage size estimation as an input, it does not need any baseline testing with the undamaged structure or any dynamic modelling of the fatigue crack growth. To validate this numerical model, the second-order nonlinear parameter was experimentally evaluated at various points over the fatigue life of an aluminium (AA6082-T6) coupon and the crack propagation was measured using an optical microscope. A good correlation was achieved between the experimental set-up and the nonlinear constitutive model.

  3. MT3DMS: Model use, calibration, and validation

    Science.gov (United States)

    Zheng, C.; Hill, Mary C.; Cao, G.; Ma, R.

    2012-01-01

    MT3DMS is a three-dimensional multi-species solute transport model for solving advection, dispersion, and chemical reactions of contaminants in saturated groundwater flow systems. MT3DMS interfaces directly with the U.S. Geological Survey finite-difference groundwater flow model MODFLOW for the flow solution and supports the hydrologic and discretization features of MODFLOW. MT3DMS contains multiple transport solution techniques in one code, which can often be important, including in model calibration. Since its first release in 1990 as MT3D for single-species mass transport modeling, MT3DMS has been widely used in research projects and practical field applications. This article provides a brief introduction to MT3DMS and presents recommendations about calibration and validation procedures for field applications of MT3DMS. The examples presented suggest the need to consider alternative processes as models are calibrated and suggest opportunities and difficulties associated with using groundwater age in transport model calibration.
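
At its core, MT3DMS solves the advection-dispersion(-reaction) equation. A drastically simplified 1-D explicit upwind scheme (not one of MT3DMS's actual solution techniques) illustrates the transport process being modeled:

```python
def advect_disperse(c, v, D, dx, dt, steps):
    """Explicit upwind finite differences for 1-D advection-dispersion,
    dc/dt = -v*dc/dx + D*d2c/dx2, with fixed zero-concentration boundaries.
    Stable here because v*dt/dx <= 1 and D*dt/dx**2 is small."""
    c = list(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            adv = -v * (c[i] - c[i - 1]) / dx                      # upwind, v > 0
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2  # central
            new[i] = c[i] + dt * (adv + disp)
        c = new
    return c

# A unit pulse of solute advecting downstream while it spreads
c0 = [0.0] * 20
c0[2] = 1.0
out = advect_disperse(c0, v=1.0, D=0.01, dx=1.0, dt=0.5, steps=10)
peak = max(range(len(out)), key=out.__getitem__)
print(peak)   # the peak has moved downstream of cell 2
```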

  4. Experimental validation of void fraction models for R410A air conditioners

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Xiaokui; Ding, Guoliang; Zhang, Ping; Han, Weizhe [Institute of Refrigeration and Cryogenics, Shanghai Jiaotong University, No. 800 Dongchuan Road, Shanghai 200240 (China); Kasahara, Shinichi; Yamaguchi, Takahiro [Daikin Air-Conditioning R and D Laboratory, Ltd., 1304 Kanaoka-cho, Sakai, Osaka 591-8511 (Japan)

    2009-08-15

    This paper presents combined void fraction models, based on flow pattern, for choosing suitable models to calculate the two-phase refrigerant mass of R410A air conditioners, and the combined model is validated experimentally. The existing single void fraction models are also evaluated experimentally. The Taitel-Dukler flow pattern map is recommended for use in the combined void fraction models. The results presented herein show that the refrigerant mass calculated using the combined void fraction models agrees better with the measured refrigerant mass than that of the single void fraction models. The T-B model (for intermittent flow) and the Premoli model (for annular flow) are recommended for calculating the two-phase refrigerant mass in the evaporator, and the T-B model (for intermittent flow) and the Harms model (for annular flow) are recommended for calculating the two-phase refrigerant mass in the condenser. (author)
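
The T-B, Premoli and Harms correlations themselves are more involved, but every void fraction model maps vapour quality and phase densities to a void fraction α. As a minimal illustration of the calculation's structure, here are the homogeneous and Zivi slip-ratio forms; the R410A densities below are assumed round figures, not the paper's property data:

```python
def void_fraction(x, rho_l, rho_g, slip=1.0):
    """Void fraction from vapour quality x via a slip-ratio model:
    alpha = 1 / (1 + ((1-x)/x) * (rho_g/rho_l) * S).
    S = 1 gives the homogeneous model; Zivi's model uses S = (rho_l/rho_g)**(1/3)."""
    return 1.0 / (1.0 + ((1.0 - x) / x) * (rho_g / rho_l) * slip)

# Assumed round figures for saturated R410A, not the paper's property data
rho_l, rho_g = 1128.0, 41.0          # liquid / vapour density, kg/m3
x = 0.3                              # vapour quality
alpha_hom  = void_fraction(x, rho_l, rho_g)
alpha_zivi = void_fraction(x, rho_l, rho_g, slip=(rho_l / rho_g) ** (1 / 3))
print(round(alpha_hom, 3), round(alpha_zivi, 3))   # slip lowers the void fraction
```

The recommended correlations refine the slip ratio with flow-regime-dependent terms, but feed into the refrigerant mass calculation in the same way.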

  5. Non-Linear Slosh Damping Model Development and Validation

    Science.gov (United States)

    Yang, H. Q.; West, Jeff

    2015-01-01

    Propellant tank slosh dynamics are typically represented by a mechanical spring-mass-damper model, which is then included in the equation of motion of the entire vehicle for Guidance, Navigation and Control (GN&C) analysis. For a partially filled smooth-wall propellant tank, the critical damping based on classical empirical correlation is as low as 0.05%. Due to this low value of damping, propellant slosh is a potential source of disturbance critical to the stability of launch and space vehicles. It is postulated that the commonly quoted slosh damping is valid only in the linear regime, where the slosh amplitude is small; with increasing slosh amplitude, the damping value should also increase. If this nonlinearity can be verified and validated, the slosh stability margin can be significantly improved, and the level of conservatism maintained in the GN&C analysis can be lessened. The purpose of this study is to explore and quantify the dependence of slosh damping on slosh amplitude. Accurately predicting the extremely low damping value of a smooth-wall tank is very challenging for any Computational Fluid Dynamics (CFD) tool: one must resolve thin boundary layers near the wall and limit numerical damping to a minimum. This computational study demonstrates that with proper grid resolution, CFD can indeed accurately predict the low-damping physics of smooth walls in the linear regime. Comparisons of extracted damping values with experimental data for different tank sizes show very good agreement. Numerical simulations confirm that slosh damping is indeed a function of slosh amplitude. When the slosh amplitude is low, the damping ratio is essentially constant, which is consistent with the empirical correlation. Once the amplitude reaches a critical value, the damping ratio becomes a linearly increasing function of the slosh amplitude. A follow-on experiment validated the developed nonlinear damping relationship. 
This discovery can
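
The amplitude dependence found in the study (damping essentially constant in the linear regime, then growing linearly beyond a critical amplitude) can be written as a simple piecewise law. Only the 0.05% linear-regime value comes from the abstract; the critical amplitude and slope below are hypothetical placeholders:

```python
def slosh_damping_ratio(amplitude, zeta_linear, a_crit, slope):
    """Piecewise damping law suggested by the study: constant in the linear
    regime, then linearly increasing with slosh amplitude."""
    if amplitude <= a_crit:
        return zeta_linear
    return zeta_linear + slope * (amplitude - a_crit)

zeta_linear = 0.0005   # 0.05% critical damping in the linear regime (from the abstract)
# a_crit and slope are hypothetical placeholders, not the paper's fit:
print(slosh_damping_ratio(0.01, zeta_linear, a_crit=0.02, slope=0.01))  # linear regime
print(slosh_damping_ratio(0.10, zeta_linear, a_crit=0.02, slope=0.01))  # amplified
```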

  6. Model Validation of Radiocaesium Transfer from Soil to Leafy Vegetables

    Directory of Open Access Journals (Sweden)

    P. Sukmabuana

    2012-04-01

    The accumulation of radionuclides in plant tissues can be estimated using a mathematical model; however, the applicability of the model to field experiments still needs to be evaluated. A model validation has been conducted for radiocaesium transfer from soil to two leafy vegetables commonly consumed by Indonesian people, spinach and morning glory, in order to validate the transfer model against field experimental data. The vegetable plants were grown on soil contaminated with 19 MBq of 134CsNO3 for about 70 days. As controls, vegetable plants were also grown on soil without 134CsNO3 contamination. Every 5 days, three samples each of the contaminated and uncontaminated plants were taken, and the soil medium was also tested. The samples were dried under an infrared lamp and their radioactivity was counted using a gamma spectrometer. Data on the 134Cs radioactivity in soil and plants were substituted into the mathematical equation to obtain the transfer rate coefficient (k12). The values of k12 were then used to calculate the 134Cs radioactivity in the vegetable plants. The 134Cs radioactivity in plants obtained from the mathematical model analysis was compared with the radioactivity data obtained from the experiment. The correlation of the 134Cs radioactivity in the vegetable plants obtained from the experiment with that obtained from the model analysis was expressed as a correlation coefficient, which was found to be 0.90 and 0.71 for spinach and morning glory respectively. The values of 134Cs in plants obtained from the model analysis can be corrected using standard deviation values, namely 48.65 and 20 for spinach at 0model analysis and experiment data, the model of 134Cs transfer from soil to plant can be used for analysing 134Cs radioactivity
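
The transfer rate coefficient k12 can be estimated by least squares if one assumes the simplest linear uptake model dCp/dt = k12·Cs with roughly constant soil activity; the paper's compartment model may include loss terms this sketch omits. The sampling data below are invented for illustration, not the paper's measurements:

```python
def fit_k12(times, c_soil, c_plant):
    """Least-squares estimate of the transfer rate coefficient k12 in the
    simplest linear uptake model dCp/dt = k12 * Cs. With Cs roughly constant,
    Cp(t) ~ k12 * Cs * t, so we regress Cp on Cs * t through the origin."""
    xs = [cs * t for cs, t in zip(c_soil, times)]
    num = sum(x * cp for x, cp in zip(xs, c_plant))
    den = sum(x * x for x in xs)
    return num / den

# Hypothetical 5-day sampling series (arbitrary activity units)
times   = [5, 10, 15, 20, 25]
c_soil  = [100.0] * 5
c_plant = [4.9, 10.2, 14.8, 20.1, 25.3]
k12 = fit_k12(times, c_soil, c_plant)
print(round(k12, 4))   # close to 0.01 per day for this synthetic series
```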

  7. Validation of crop weather models for crop assessment arid yield ...

    African Journals Online (AJOL)

    IRSIS and CRPSM models were used in this study to see how closely they could predict grain yields for selected stations in Tanzania. Input for the models comprised of weather, crop and soil data collected from five selected stations. Simulation results show that IRSIS model tends to over predict grain yields of maize, ...

  8. Validating a DEB model for a blue mussel.

    NARCIS (Netherlands)

    Saraiva, S.; van der Meer, J.; Kooijman, S.A.L.M.; Witbaard, R.; Philippart, C.J.M; Hippler, D.; Parker, R.

    2012-01-01

    A model for bivalve growth was developed and the results were tested against field observations. The model is based on the Dynamic Energy Budget (DEB) theory and includes an extension of the standard DEB model to cope with changing food quantity and quality. At 4 different locations in the North Sea

  9. Validation of buoyancy driven spectral tensor model using HATS data

    DEFF Research Database (Denmark)

    Chougule, A.; Mann, Jakob; Kelly, Mark C.

    2016-01-01

    We present a homogeneous spectral tensor model for wind velocity and temperature fluctuations, driven by mean vertical shear and mean temperature gradient. Results from the model, including one-dimensional velocity and temperature spectra and the associated co-spectra, are shown in this paper. The model also reproduces two-point statistics, such as coherence and phases, via cross-spectra between two points separated in space. Model results are compared with observations from the Horizontal Array Turbulence Study (HATS) field program (Horst et al. 2004). The spectral velocity tensor in the model...

  10. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are not many case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for an epidemiology study and an ecological study, and hence further validates AOM in a qualitative manner.

  11. Development and validation of a liquid composite molding model

    Science.gov (United States)

    Bayldon, John Michael

    2007-12-01

    In composite manufacturing, Vacuum Assisted Resin Transfer Molding (VARTM) is becoming increasingly important as a cost effective manufacturing method of structural composites. In this process the dry preform (reinforcement) is placed on a rigid tool and covered by a flexible film to form an airtight vacuum bag. Liquid resin is drawn under vacuum through the preform inside the vacuum bag. Modeling of this process relies on a good understanding of closely coupled phenomena. The resin flow depends on the preform permeability, which in turn depends on the local fluid pressure and the preform compaction behavior. VARTM models for predicting the flow rate in this process do exist, however, they are not able to properly predict the flow for all classes of reinforcement material. In this thesis, the continuity equation used in VARTM models is reexamined and a modified form proposed. In addition, the compaction behavior of the preform in both saturated and dry states is studied in detail and new models are proposed for the compaction behavior. To assess the validity of the proposed models, the shadow moire method was adapted and used to perform full field measurement of the preform thickness during infusion, in addition to the usual measurements of flow front position. A new method was developed and evaluated for the analysis of the moire data related to the VARTM process, however, the method has wider applicability to other full field thickness measurements. The use of this measurement method demonstrated that although the new compaction models work well in the characterization tests, they do not properly describe all the preform features required for modeling the process. In particular the effect of varying saturation on the preform's behavior requires additional study. The flow models developed did, however, improve the prediction of the flow rate for the more compliant preform material tested, and the experimental techniques have shown where additional test methods

  12. A Validation Process for the Groundwater Flow and Transport Model of the Faultless Nuclear Test at Central Nevada Test Area

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan

    2003-01-01

    Many sites of groundwater contamination rely heavily on complex numerical models of flow and transport to develop closure plans. This has created a need for tools and approaches that can be used to build confidence in model predictions and make it apparent to regulators, policy makers, and the public that these models are sufficient for decision making. This confidence building is a long-term iterative process, and it is this process that should be termed "model validation." Model validation is a process, not an end result; that is, the process of model validation cannot always assure acceptable prediction or quality of the model. Rather, it provides a safeguard against faulty or inadequately developed and tested models. Therefore, development of a systematic approach for evaluating and validating subsurface predictive models and guiding field activities for data collection and long-term monitoring is strongly needed. This report presents a review of model validation studies that pertain to groundwater flow and transport modeling. Definitions, literature debates, previously proposed validation strategies, and conferences and symposia that focused on subsurface model validation are reviewed and discussed. The review is general in nature, but the focus of the discussion is on site-specific, predictive groundwater models used for making decisions regarding remediation activities and site closure. An attempt is made to compile most of the published studies on groundwater model validation and to assemble what has been proposed or used for validating subsurface models. The aim is to provide a reasonable starting point to aid the development of the validation plan for the groundwater flow and transport model of the Faultless nuclear test conducted at the Central Nevada Test Area (CNTA). The review of previous studies on model validation shows that there does not exist a set of specific procedures and tests that can be easily adapted and

  13. SWAT application in intensive irrigation systems: Model modification, calibration and validation

    Science.gov (United States)

    Dechmi, Farida; Burguete, Javier; Skhiri, Ahmed

    2012-11-01

    The Soil and Water Assessment Tool (SWAT) is a well-established, distributed, eco-hydrologic model. However, using the case study of an intensively irrigated agricultural watershed, it was shown that none of the model versions is able to appropriately reproduce the total streamflow in such a system when the irrigation source is outside the watershed. The objective of this study was to modify the SWAT2005 version to correctly simulate the main hydrological processes. Calibration and validation of crop yield, total streamflow, total suspended sediment (TSS) losses and phosphorus load were performed using field survey information and water quantity and quality data recorded during 2008 and 2009 in the Del Reguero irrigated watershed in Spain. The goodness of the calibration and validation results was assessed using five statistical measures, including the Nash-Sutcliffe efficiency (NSE). Results indicated that the average annual crop yield and actual evapotranspiration estimates were quite satisfactory. On a monthly basis, the NSE values were 0.90 (calibration) and 0.80 (validation), indicating that the modified model could reproduce the observed streamflow accurately. The TSS losses were also satisfactorily estimated (NSE = 0.72 and 0.52 for the calibration and validation steps). The monthly temporal patterns and all the statistical parameters indicated that the modified SWAT-IRRIG model adequately predicted the total phosphorus (TP) loading. The model could therefore be used to assess the impacts of different best management practices on nonpoint phosphorus losses in irrigated systems.
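
The Nash-Sutcliffe efficiency used to judge the calibration is straightforward to compute; the streamflow values below are illustrative, not Del Reguero data:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((o-s)^2) / sum((o-mean(o))^2).
    1.0 is a perfect fit; values <= 0 mean the model predicts no better
    than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Illustrative monthly streamflow (m3/s), not Del Reguero data
obs = [2.0, 3.5, 5.0, 4.0, 2.5, 1.5]
sim = [2.2, 3.3, 4.6, 4.1, 2.7, 1.4]
print(round(nash_sutcliffe(obs, sim), 2))   # prints 0.97
```

NSE above about 0.5 is often taken as satisfactory for monthly streamflow, which is the register in which the abstract's 0.52-0.90 values are judged.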

  14. External Validation of Health Economic Decision Models for Chronic Obstructive Pulmonary Disease (COPD) : Report of the Third COPD Modeling Meeting

    NARCIS (Netherlands)

    Hoogendoorn, Martine; Feenstra, Talitha L.; Asukai, Yumi; Briggs, Andrew H.; Hansen, Ryan N.; Leidl, Reiner; Risebrough, Nancy; Samyshkin, Yevgeniy; Wacker, Margarethe; Rutten-van Molken, Maureen P. M. H.

    Objectives: To validate outcomes of presently available chronic obstructive pulmonary disease (COPD) cost-effectiveness models against results of two large COPD trials: the 3-year TOwards a Revolution in COPD Health (TORCH) trial and the 4-year Understanding Potential Long-term Impacts on Function

  15. Validation and Analysis of Numerical Results for a Two-Pass Trapezoidal Channel With Different Cooling Configurations of Trailing Edge.

    Science.gov (United States)

    Siddique, Waseem; El-Gabry, Lamyaa; Shevchuk, Igor V; Fransson, Torsten H

    2013-01-01

    High inlet temperatures in a gas turbine lead to an increase in the thermal efficiency of the gas turbine. This results in the requirement of cooling of gas turbine blades/vanes. Internal cooling of the gas turbine blades/vanes with the help of two-pass channels is one of the effective methods to reduce metal temperatures. In particular, the trailing edge of a turbine vane is a critical area where effective cooling is required. The trailing edge can be modeled as a trapezoidal channel. This paper describes the numerical validation of the heat transfer and pressure drop in a trapezoidal channel with and without orthogonal ribs at the bottom surface. A new concept of a ribbed trailing edge is introduced in this paper, which presents a numerical study of several trailing-edge cooling configurations based on the placement of ribs at different walls. The baseline geometries are two-pass trapezoidal channels with and without orthogonal ribs at the bottom surface of the channel. Ribs induce secondary flow, which enhances heat transfer; therefore, for enhancement of heat transfer at the trailing edge, ribs are placed at the trailing-edge surface in three different configurations: first without ribs at the bottom surface, then with ribs at the trailing-edge surface in line with the ribs at the bottom surface, and finally with staggered ribs. Heat transfer and pressure drop are calculated at a Reynolds number of 9400 for all configurations. Different turbulence models are used for the validation of the numerical results. For the smooth channel, the low-Re k-ɛ, realizable k-ɛ, RNG k-ω, low-Re k-ω, and SST k-ω models are compared, whereas for the ribbed channel, the low-Re k-ɛ and SST k-ω models are compared. The results show that the low-Re k-ɛ model, which predicts the heat transfer in the outlet pass of the smooth channel to within +7%, underpredicts the heat transfer by -17% in the case of the ribbed channel compared to

  16. Precise orbit determination for quad-constellation satellites at Wuhan University: strategy, result validation, and comparison

    Science.gov (United States)

    Guo, Jing; Xu, Xiaolong; Zhao, Qile; Liu, Jingnan

    2016-02-01

    This contribution summarizes the strategy used by Wuhan University (WHU) to determine precise orbit and clock products for the Multi-GNSS Experiment (MGEX) of the International GNSS Service (IGS). In particular, the satellite attitude, phase center corrections, and solar radiation pressure model developed and used for BDS satellites are addressed. In addition, this contribution analyzes the orbit and clock quality of the quad-constellation products from MGEX Analysis Centers (ACs) for a common time period of 1 year (2014). With IGS final GPS and GLONASS products as the reference, the Multi-GNSS products of WHU (indicated by WUM) show the best agreement among the products from all MGEX ACs in both accuracy and stability. 3D Day Boundary Discontinuities (DBDs) range from 8 to 27 cm for Galileo-IOV satellites among all ACs' products, the WUM ones being the largest (about 26.2 cm). Among the three types of BDS satellites, MEOs show the smallest DBDs, from 10 to 27 cm, whereas the DBDs for all ACs' products are at the decimeter-to-meter level for GEOs and one to three decimeters for IGSOs, respectively. As to the satellite laser ranging (SLR) validation for Galileo-IOV satellites, the accuracy evaluated by SLR residuals is at the one-decimeter level, with the well-known systematic bias of about -5 cm for all ACs. For BDS satellites, the accuracy reaches the decimeter, one-decimeter, and centimeter level for GEOs, IGSOs, and MEOs, respectively. However, there is a noticeable bias in the GEO SLR residuals. In addition, systematic errors dependent on orbit angle, related to mismodeled solar radiation pressure (SRP), are present for BDS GEOs and IGSOs. The results of Multi-GNSS combined kinematic PPP demonstrate that the best position accuracy and fastest convergence are achieved using WUM products, particularly in the Up direction. Furthermore, the accuracy of static BDS-only PPP degrades when the BDS IGSO and MEO satellites switch to orbit-normal orientation
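    The 3D day-boundary discontinuity statistic quoted above is simply the norm of the position difference between two consecutive daily orbit arcs evaluated at their shared midnight epoch; a hedged sketch (the ECEF coordinates are invented, not MGEX values):

```python
import math

def day_boundary_discontinuity(pos_arc1, pos_arc2):
    """3D position jump between the end of one daily orbit solution and
    the start of the next, evaluated at the common boundary epoch.
    Inputs and output share the same length unit."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(pos_arc1, pos_arc2)))

# Invented ECEF positions (metres) of one satellite at the midnight epoch:
arc1 = (15600123.412, 21450987.201, 9876543.310)
arc2 = (15600123.530, 21450987.050, 9876543.420)
print(f"DBD = {day_boundary_discontinuity(arc1, arc2) * 100:.1f} cm")  # → DBD = 22.1 cm
```

    Averaging this statistic over all boundaries and satellites gives the per-AC figures (8 to 27 cm for Galileo-IOV) cited in the abstract.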

  17. Prospective study of one million deaths in India: rationale, design, and validation results.

    Directory of Open Access Journals (Sweden)

    Prabhat Jha

    2006-02-01

    Over 75% of the annual estimated 9.5 million deaths in India occur in the home, and the large majority of these do not have a certified cause. India and other developing countries urgently need reliable quantification of the causes of death. They also need better epidemiological evidence about the relevance of physical (such as blood pressure and obesity), behavioral (such as smoking, alcohol, HIV-1 risk taking, and immunization history), and biological (such as blood lipids and gene polymorphisms) measurements to the development of disease in individuals or disease rates in populations. We report here on the rationale, design, and implementation of the world's largest prospective study of the causes and correlates of mortality. We will monitor nearly 14 million people in 2.4 million nationally representative Indian households (6.3 million people in 1.1 million households in the 1998-2003 sample frame and 7.6 million people in 1.3 million households in the 2004-2014 sample frame) for vital status and, if dead, the causes of death through a well-validated verbal autopsy (VA) instrument. About 300,000 deaths from 1998-2003 and some 700,000 deaths from 2004-2014 are expected; of these, about 850,000 will be coded by two physicians to provide causes of death by gender, age, socioeconomic status, and geographical region. Pilot studies will evaluate the addition of physical and biological measurements, specifically dried blood spots. Preliminary results from over 35,000 deaths suggest that VA can ascertain the leading causes of death, reduce the misclassification of causes, and derive the probable underlying cause of death when it has not been reported. VA yields broad classification of the underlying causes in about 90% of deaths before age 70. In old age, however, the proportion of classifiable deaths is lower. By tracking underlying demographic denominators, the study permits quantification of absolute mortality rates. 
Household case-control, proportional

  18. Prospective Study of One Million Deaths in India: Rationale, Design, and Validation Results.

    Directory of Open Access Journals (Sweden)

    2005-12-01

    BACKGROUND: Over 75% of the annual estimated 9.5 million deaths in India occur in the home, and the large majority of these do not have a certified cause. India and other developing countries urgently need reliable quantification of the causes of death. They also need better epidemiological evidence about the relevance of physical (such as blood pressure and obesity), behavioral (such as smoking, alcohol, HIV-1 risk taking, and immunization history), and biological (such as blood lipids and gene polymorphisms) measurements to the development of disease in individuals or disease rates in populations. We report here on the rationale, design, and implementation of the world's largest prospective study of the causes and correlates of mortality. METHODS AND FINDINGS: We will monitor nearly 14 million people in 2.4 million nationally representative Indian households (6.3 million people in 1.1 million households in the 1998-2003 sample frame and 7.6 million people in 1.3 million households in the 2004-2014 sample frame) for vital status and, if dead, the causes of death through a well-validated verbal autopsy (VA) instrument. About 300,000 deaths from 1998-2003 and some 700,000 deaths from 2004-2014 are expected; of these, about 850,000 will be coded by two physicians to provide causes of death by gender, age, socioeconomic status, and geographical region. Pilot studies will evaluate the addition of physical and biological measurements, specifically dried blood spots. Preliminary results from over 35,000 deaths suggest that VA can ascertain the leading causes of death, reduce the misclassification of causes, and derive the probable underlying cause of death when it has not been reported. VA yields broad classification of the underlying causes in about 90% of deaths before age 70. In old age, however, the proportion of classifiable deaths is lower. By tracking underlying demographic denominators, the study permits quantification of absolute mortality rates.

  19. Validation of a fluid-structure interaction numerical model for predicting flow transients in arteries.

    Science.gov (United States)

    Kanyanta, V; Ivankovic, A; Karac, A

    2009-08-07

    Fluid-structure interaction (FSI) numerical models are now widely used in predicting blood flow transients, because of the importance of the interaction between the flowing blood and the deforming arterial wall to blood flow behaviour. Unfortunately, most of these FSI models lack rigorous validation and, thus, cannot guarantee the accuracy of their predictions. This paper presents the comprehensive validation of a two-way coupled FSI numerical model, developed to predict flow transients in compliant conduits such as arteries. The model is validated using analytical solutions and experiments conducted on a polyurethane mock artery. Flow parameters such as pressure and axial stress (and precursor) wave speeds, wall deformations and oscillating frequency, fluid velocity, and Poisson coupling effects were used as the basis of this validation. Results show very good agreement between numerical predictions, analytical solutions, and experimental data; the agreement between the three approaches is generally over 95%. The model also accurately predicts Poisson coupling effects in unsteady flows through flexible pipes, which up to now had only been predicted analytically. This numerical model can therefore accurately predict flow transients in compliant vessels such as arteries.
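    For context, the classical analytical benchmark for pressure wave speed in a compliant tube is the Moens-Korteweg relation; a sketch under the thin-wall assumption (the material values are typical of a polyurethane tube filled with water, not taken from the paper):

```python
import math

def moens_korteweg_wave_speed(E, h, rho, d):
    """Classical pressure wave speed c = sqrt(E*h / (rho*d)) in a
    thin-walled elastic tube. E: wall Young's modulus [Pa],
    h: wall thickness [m], rho: fluid density [kg/m^3],
    d: inner diameter [m]."""
    return math.sqrt(E * h / (rho * d))

# Illustrative values for a water-filled polyurethane tube:
c = moens_korteweg_wave_speed(E=25e6, h=1.0e-3, rho=1000.0, d=10.0e-3)
print(f"wave speed ≈ {c:.1f} m/s")  # → wave speed ≈ 50.0 m/s
```

    Analytical speeds of this kind are one of the yardsticks against which numerically predicted pressure wave speeds can be checked.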

  20. Development and Validation of a Materials Preparation Model from the Perspective of Transformative Pedagogy

    Directory of Open Access Journals (Sweden)

    Hamed Barjesteh

    2015-12-01

    This study is a report on the design, development, and validation of a model within the main tenets of critical pedagogy (CP), with a view to implementation in education in general and applied linguistics in particular. To develop a transformative L2 materials preparation (TLMP) model, the researchers drew on Crawford's (1978) principles of CP as a springboard. These principles provide the theoretical framework of the ELT program in general, but they need to be adapted to the specific features of L2 materials development. To this end, Nation and Macalister's (2010) model of materials development was utilized as the basis for different aspects of materials preparation. The newly developed model yielded 22 principles, which were validated through a stepwise process. It was administered among 110 participants in 15 cities of Iran. Exploratory and confirmatory factor analyses were performed. The results indicated a high level of internal consistency and satisfactory construct validity. The TLMP model could be useful for language policy makers, ELT professionals, and materials and curriculum developers. Keywords: critical pedagogy, materials development, transformative model, ELT community, development, validation
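    The internal-consistency check reported above is typically Cronbach's alpha over the questionnaire items; a minimal sketch of how such a coefficient is computed (the 3-item, 4-respondent scores are invented, not the study's data):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance
    of respondents' total scores). item_scores is a list of items, each
    a list of per-respondent scores."""
    k = len(item_scores)
    n = len(item_scores[0])

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - sum(sample_var(it) for it in item_scores)
                          / sample_var(totals))

# Invented Likert-scale responses: 3 items, 4 respondents each.
items = [[4, 5, 3, 5],
         [4, 4, 3, 5],
         [5, 5, 2, 4]]
print(round(cronbach_alpha(items), 2))  # → 0.85
```

    Values above about 0.7 are conventionally taken to indicate acceptable internal consistency.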

  1. Validation of coastal oceanographic models at Laxemar-Simpevarp. Site descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    Engqvist, Anders (A och I Engqvist Konsult HB, Vaxholm (SE)); Andrejev, Oleg (Finnish Inst. of Marine Research, Helsinki (FI))

    2008-12-15

    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterization at two different locations, the Forsmark and the Laxemar-Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. The characterization work is divided into an initial site investigation phase and a complete site investigation phase. In this context, the water exchange of the coastal zone is one link of the chain of possible nuclide transport mechanisms that must be assessed in the site description of potential repository areas. For the purpose of validating the pair of nested 3D-models and the coupled discrete basin (CDB-) model employed to simulate the water exchange in the near-shore coastal zone in the Laxemar-Simpevarp area, an encompassing measurement program entailing data from six stations (of which two are close) has been performed. The design of this program was to first assess to what degree the forcing of the fine resolution (FR-) model of the Laxemar-Simpevarp study area at its interfacial boundary to the coarse resolution (CR-) model of the entire Baltic was reproduced. In addition to this, it is of particular interest how the time-varying density-determining properties, salinity and temperature, at the borders are propagated into the FR-domain and further influence the water exchange with the interior, more secluded, basins. An important part of the validation process has been to carefully evaluate which measurement data can be considered reliable. The result was that some periods of foremost near-surface salinity data had to be discarded due to growth of algae on the conductivity sensors. Interference with ship traffic and lack of absolute calibration of the salinity meters necessitated dismissal of measurement data too. In this study, so-called Mesan data have been consistently used for the meteorological forcing of the 3D-models. Relative to the assessed data that can be accepted as adequate, the outcome of the

  2. Validation of coastal oceanographic models at Laxemar-Simpevarp. Site descriptive modelling SDM-Site Laxemar

    International Nuclear Information System (INIS)

    Engqvist, Anders; Andrejev, Oleg

    2008-12-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterization at two different locations, the Forsmark and the Laxemar-Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. The characterization work is divided into an initial site investigation phase and a complete site investigation phase. In this context, the water exchange of the coastal zone is one link of the chain of possible nuclide transport mechanisms that must be assessed in the site description of potential repository areas. For the purpose of validating the pair of nested 3D-models and the coupled discrete basin (CDB-) model employed to simulate the water exchange in the near-shore coastal zone in the Laxemar-Simpevarp area, an encompassing measurement program entailing data from six stations (of which two are close) has been performed. The design of this program was to first assess to what degree the forcing of the fine resolution (FR-) model of the Laxemar-Simpevarp study area at its interfacial boundary to the coarse resolution (CR-) model of the entire Baltic was reproduced. In addition to this, it is of particular interest how the time-varying density-determining properties, salinity and temperature, at the borders are propagated into the FR-domain and further influence the water exchange with the interior, more secluded, basins. An important part of the validation process has been to carefully evaluate which measurement data can be considered reliable. The result was that some periods of foremost near-surface salinity data had to be discarded due to growth of algae on the conductivity sensors. Interference with ship traffic and lack of absolute calibration of the salinity meters necessitated dismissal of measurement data too. In this study, so-called Mesan data have been consistently used for the meteorological forcing of the 3D-models. Relative to the assessed data that can be accepted as adequate, the outcome of the

  3. Development and validation of a habitat suitability model for ...

    Science.gov (United States)

    We developed a spatially-explicit, flexible 3-parameter habitat suitability model that can be used to identify and predict areas at higher risk for non-native dwarf eelgrass (Zostera japonica) invasion. The model uses simple environmental parameters (depth, nearshore slope, and salinity) to quantitatively describe habitat suitable for Z. japonica invasion based on ecology and physiology from the primary literature. Habitat suitability is defined with values ranging from zero to one, where one denotes areas most conducive to Z. japonica and zero denotes areas not likely to support Z. japonica growth. The model was applied to Yaquina Bay, Oregon, USA, an area that has well documented Z. japonica expansion over the last two decades. The highest suitability values for Z. japonica occurred in the mid to upper portions of the intertidal zone, with larger expanses occurring in the lower estuary. While the upper estuary did contain suitable habitat, most areas were not as large as in the lower estuary, due to inappropriate depth, a steeply sloping intertidal zone, and lower salinity. The lowest suitability values occurred below the lower intertidal zone, within the Yaquina River channel. The model was validated by comparison to a multi-year time series of Z. japonica maps, revealing a strong predictive capacity. Sensitivity analysis performed to evaluate the contribution of each parameter to the model prediction revealed that depth was the most important factor. Sh
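    One common way to combine such environmental parameters into a single 0-1 index is to score each variable separately and multiply the scores, so that any unsuitable parameter drives the combined value to zero; a hypothetical sketch of that pattern (the thresholds below are invented for illustration, not the paper's calibrated values):

```python
def ramp_score(value, lo, hi):
    """Linear 0-1 suitability ramp: 0 at or below lo, 1 at or above hi."""
    if value <= lo:
        return 0.0
    if value >= hi:
        return 1.0
    return (value - lo) / (hi - lo)

def habitat_suitability(depth_m, slope_pct, salinity_psu):
    """Multiplicative suitability index in [0, 1] from three parameters."""
    # Mid-intertidal depths score highest (shallower than ~3 m, deeper than 0 m):
    s_depth = ramp_score(depth_m, 0.0, 1.0) * ramp_score(3.0 - depth_m, 0.0, 1.0)
    s_slope = ramp_score(5.0 - slope_pct, 0.0, 2.0)    # gentler slopes score higher
    s_salinity = ramp_score(salinity_psu, 10.0, 20.0)  # brackish-to-marine preferred
    return s_depth * s_slope * s_salinity

print(habitat_suitability(depth_m=1.5, slope_pct=1.0, salinity_psu=30.0))  # → 1.0
```

    The multiplicative form matches the behaviour described above: low salinity or a steeply sloping intertidal zone suppresses suitability even where depth is appropriate.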

  4. Results and validity of renal blood flow measurements using Xenon 133

    International Nuclear Information System (INIS)

    Serres, P.; Danet, B.; Guiraud, R.; Durand, D.; Ader, J.L.

    1975-01-01

    The renal blood flow was measured by external recording of the xenon-133 excretion curve. The study involved 45 patients with permanent high blood pressure and 7 transplant patients. The validity of the method was checked on 10 dogs. From the results, it seems that the cortical blood flow, its fraction, and the mean flow rate are the parameters most representative of renal haemodynamics, from which the repercussions of high blood pressure on kidney vascularisation may be established. Experiments are in progress on animals to check the compartment concept by comparing injections into the renal artery and into various kidney tissues in situ [fr

  5. The ASCAT soil moisture product. A Review of its specifications, validation results, and emerging applications

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, Wolfgang; Hahn, Sebastian; Kidd, Richard [Vienna Univ. of Technology (Austria). Dept. of Geodesy and Geoinformation] [and others]

    2013-02-15

    To provide a comprehensive overview of the major characteristics and caveats of the ASCAT soil moisture product, this paper describes the ASCAT instrument and the soil moisture processor and near-real-time distribution service implemented by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). A review of the most recent validation studies shows that the quality of the ASCAT soil moisture product is - with the exception of arid environments - comparable to, and over some regions (e.g. Europe) even better than, currently available soil moisture data derived from passive microwave sensors. Further, a review of application studies shows that the use of the ASCAT soil moisture product is particularly advanced in the fields of numerical weather prediction and hydrologic modelling, and initial progress can also be noted in other application areas such as yield monitoring, epidemiologic modelling, and societal risk assessment. Considering the generally positive evaluation results, it is expected that the ASCAT soil moisture product will increasingly be used by a growing number of rather diverse land applications. (orig.)

  6. Developing and investigating validity of a knowledge management game simulation model

    NARCIS (Netherlands)

    Tsjernikova, Irina

    2009-01-01

    The goals of this research project were to develop a game simulation model which supports learning knowledge management in a game environment and to investigate the validity of that model. The validity of the model is approached from two perspectives: educational validity and representational

  7. Revisiting Runoff Model Calibration: Airborne Snow Observatory Results Allow Improved Modeling Results

    Science.gov (United States)

    McGurk, B. J.; Painter, T. H.

    2014-12-01

    Deterministic snow accumulation and ablation simulation models are widely used by runoff managers throughout the world to predict runoff quantities and timing. Model fitting is typically based on matching modeled runoff volumes and timing with observed flow time series at a few points in the basin. In recent decades, sparse networks of point measurements of mountain snowpacks have been available to compare with modeled snowpack, but the comparability of results from a snow sensor or course to model polygons of 5 to 50 sq. km is suspect. However, snowpack extent, depth, and derived snow water equivalent have been produced by the NASA/JPL Airborne Snow Observatory (ASO) mission for spring of 2013 and 2014 in the Tuolumne River basin above Hetch Hetchy Reservoir. These high-resolution snowpack data have exposed the weakness of a model calibration based on runoff alone. The U.S. Geological Survey's Precipitation Runoff Modeling System (PRMS) calibration, based on 30 years of inflow to Hetch Hetchy, produces reasonable inflow results, but modeled spatial snowpack location and water quantity diverged significantly from the weekly measurements made by ASO during the two ablation seasons. The reason is that the PRMS model has many flow paths, storages, and water transfer equations, and a calibrated outflow time series can be right for many wrong reasons. The addition of detailed knowledge of snow extent and water content constrains the model so that it is a better representation of the actual watershed hydrology. The mechanics of recalibrating PRMS to the ASO measurements will be described, and comparisons of observed versus modeled flow for both a small subbasin and the entire Hetch Hetchy basin will be shown. The recalibrated model provided a better fit to the snowmelt recession, a key factor for water managers as they balance declining inflows with demand for power generation and ecosystem releases during the final months of snowmelt runoff.
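    Conceptually, such a recalibration amounts to augmenting a runoff-only objective with a spatial snowpack error term; a hypothetical sketch of a weighted objective of that kind (the weight and normalization scale are invented, not PRMS or ASO values):

```python
def calibration_objective(nse_runoff, swe_rmse_mm, swe_scale_mm=200.0, w_snow=0.5):
    """Combined score to maximize: runoff Nash-Sutcliffe efficiency blended
    with a snow-water-equivalent error term normalized against ASO-style
    spatial observations. A perfect model scores 1.0."""
    swe_penalty = min(swe_rmse_mm / swe_scale_mm, 1.0)  # clamp to [0, 1]
    return (1.0 - w_snow) * nse_runoff + w_snow * (1.0 - swe_penalty)

# A run that fits runoff well but misplaces snow scores lower than a run
# with balanced skill in both runoff and snowpack:
print(calibration_objective(nse_runoff=0.95, swe_rmse_mm=180.0))
print(calibration_objective(nse_runoff=0.85, swe_rmse_mm=50.0))
```

    This is the sense in which a calibrated outflow series can be "right for many wrong reasons": the runoff-only term cannot distinguish the two runs, while the snow term can.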

  8. Immersive visualization of dynamic CFD model results

    International Nuclear Information System (INIS)

    Comparato, J.R.; Ringel, K.L.; Heath, D.J.

    2004-01-01

    With immersive visualization the engineer has the means for vividly understanding problem causes and discovering opportunities to improve design. Software can generate an interactive world in which collaborators experience the results of complex mathematical simulations such as computational fluid dynamic (CFD) modeling. Such software, while providing unique benefits over traditional visualization techniques, presents special development challenges. The visualization of large quantities of data interactively requires both significant computational power and shrewd data management. On the computational front, commodity hardware is outperforming large workstations in graphical quality and frame rates. Also, 64-bit commodity computing shows promise in enabling interactive visualization of large datasets. Initial interactive transient visualization methods and examples are presented, as well as development trends in commodity hardware and clustering. Interactive, immersive visualization relies on relevant data being stored in active memory for fast response to user requests. For large or transient datasets, data management becomes a key issue. Techniques for dynamic data loading and data reduction are presented as means to increase visualization performance. (author)

  9. Validation of Nonlinear Bipolar Transistor Model by Small-Signal Measurements

    DEFF Research Database (Denmark)

    Vidkjær, Jens; Porra, V.; Zhu, J.

    1992-01-01

    A new method for the validity analysis of nonlinear transistor models is presented, based on DC and small-signal S-parameter measurements and realistic consideration of the measurement and de-embedding errors and singularities of the small-signal equivalent circuit. As an example, some analysis results for an extended Gummel-Poon model are presented in the case of a UHF bipolar power transistor.

  10. Release Fixed Heel Point (FHP) Accommodation Model Verification and Validation (V and V) Plan - Rev A

    Science.gov (United States)

    2017-01-23

    including the effects of body size, vehicle layout, and Soldier protective equipment and gear. The boundaries defined include the required space and...90% of Soldier population, 85% male). The model can guide vehicle designers in creating an optimized work space for the occupant. The CAD...posture metrics observed in vehicles. The results of the validation testing may lead to the FHP accommodation model being adjusted to address

  11. Water balance at an arid site: a model validation study of bare soil evaporation

    Energy Technology Data Exchange (ETDEWEB)

    Jones, T.L.; Campbell, G.S.; Gee, G.W.

    1984-03-01

    This report contains results of model validation studies conducted by Pacific Northwest Laboratory (PNL) for the Department of Energy's (DOE) National Low Level Waste Management Program (NLLWMP). The model validation tests consisted of using unsaturated water flow models to simulate water balance experiments conducted at the Buried Waste Test Facility (BWTF) located at the Department of Energy's Hanford site, near Richland, Washington. The BWTF is a lysimeter facility designed to collect field data on long-term water balance and radionuclide tracer movement. It has been operated by PNL for the NLLWMP since 1978. An experimental test case, developed from data collected at the BWTF, was used to evaluate predictions from different water flow models. The major focus of the validation study was to evaluate how the use of different evaporation models affected the accuracy of predictions of evaporation, storage, and drainage made by the whole model. Four evaporation models were tested including two empirical models and two mechanistic models. The empirical models estimate actual evaporation from potential evaporation; the mechanistic models describe water vapor diffusion within the soil profile and between the soil and the atmosphere in terms of fundamental soil properties, and transport processes. The water flow models that included the diffusion-type evaporation submodels performed best overall. The empirical models performed poorly in their description of evaporation and profile water storage during summer months. The predictions of drainage were supported quite well by the experimental data. This indicates that the method used to estimate hydraulic conductivity needed for the Darcian submodel was adequate. This important result supports recommendations for these procedures that were made previously based on laboratory results.
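    The empirical class of evaporation submodels mentioned above typically scales potential evaporation by a soil-moisture availability factor; a generic sketch of that idea (the linear-threshold form and values are illustrative, not the PNL models):

```python
def actual_evaporation(e_potential_mm, theta, theta_wilt=0.05, theta_crit=0.25):
    """Empirical actual evaporation: the potential rate scaled linearly by
    water availability between the wilting point (theta_wilt) and a
    critical volumetric moisture content (theta_crit)."""
    if theta <= theta_wilt:
        return 0.0
    if theta >= theta_crit:
        return e_potential_mm
    return e_potential_mm * (theta - theta_wilt) / (theta_crit - theta_wilt)

# 5 mm/day potential evaporation at mid-range soil moisture:
print(round(actual_evaporation(5.0, theta=0.15), 2))  # → 2.5
```

    Mechanistic submodels instead resolve vapor diffusion within the profile, which is why they track summer drying better than this kind of scaling, as the study found.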

  12. Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches

    Science.gov (United States)

    Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia

    2017-10-01

    With the increasing level of complexity and automation in the area of automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to account for tyre temperature and the resulting changes in tyre characteristics has risen significantly. Therefore, experimental validation of three different temperature model approaches is carried out, discussed, and compared in the scope of this article. To evaluate the range of application of the presented approaches with respect to further implementation in semi-physical tyre models, the main focus lies on a physical parameterisation. Aside from good modelling accuracy, focus is also placed on computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements from a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.
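    The simplest of such temperature model approaches is a lumped first-order thermal balance, where friction power heats the tread and speed-dependent convection cools it; a generic sketch (all coefficients are invented, not fitted to the Hoosier tyre measurements):

```python
def step_tyre_temperature(T, friction_power_w, v_ms, dt,
                          T_ambient=25.0, m_c=8000.0, h0=15.0, h1=6.0):
    """One explicit-Euler step of a lumped tyre thermal model.
    m_c: lumped heat capacity [J/K]; h0 + h1*v: convection [W/K]
    growing with vehicle speed v [m/s]."""
    h = h0 + h1 * v_ms
    dTdt = (friction_power_w - h * (T - T_ambient)) / m_c
    return T + dTdt * dt

# Warm-up from ambient under constant sliding power at 20 m/s:
T = 25.0
for _ in range(600):  # 60 s at dt = 0.1 s
    T = step_tyre_temperature(T, friction_power_w=3000.0, v_ms=20.0, dt=0.1)
print(f"tread temperature after 60 s ≈ {T:.1f} °C")
```

    With these made-up coefficients the model approaches a steady state of about 47 °C; only the structure, not the numbers, mirrors the parameterisation task described above.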

  13. Validation of PV-RPM Code in the System Advisor Model.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Geoffrey Taylor [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
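    The core of such a reliability-performance algorithm can be sketched as Monte Carlo sampling of component time-to-failure and time-to-repair distributions over the simulation horizon; a simplified illustration (exponential distributions and the MTBF/MTTR values are placeholders, not the SNL-fitted distributions):

```python
import random

def simulate_component(years, mtbf_years, mttr_days, rng):
    """Alternate failure/repair cycles for one component over the
    simulation period; returns (number_of_failures, downtime_days)."""
    t, failures, downtime = 0.0, 0, 0.0
    while t < years:
        t += rng.expovariate(1.0 / mtbf_years)  # draw time to next failure
        if t >= years:
            break
        failures += 1
        repair_days = rng.expovariate(1.0 / mttr_days)
        downtime += repair_days
        t += repair_days / 365.0                # outage advances the clock
    return failures, downtime

rng = random.Random(42)
# 1000 realizations of one inverter over a 25-year simulation period:
runs = [simulate_component(25.0, mtbf_years=8.0, mttr_days=14.0, rng=rng)
        for _ in range(1000)]
mean_failures = sum(f for f, _ in runs) / len(runs)
print(f"mean failures per 25 y ≈ {mean_failures:.2f}")
```

    Comparing such sampled failure counts between two implementations of the same distributions is the essence of the validation described above.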

  14. Uncertainty in Earth System Models: Benchmarks for Ocean Model Performance and Validation

    Science.gov (United States)

    Ogunro, O. O.; Elliott, S.; Collier, N.; Wingenter, O. W.; Deal, C.; Fu, W.; Hoffman, F. M.

    2017-12-01

    The mean ocean CO2 sink is a major component of the global carbon budget, with marine reservoirs holding about fifty times more carbon than the atmosphere. Phytoplankton play a significant role in the net carbon sink through photosynthesis and drawdown, such that about a quarter of anthropogenic CO2 emissions end up in the ocean. Biology greatly increases the efficiency of marine environments in CO2 uptake and ultimately reduces the impact of the persistent rise in atmospheric concentrations. However, a number of challenges remain in appropriate representation of marine biogeochemical processes in Earth System Models (ESM). These threaten to undermine the community effort to quantify seasonal to multidecadal variability in ocean uptake of atmospheric CO2. In a bid to improve analyses of marine contributions to climate-carbon cycle feedbacks, we have developed new analysis methods and biogeochemistry metrics as part of the International Ocean Model Benchmarking (IOMB) effort. Our intent is to meet the growing diagnostic and benchmarking needs of ocean biogeochemistry models. The resulting software package has been employed to validate DOE ocean biogeochemistry results by comparison with observational datasets. Several other international ocean models contributing results to the fifth phase of the Coupled Model Intercomparison Project (CMIP5) were analyzed simultaneously. Our comparisons suggest that the biogeochemical processes determining CO2 entry into the global ocean are not well represented in most ESMs. Polar regions continue to show notable biases in many critical biogeochemical and physical oceanographic variables. Some of these disparities could have first order impacts on the conversion of atmospheric CO2 to organic carbon. In addition, single forcing simulations show that the current ocean state can be partly explained by the uptake of anthropogenic emissions. 
Combined effects of two or more of these forcings on ocean biogeochemical cycles and ecosystems
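
The benchmarking approach described above scores model output against observational datasets. As a rough illustration (not the actual IOMB metric definitions), a bias score and an RMSE score can each be normalised by the observed variability and mapped onto (0, 1], with 1 meaning a perfect match; all station values below are invented:

```python
import numpy as np

def benchmark_scores(model, obs):
    """Score a model field against observations (IOMB-style sketch).

    Returns a bias score and an RMSE score mapped onto (0, 1], where 1
    is a perfect match. The exponential scoring form is an assumption
    for illustration, not the IOMB definition."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    scale = np.std(obs)                 # normalise by observed variability
    bias_score = np.exp(-abs(bias) / scale)
    rmse_score = np.exp(-rmse / scale)
    return bias_score, rmse_score

# Invented surface pCO2 values (uatm) at four stations
obs = np.array([360.0, 370.0, 385.0, 400.0])
mod = np.array([355.0, 372.0, 390.0, 395.0])
b, r = benchmark_scores(mod, obs)
print(round(b, 3), round(r, 3))
```

The exponential map keeps scores bounded and comparable across variables with very different units, which is the point of a multi-model benchmarking package.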

  15. Linkage of PRA models. Phase 1, Results

    Energy Technology Data Exchange (ETDEWEB)

    Smith, C.L.; Knudsen, J.K.; Kelly, D.L.

    1995-12-01

    The goal of the Phase I work of the ``Linkage of PRA Models`` project was to postulate methods of providing guidance for US Nuclear Regulatory Commission (NRC) personnel on the selection and usage of probabilistic risk assessment (PRA) models that are best suited to the analysis they are performing. In particular, methods and associated features are provided for (a) the selection of an appropriate PRA model for a particular analysis, (b) complementary evaluation tools for the analysis, and (c) a PRA model cross-referencing method. As part of this work, three areas related to ``linking`` analyses to PRA models were investigated: (a) the PRA models that are currently available, (b) the various types of analyses that are performed within the NRC, and (c) the difficulty of trying to provide a ``generic`` classification scheme to group plants based upon a particular plant attribute.

  16. Engineering Glass Passivation Layers -Model Results

    Energy Technology Data Exchange (ETDEWEB)

    Skorski, Daniel C.; Ryan, Joseph V.; Strachan, Denis M.; Lepry, William C.

    2011-08-08

    The immobilization of radioactive waste into glass waste forms is a baseline process of nuclear waste management not only in the United States, but worldwide. The rate of radionuclide release from these glasses is a critical measure of the quality of the waste form. Over long-term tests and using extrapolations of ancient analogues, it has been shown that well designed glasses exhibit a dissolution rate that quickly decreases to a slow residual rate for the lifetime of the glass. The mechanistic cause of this decreased corrosion rate is a subject of debate, with one of the major theories suggesting that the decrease is caused by the formation of corrosion products in such a manner as to present a diffusion barrier on the surface of the glass. Although there is much evidence of this type of mechanism, there has been no attempt to engineer the effect to maximize the passivating qualities of the corrosion products. This study represents the first attempt to engineer the creation of passivating phases on the surface of glasses. Our approach utilizes interactions between the dissolving glass and elements from the disposal environment to create impermeable capping layers. By drawing from other corrosion studies in areas where passivation layers have been successfully engineered to protect the bulk material, we present here a report on mineral phases that are likely to have a morphological tendency to encrust the surface of the glass. Our modeling has focused on using the AFCI glass system in a carbonate, sulfate, and phosphate rich environment. We evaluate the minerals predicted to form to determine the likelihood of the formation of a protective layer on the surface of the glass. We have also modeled individual ions in solutions vs. pH and the addition of aluminum and silicon. These results allow us to understand the pH and ion concentration dependence of mineral formation.
We have determined that iron minerals are likely to form a complete incrustation layer and we plan
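
Geochemical models of this kind decide whether a candidate corrosion product can precipitate by comparing its ion activity product (IAP) with its solubility product (Ksp) through the saturation index, SI = log10(IAP/Ksp). A minimal sketch, with invented activity and solubility values rather than data from the study:

```python
import math

def saturation_index(iap, ksp):
    """SI = log10(IAP / Ksp): SI > 0 means the solution is supersaturated
    and the phase may precipitate; SI < 0 means it tends to dissolve."""
    return math.log10(iap / ksp)

# Invented activity/solubility products for two candidate iron phases
minerals = {
    "goethite": {"iap": 1e-38, "ksp": 1e-41},
    "siderite": {"iap": 1e-12, "ksp": 10 ** -10.5},
}
for name, m in minerals.items():
    si = saturation_index(m["iap"], m["ksp"])
    state = "may precipitate" if si > 0 else "undersaturated"
    print(f"{name}: SI = {si:+.1f} ({state})")
```

A positive SI is necessary but not sufficient for a protective layer; morphology and coverage, as discussed above, decide whether the phase actually passivates.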

  17. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    International Nuclear Information System (INIS)

    Weathers, J.B.; Luck, R.; Weathers, J.W.

    2009-01-01

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
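
A minimal numpy sketch of the Monte Carlo step described above: systematic uncertainties are drawn once per trial and shared across the quantities of interest (which is what produces off-diagonal covariance), while random uncertainties are drawn independently. The magnitudes are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Comparison error E = S - D for three quantities of interest. The
# systematic (bias) contribution is drawn once per trial and shared
# across quantities; the random contribution is independent per
# quantity. Standard uncertainties below are illustrative.
n_mc, n_q = 20000, 3
b_sys = np.array([0.5, 0.5, 0.5])    # systematic standard uncertainties
s_rnd = np.array([0.2, 0.3, 0.4])    # random standard uncertainties

sys_draw = rng.normal(0.0, 1.0, n_mc)[:, None] * b_sys   # fully correlated
rnd_draw = rng.normal(0.0, 1.0, (n_mc, n_q)) * s_rnd     # independent
E = sys_draw + rnd_draw

cov = np.cov(E, rowvar=False)        # Monte Carlo covariance estimate
u95 = 1.96 * np.sqrt(np.diag(cov))   # univariate 95% noise levels
print(np.round(cov, 2))
print(np.round(u95, 2))
```

The univariate bounds use only the diagonal; the multivariate confidence contours use the full matrix, which is exactly where the correlation induced by shared systematic errors enters.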

  18. Accounting for treatment use when validating a prognostic model: a simulation study

    Directory of Open Access Journals (Sweden)

    Romin Pajouheshnia

    2017-07-01

    Full Text Available Abstract Background Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance and present the uses and limitations of available analytical methods to account for this using simulated data. Methods We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and the use of inverse probability weighting (IPW on the estimated model discrimination (c-index and calibration (observed:expected ratio and calibration plots in scenarios with different patterns and effects of treatment use. Results Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk when the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. Conclusions When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals, and
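
The IPW-then-exclude strategy can be sketched in a few lines of numpy on simulated data. The risk model, treatment-assignment rule, and treatment effect below are all invented for illustration; they are not the simulation scenarios of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50000

# Simulated validation set (illustrative numbers): the model predicts
# untreated risk; treatment is more likely at higher risk and lowers
# risk by 40%.
p_model = rng.uniform(0.05, 0.60, n)       # predicted untreated risk
p_treat = 0.2 + 0.5 * p_model              # P(treated | risk)
treated = rng.random(n) < p_treat
true_risk = np.where(treated, 0.6 * p_model, p_model)
outcome = rng.random(n) < true_risk

# Naive observed:expected ratio on everyone: effective treatment makes
# a well-calibrated model look like it overestimates risk.
oe_naive = outcome.mean() / p_model.mean()

# IPW then exclude the treated: weight each untreated individual by
# 1 / P(untreated | risk) to reconstruct the untreated target population.
w = 1.0 / (1.0 - p_treat)
keep = ~treated
oe_ipw = (np.average(outcome[keep], weights=w[keep])
          / np.average(p_model[keep], weights=w[keep]))
print(round(oe_naive, 2), round(oe_ipw, 2))
```

The naive ratio drops below 1 (apparent overestimation), while the weighted untreated subset recovers a ratio near 1, matching the paper's conclusion for moderately risk-dependent treatment when IPW's assumptions hold.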

  19. System-Level Validation: High-Level Modeling and Directed Test Generation Techniques

    CERN Document Server

    Chen, Mingsong; Koo, Heon-Mo; Mishra, Prabhat

    2013-01-01

    This book covers state-of-the-art techniques for high-level modeling and validation of complex hardware/software systems, including those with multicore architectures. Readers will learn to avoid time-consuming and error-prone validation from the comprehensive coverage of system-level validation, including high-level modeling of designs and faults, automated generation of directed tests, and efficient validation methodology using directed tests and assertions. The methodologies described in this book will help designers to improve the quality of their validation, performing as much validation as possible in the early stages of the design, while reducing the overall validation effort and cost.

  20. Validation Testing of a Peridynamic Impact Damage Model Using NASA's Micro-Particle Gun

    Science.gov (United States)

    Baber, Forrest E.; Zelinski, Brian J.; Guven, Ibrahim; Gray, Perry

    2017-01-01

    Through a collaborative effort between the Virginia Commonwealth University and Raytheon, a peridynamic model for sand impact damage has been developed [1-3]. Model development has focused on simulating impacts of sand particles on ZnS traveling at velocities consistent with aircraft take-off and landing speeds. The model reproduces common features of impact damage including pit and radial cracks, and, under some conditions, lateral cracks. This study focuses on a preliminary validation exercise in which simulation results from the peridynamic model are compared to a limited experimental data set generated by NASA's recently developed micro-particle gun (MPG). The MPG facility measures the dimensions and incoming and rebound velocities of the impact particles. It also links each particle to a specific impact site and its associated damage. In this validation exercise parameters of the peridynamic model are adjusted to fit the experimentally observed pit diameter, average length of radial cracks and rebound velocities for 4 impacts of 300 µm glass beads on ZnS. Results indicate that a reasonable fit of these impact characteristics can be obtained by suitable adjustment of the peridynamic input parameters, demonstrating that the MPG can be used effectively as a validation tool for impact modeling and that the peridynamic sand impact model described herein possesses not only a qualitative but also a quantitative ability to simulate sand impact events.

  1. A virtual source model for Kilo-voltage cone beam CT: Source characteristics and model validation

    International Nuclear Information System (INIS)

    Spezi, E.; Volken, W.; Frei, D.; Fix, M. K.

    2011-01-01

    Purpose: The purpose of this investigation was to study the source characteristics of a clinical kilo-voltage cone beam CT unit and to develop and validate a virtual source model that could be used for treatment planning purposes. Methods: We used a previously commissioned full Monte Carlo model and new bespoke software to study the source characteristics of a clinical kilo-voltage cone beam CT (CBCT) unit. We identified the main particle sources, their spatial, energy and angular distribution for all the image acquisition presets currently used in our clinical practice. This includes a combination of two energies (100 and 120 kVp), two filters (neutral and bowtie), and eight different x-ray beam apertures. We subsequently built a virtual source model which we validated against full Monte Carlo calculations. Results: We found that the radiation output of the clinical kilo-voltage cone beam CT unit investigated in this study could be reproduced with a virtual model comprising two sources (target and filtration cone) or three sources (target, filtration cone and bowtie filter) when additional filtration was used. With this model, we accounted for more than 97% of the photons exiting the unit. Each source in our model was characterised by an origin distribution in both X and Y directions, a fluence map, a single energy spectrum for unfiltered beams and a two dimensional energy spectrum for bowtie filtered beams. The percentage dose difference between full Monte Carlo and virtual source model based dose distributions was well within the statistical uncertainty associated with the calculations (±2%, one standard deviation) in all cases studied. Conclusions: The virtual source that we developed is accurate in calculating the dose delivered from a commercial kilo-voltage cone beam CT unit operating with routine clinical image acquisition settings. Our data have also shown that target, filtration cone, and bowtie filter sources needed to be all included in the model

  2. Validity of the Neuromuscular Recovery Scale: a measurement model approach.

    Science.gov (United States)

    Velozo, Craig; Moorhouse, Michael; Ardolino, Elizabeth; Lorenz, Doug; Suter, Sarah; Basso, D Michele; Behrman, Andrea L

    2015-08-01

    To determine how well the Neuromuscular Recovery Scale (NRS) items fit the Rasch, 1-parameter, partial-credit measurement model. Confirmatory factor analysis (CFA) and principal components analysis (PCA) of residuals were used to determine dimensionality. The Rasch, 1-parameter, partial-credit rating scale model was used to determine rating scale structure, person/item fit, point-measure item correlations, item discrimination, and measurement precision. Seven NeuroRecovery Network clinical sites. Outpatients (N=188) with spinal cord injury. Not applicable. NRS. While the NRS met 1 of 3 CFA criteria, the PCA revealed that the Rasch measurement dimension explained 76.9% of the variance. Ten of 11 items and 91% of the patients fit the Rasch model, with 9 of 11 items showing high discrimination. Sixty-nine percent of the ratings met criteria. The items showed a logical item-difficulty order, with Stand retraining as the easiest item and Walking as the most challenging item. The NRS showed no ceiling or floor effects and separated the sample into almost 5 statistically distinct strata; individuals with an American Spinal Injury Association Impairment Scale (AIS) D classification showed the most ability, and those with an AIS A classification showed the least ability. Items not meeting the rating scale criteria appear to be related to the low frequency counts. The NRS met many of the Rasch model criteria for construct validity. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
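
The Rasch partial credit model underlying this analysis gives the probability of each rating category from a person's ability and the item's step difficulties. A minimal sketch (the ability and step values below are invented, not NRS estimates):

```python
import numpy as np

def pcm_probs(theta, deltas):
    """Rasch partial credit model: probabilities of scoring 0..m on an
    item with step difficulties `deltas` for a person of ability `theta`.
    P(x) is proportional to exp(sum_{k<=x}(theta - delta_k)), with the
    empty sum for x = 0 equal to zero."""
    steps = np.concatenate(([0.0], theta - np.asarray(deltas, float)))
    logits = np.cumsum(steps)
    p = np.exp(logits - logits.max())    # stabilised softmax
    return p / p.sum()

# A three-category item (two steps); this person sits above both steps
probs = pcm_probs(theta=1.0, deltas=[-0.5, 0.8])
print(np.round(probs, 3))
```

Fitting shifts theta and the deltas until observed and expected category frequencies agree; item fit statistics of the kind reported above compare those expectations to the data.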

  3. Using of Structural Equation Modeling Techniques in Cognitive Levels Validation

    Directory of Open Access Journals (Sweden)

    Natalija Curkovic

    2012-10-01

    Full Text Available When constructing knowledge tests, cognitive level is usually one of the dimensions comprising the test specifications with each item assigned to measure a particular level. Recently used taxonomies of the cognitive levels most often represent some modification of the original Bloom’s taxonomy. There are many concerns in current literature about existence of predefined cognitive levels. The aim of this article is to investigate whether structural equation modeling techniques can confirm the existence of different cognitive levels. For the purpose of the research, a Croatian final high-school Mathematics exam was used (N = 9626. Confirmatory factor analysis and structural regression modeling were used to test three different models. Structural equation modeling techniques did not support the existence of different cognitive levels in this case. There is more than one possible explanation for that finding. Some other techniques that take into account nonlinear behaviour of the items as well as qualitative techniques might be more useful for the purpose of the cognitive levels validation. Furthermore, it seems that cognitive levels were not efficient descriptors of the items and so improvements are needed in describing the cognitive skills measured by items.

  4. Modelling and Validating a Deoiling Hydrocyclone for Fault Diagnosis using Multilevel Flow Modeling

    DEFF Research Database (Denmark)

    Nielsen, Emil Krabbe; Bram, Mads Valentin; Frutiger, Jerome

    Decision support systems are a key focus in research on developing control rooms to aid operators in making reliable decisions, and reducing incidents caused by human errors. For this purpose, models of complex systems can be developed to diagnose causes or consequences for specific alarms. Models a...... experiments are used for validation of two simple Multilevel Flow Modeling models of a deoiling hydrocyclone, used for water and oil separation....

  5. Computer-aided test selection and result validation-opportunities and pitfalls

    DEFF Research Database (Denmark)

    McNair, P; Brender, J; Talmon, J

    1998-01-01

    Dynamic test scheduling is concerned with pre-analytical preprocessing of the individual samples within a clinical laboratory production by means of decision algorithms. The purpose of such scheduling is to provide maximal information with minimal data production (to avoid data pollution and/or to increase cost-efficiency). Our experience shows that there is a practical limit to the extent of exploitation of the principle of dynamic test scheduling, unless it is automated in one way or the other. This paper analyses some issues of concern related to the profession of clinical biochemistry, when implementing such dynamic test scheduling within a Laboratory Information System (and/or an advanced analytical workstation). The challenge is related to 1) generation of appropriately validated decision models, and 2) mastering consequences of analytical imprecision and bias.

  6. Planck intermediate results: IV. the XMM-Newton validation programme for new Planck galaxy clusters

    DEFF Research Database (Denmark)

    Bartlett, J.G.; Delabrouille, J.; Ganga, K.

    2013-01-01

    We present the final results from the XMM-Newton validation follow-up of new Planck galaxy cluster candidates. We observed 15 new candidates, detected with signal-to-noise ratios between 4.0 and 6.1 in the 15.5-month nominal Planck survey. The candidates were selected using ancillary data flags...... derived from the ROSAT All Sky Survey (RASS) and Digitized Sky Survey all-sky maps, with the aim of pushing into the low SZ flux, high-z regime and testing RASS flags as indicators of candidate reliability. Fourteen new clusters were detected by XMM-Newton, ten single clusters and two double systems....... Redshifts from X-ray spectroscopy lie in the range 0.2 to 0.9, with six clusters at z > 0.5. Estimated masses (M500) range from 2.5 × 10¹⁴ to 8 × 10¹⁴ M⊙. We discuss our results in the context of the full XMM-Newton validation programme, in which 51 new clusters have been detected. This includes four

  7. A model to predict element redistribution in unsaturated soil: Its simplification and validation

    International Nuclear Information System (INIS)

    Sheppard, M.I.; Stephens, M.E.; Davis, P.A.; Wojciechowski, L.

    1991-01-01

    A research model has been developed to predict the long-term fate of contaminants entering unsaturated soil at the surface through irrigation or atmospheric deposition, and/or at the water table through groundwater. The model, called SCEMR1 (Soil Chemical Exchange and Migration of Radionuclides, Version 1), uses Darcy's law to model water movement, and the soil solid/liquid partition coefficient, Kd, to model chemical exchange. SCEMR1 has been validated extensively on controlled field experiments with several soils, aeration statuses and the effects of plants. These validation results show that the model is robust and performs well. Sensitivity analyses identified soil Kd, annual effective precipitation, soil type and soil depth to be the four most important model parameters. SCEMR1 consumes too much computer time for incorporation into a probabilistic assessment code. Therefore, we have used SCEMR1 output to derive a simple assessment model. The assessment model reflects the complexity of its parent code, and provides a more realistic description of contaminant transport in soils than would a compartment model. Comparison of the performance of the SCEMR1 research model, the simple SCEMR1 assessment model and the TERRA compartment model on a four-year soil-core experiment shows that the SCEMR1 assessment model generally provides conservative soil concentrations. (15 refs., 3 figs.)
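
The Kd approach used by SCEMR1 links sorption to transport through the retardation factor, which slows a contaminant relative to the pore water. A minimal sketch with invented soil parameters (this is the textbook relation, not SCEMR1 itself, which also models transient Darcy flow):

```python
def retardation_factor(bulk_density, kd, theta):
    """R = 1 + rho_b * Kd / theta, assuming linear, reversible sorption.
    rho_b in g/cm^3, Kd in mL/g, theta = volumetric water content."""
    return 1.0 + bulk_density * kd / theta

# Invented sandy-soil numbers: Darcy flux 0.5 m/a, water content 0.25
pore_velocity = 0.5 / 0.25   # m/a: v = q / theta
R = retardation_factor(bulk_density=1.5, kd=10.0, theta=0.25)
print(R, round(pore_velocity / R, 4))   # contaminant velocity in m/a
```

The strong dependence of R on Kd is consistent with the sensitivity ranking reported above, where soil Kd came out as the most important parameter.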

  8. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
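
The core idea of an IPM, propagating parameter variation through stacked unit operations by Monte Carlo and reading off the probability of an OOS event, can be sketched as follows. The transfer functions, parameter distributions, and specification limit are all invented for illustration, not the CMO process data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Two stacked unit operations with invented transfer functions: an
# upstream step whose impurity output depends on temperature, and a
# downstream purification step whose clearance degrades with load.
pp_temp = rng.normal(37.0, 0.5, n)                      # process parameter
impurity_in = 50.0 + 8.0 * (pp_temp - 37.0) + rng.normal(0.0, 3.0, n)
clearance = np.clip(0.95 - 0.002 * (impurity_in - 50.0), 0.80, 0.99)
impurity_out = impurity_in * (1.0 - clearance) + rng.normal(0.0, 0.3, n)

spec_limit = 5.0                                        # invented CQA spec
p_oos = np.mean(impurity_out > spec_limit)
print(f"estimated OOS probability: {p_oos:.4f}")
```

Because the downstream step's performance depends on the upstream output, the tail risk differs from what either unit operation would suggest in isolation, which is precisely the interaction effect the IPM is built to capture.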

  9. Modeling and Field Results from Seismic Stimulation

    International Nuclear Information System (INIS)

    Majer, E.; Pride, S.; Lo, W.; Daley, T.; Nakagawa, Seiji; Sposito, Garrison; Roberts, P.

    2006-01-01

    Modeling the effect of seismic stimulation employing Maxwell-Boltzmann theory shows that the important component of stimulation is mechanical rather than fluid pressure effects. Modeling using Biot theory (two phases) shows that the pressure effects diffuse too quickly to be of practical significance. Field data from actual stimulation are presented for comparison with theory

  10. A new stationary droplet evaporation model and its validation

    Directory of Open Access Journals (Sweden)

    Fang WANG

    2017-08-01

    Full Text Available The liquid droplet evaporation character is important for not only combustion chamber design process but also high-accuracy spray combustion simulation. In this paper, the suspended droplets’ evaporation character was measured in a quiescent high-temperature environment by micro high-speed camera system. The gasoline and kerosene experimental results are consistent with the reference data. Methanol, common kerosene and aviation kerosene droplet evaporation characteristics, as well as their evaporation rate changing with temperature, were obtained. The evaporation rate experimental data were compared with the prediction result of Ranz-Marshall boiling temperature model (RMB, Ranz-Marshall low-temperature model (RML, drift flux model (DFM, mass analogy model (MAM, and stagnant film model (SFM. The disparity between the experimental data and the model prediction results was mainly caused by the neglect of the natural convection effect, which had not previously been incorporated into droplet evaporation models. A new droplet evaporation model with consideration of natural convection buoyancy force effect was proposed in this paper. Under the experimental conditions in this paper, the calculation results of the new droplet evaporation model agreed with the experimental data for kerosene, methanol and other fuels, with less than 20% relative deviations. The relative deviations between the new evaporation model predictions for kerosene and the experimental data from the references were within 10%.
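
Evaporation-rate constants of the kind measured here are conventionally expressed through the classical d²-law, in which the squared droplet diameter decays linearly in time. A minimal sketch with an invented rate constant (not one of the paper's measured values):

```python
import numpy as np

def d2_law(d0, K, t):
    """Classical d^2-law: d(t)^2 = d0^2 - K*t until burnout at
    t = d0^2 / K. d0 in mm, K in mm^2/s, t in s."""
    d2 = d0 ** 2 - K * np.asarray(t, float)
    return np.sqrt(np.clip(d2, 0.0, None))

d0, K = 1.0, 0.01            # invented kerosene-like values
t_burnout = d0 ** 2 / K      # burnout time for this droplet
print(t_burnout, float(d2_law(d0, K, 50.0)))
```

The slope K is what the competing models (RMB, RML, DFM, MAM, SFM) predict differently; adding a buoyancy-driven natural-convection correction effectively modifies K, which is how the new model closes the gap with experiment.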

  11. Numerical modelling of the bonding process for wind turbine blades: model validation

    DEFF Research Database (Denmark)

    Uzal, Anil; Spangenberg, Jon; W. Nielsen, Michael

    numerical model is developed in order to analyse adhesive propagation in squeeze flow problems with 3-D flow effects. The model is validated by comparison with an experiment where a rectangular prism shaped adhesive sample is squeezed between two parallel plates. In the numerical model the rheological...

  12. Multi-criteria validation of artificial neural network rainfall-runoff modeling

    Directory of Open Access Journals (Sweden)

    R. Modarres

    2009-03-01

    Full Text Available In this study we propose a comprehensive multi-criteria validation test for rainfall-runoff modeling by artificial neural networks. This study applies 17 global statistics and 3 additional non-parametric tests to evaluate the ANNs. The weakness of global statistics for validation of ANN is demonstrated by rainfall-runoff modeling of the Plasjan Basin in the western region of the Zayandehrud watershed, Iran. Although the global statistics showed that the multi layer perceptron with 4 hidden layers (MLP4 is the best ANN for the basin compared with other MLP networks and an empirical regression model, the non-parametric tests illustrate that neither the ANNs nor the regression model are able to reproduce the probability distribution of observed runoff in the validation phase. However, the MLP4 network is the best network to reproduce the mean and variance of the observed runoff based on non-parametric tests. The performance of ANNs and empirical model was also demonstrated for low, medium and high flows. Although the MLP4 network gives the best performance among ANNs for low, medium and high flows based on different statistics, the empirical model shows better results. However, none of the models is able to simulate the frequency distribution of low, medium and high flows according to non-parametric tests. This study illustrates that the modelers should select appropriate and relevant evaluation measures from the set of existing metrics based on the particular requirements of each individual application.
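
The paper's central point, that a model can score well on global statistics yet fail a distributional test, is easy to reproduce. In this sketch (synthetic data, not the Plasjan Basin series), a simulation that merely shrinks flows toward the mean earns an excellent Nash-Sutcliffe efficiency while a two-sample Kolmogorov-Smirnov statistic exposes the wrong spread:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, below 0 is worse than
    predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: maximum vertical
    distance between the two empirical CDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

rng = np.random.default_rng(7)
obs = rng.lognormal(1.0, 0.8, 500)            # skewed synthetic "runoff"
sim = obs.mean() + 0.8 * (obs - obs.mean())   # shrinks the spread by 20%
print(round(nse(obs, sim), 3), round(ks_statistic(obs, sim), 3))
```

The NSE here is a deceptively high 0.96 even though the simulated flow-duration curve is visibly wrong, which is why the study pairs global metrics with non-parametric distributional tests.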

  13. A Simulation Tool for Geometrical Analysis and Optimization of Fuel Cell Bipolar Plates: Development, Validation and Results

    Directory of Open Access Journals (Sweden)

    Javier Pino

    2009-07-01

    Full Text Available Bipolar plates (BPs are one of the most important components in Proton Exchange Membrane Fuel Cells (PEMFC due to the numerous functions they perform. The objective of the research work described in this paper was to develop a simplified and validated method based on Computational Fluid Dynamics (CFD, aimed at the analysis and study of the influence of geometrical parameters of BPs on the operation of a cell. A complete sensitivity analysis of the influence of dimensions and shape of the BP can be obtained through a simplified CFD model without including the complexity of other components of the PEMFC. This model is compared with the PEM Fuel Cell Module of the FLUENT software, which includes the physical and chemical phenomena relevant in PEMFCs. Results with both models regarding the flow field inside the channels and local current densities are obtained and compared. The results show that it is possible to use the simple model as a standard tool for geometrical analysis of BPs, and results of a sensitivity analysis using the simplified model are presented and discussed.
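
A taste of the geometric sensitivity the simplified CFD model targets: for fully developed laminar flow, a hydraulic-diameter Hagen-Poiseuille estimate shows how strongly the channel cross-section drives pressure drop. This back-of-envelope formula is a stand-in for, not part of, the paper's model, and the dimensions are invented:

```python
def channel_dp(q, length, width, height, mu=1.8e-5):
    """Approximate laminar pressure drop (Pa) in one rectangular BP
    channel: dP = 32 * mu * L * v / Dh^2 with the hydraulic diameter
    Dh = 2*w*h/(w+h). q is volumetric flow in m^3/s; mu defaults to
    air viscosity in Pa*s."""
    v = q / (width * height)                  # mean velocity, m/s
    dh = 2.0 * width * height / (width + height)
    return 32.0 * mu * length * v / dh ** 2

# Halving the channel height sharply raises the pressure drop
base = channel_dp(q=5e-7, length=0.1, width=1e-3, height=1e-3)
half = channel_dp(q=5e-7, length=0.1, width=1e-3, height=0.5e-3)
print(round(base, 1), round(half, 1))
```

The strongly nonlinear response (here more than a factor of four for a factor-of-two geometry change) is why a fast, geometry-only model is useful for screening BP designs before a full PEMFC simulation.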

  14. Assessing the Validity of Single-item Life Satisfaction Measures: Results from Three Large Samples

    Science.gov (United States)

    Cheung, Felix; Lucas, Richard E.

    2014-01-01

    Purpose The present paper assessed the validity of single-item life satisfaction measures by comparing single-item measures to the Satisfaction with Life Scale (SWLS) - a more psychometrically established measure. Methods Two large samples from Washington (N=13,064) and Oregon (N=2,277) recruited by the Behavioral Risk Factor Surveillance System (BRFSS) and a representative German sample (N=1,312) recruited by the Germany Socio-Economic Panel (GSOEP) were included in the present analyses. Single-item life satisfaction measures and the SWLS were correlated with theoretically relevant variables, such as demographics, subjective health, domain satisfaction, and affect. The correlations between the two life satisfaction measures and these variables were examined to assess the construct validity of single-item life satisfaction measures. Results Consistent across three samples, single-item life satisfaction measures demonstrated a substantial degree of criterion validity with the SWLS (zero-order r = 0.62 – 0.64; disattenuated r = 0.78 – 0.80). Patterns of statistical significance for correlations with theoretically relevant variables were the same across single-item measures and the SWLS. Single-item measures did not produce systematically different correlations compared to the SWLS (average difference = 0.001 – 0.005). The average absolute difference in the magnitudes of the correlations produced by single-item measures and the SWLS were very small (average absolute difference = 0.015 – 0.042). Conclusions Single-item life satisfaction measures performed very similarly compared to the multiple-item SWLS. Social scientists would get virtually identical answers to substantive questions regardless of which measure they use. PMID:24890827
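
The disattenuated correlations reported above follow Spearman's correction for measurement unreliability. A minimal sketch with assumed reliabilities (the 0.70 and 0.87 below are illustrative stand-ins, not the paper's estimates):

```python
import math

def disattenuated_r(r_xy, rel_x, rel_y):
    """Spearman's correction for attenuation:
    r_true = r_xy / sqrt(rel_x * rel_y)."""
    return r_xy / math.sqrt(rel_x * rel_y)

# Assumed reliabilities: ~0.70 for a single item, ~0.87 for the SWLS
r = disattenuated_r(0.63, 0.70, 0.87)
print(round(r, 2))
```

With these assumed reliabilities the observed r of about 0.63 corrects to roughly 0.81, in the same range as the 0.78-0.80 the paper reports.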

  15. A Supervised Learning Process to Validate Online Disease Reports for Use in Predictive Models.

    Science.gov (United States)

    Patching, Helena M M; Hudson, Laurence M; Cooke, Warrick; Garcia, Andres J; Hay, Simon I; Roberts, Mark; Moyes, Catherine L

    2015-12-01

    Pathogen distribution models that predict spatial variation in disease occurrence require data from a large number of geographic locations to generate disease risk maps. Traditionally, this process has used data from public health reporting systems; however, using online reports of new infections could speed up the process dramatically. Data from both public health systems and online sources must be validated before they can be used, but no mechanisms exist to validate data from online media reports. We have developed a supervised learning process to validate geolocated disease outbreak data in a timely manner. The process uses three input features: the data source and two metrics derived from the location of each disease occurrence. The location of disease occurrence provides information on the probability of disease occurrence at that location based on environmental and socioeconomic factors and the distance within or outside the current known disease extent. The process also uses validation scores, generated by disease experts who review a subset of the data, to build a training data set. The aim of the supervised learning process is to generate validation scores that can be used as weights going into the pathogen distribution model. After analyzing the three input features and testing the performance of alternative processes, we selected a cascade of ensembles comprising logistic regressors. Parameter values for the training data subset size, number of predictors, and number of layers in the cascade were tested before the process was deployed. The final configuration was tested using data for two contrasting diseases (dengue and cholera), and 66%-79% of data points were assigned a validation score. The remaining data points are scored by the experts, and the results inform the training data set for the next set of predictors, as well as going to the pathogen distribution model. 
The new supervised learning process has been implemented within our live site and is
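The cascade of ensembles of logistic regressors described above can be sketched as follows; the synthetic data, feature definitions, confidence threshold, and layer count are hypothetical stand-ins for the authors' configuration, not their actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic geolocated reports with three features, standing in for the
# paper's inputs: data source (0 = public health, 1 = online media),
# modelled probability of occurrence at the location, and normalised
# distance relative to the current known disease extent.
X = rng.random((500, 3))
# Hypothetical expert validation labels: reports at suitable locations
# near the known extent are more often judged valid.
y = ((X[:, 1] - X[:, 2] + rng.normal(0, 0.3, 500)) > 0).astype(int)

def cascade_validate(X, y, n_layers=3, threshold=0.8):
    """Cascade of logistic regressors: each layer is trained on an
    expert-scored subset and assigns scores only to reports it
    classifies confidently; the remainder fall through to the next
    layer and, ultimately, to the experts."""
    unresolved = np.ones(len(X), dtype=bool)
    scores = np.full(len(X), np.nan)
    for _ in range(n_layers):
        idx_unres = np.where(unresolved)[0]
        if idx_unres.size < 20:
            break
        # Experts review a random subset of the unresolved reports.
        train = rng.choice(idx_unres, size=min(100, idx_unres.size), replace=False)
        clf = LogisticRegression().fit(X[train], y[train])
        p = clf.predict_proba(X[idx_unres])[:, 1]
        confident = (p > threshold) | (p < 1 - threshold)
        scores[idx_unres[confident]] = p[confident]
        unresolved[idx_unres[confident]] = False
    return scores, unresolved

scores, to_experts = cascade_validate(X, y)  # unresolved reports go to experts
```

The returned scores could then be used as weights in the downstream distribution model, while the unresolved reports are routed to expert review, mirroring the loop described in the abstract.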

  16. Utilizing Chamber Data for Developing and Validating Climate Change Models

    Science.gov (United States)

    Monje, Oscar

    2012-01-01

Controlled environment chambers (e.g. growth chambers, SPAR chambers, or open-top chambers) are useful for measuring plant ecosystem responses to climatic variables and CO2 that affect plant water relations. However, data from chambers have been found to overestimate responses of C fluxes to CO2 enrichment. Chamber data may be confounded by numerous artifacts (e.g. sidelighting, edge effects, increased temperature and VPD), which limits what can be measured accurately. Chambers can be used to measure canopy-level energy balance under controlled conditions, and plant transpiration responses to CO2 concentration can be elucidated. However, these measurements cannot be used directly in model development or validation. The response of stomatal conductance to CO2 will be the same as in the field, but the measured response must be recalculated to account for differences in aerodynamic conductance, temperature and VPD between the chamber and the field.

  17. Validation Of Developed Materials Requirement Planning MRP Integrated Flow System Model Of Ims For Piemf

    Directory of Open Access Journals (Sweden)

    T.T Amachree

    2017-08-01

Full Text Available The developed MRP model is presented as the most significant inventory management strategy and correlates strongly with PIEMF. The result of the test case of the MRP-based integrated flow system model, as shown in Table 6, indicates that the model is effective and valid for PIEMF at the 95% confidence level (F-value 3.121, P-value sig. 0.034). The model provides an abstract representation and timely understanding of the subject matter and is a true indication of the state of IMS for PIEMF. The flow system model will serve as a veritable decision support system of inventory management for PIEMF

  18. Groundwater Model Validation for the Project Shoal Area, Corrective Action Unit 447

    Energy Technology Data Exchange (ETDEWEB)

    Hassan, Ahmed [Desert Research Inst. (DRI), Las Vegas, NV (United States). Division of Hydrologic Sciences; Chapman, Jenny [Desert Research Inst. (DRI), Las Vegas, NV (United States). Division of Hydrologic Sciences; Lyles, Brad [Desert Research Inst. (DRI), Las Vegas, NV (United States). Division of Hydrologic Sciences

    2008-05-19

    Stoller has examined newly collected water level data in multiple wells at the Shoal site. On the basis of these data and information presented in the report, we are currently unable to confirm that the model is successfully validated. Most of our concerns regarding the model stem from two findings: (1) measured water level data do not provide clear evidence of a prevailing lateral flow direction; and (2) the groundwater flow system has been and continues to be in a transient state, which contrasts with assumed steady-state conditions in the model. The results of DRI's model validation efforts and observations made regarding water level behavior are discussed in the following sections. A summary of our conclusions and recommendations for a path forward are also provided in this letter report.

  19. Transient validation of RELAP5 model with the DISS facility in once through operation mode

    Science.gov (United States)

    Serrano-Aguilera, J. J.; Valenzuela, L.

    2016-05-01

The thermal-hydraulic code RELAP5 has been used to model a solar Direct Steam Generation (DSG) system. Experimental data from the DISS facility located at the Plataforma Solar de Almería are compared with the numerical results of the RELAP5 model in order to validate it. Both the model and the experimental set-up are in once-through operation mode, in which no injection or active control is considered. Time-dependent boundary conditions are taken into account. This work is a preliminary study for further research that will be carried out in order to achieve a thorough validation of RELAP5 models in the context of DSG in line-focus solar collectors.

  20. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Full Text Available Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  1. Mold-filling experiments for validation of modeling encapsulation. Part 1, "wine glass" mold.

    Energy Technology Data Exchange (ETDEWEB)

    Castaneda, Jaime N.; Grillet, Anne Mary; Altobelli, Stephen A. (New Mexico Resonance, Albuquerque, NM); Cote, Raymond O.; Mondy, Lisa Ann

    2005-06-01

    The C6 project 'Encapsulation Processes' has been designed to obtain experimental measurements for discovery of phenomena critical to improving these processes, as well as data required in the verification and validation plan (Rao et al. 2001) for model validation of flow in progressively complex geometries. We have observed and recorded the flow of clear, Newtonian liquids and opaque, rheologically complex suspensions in two mold geometries. The first geometry is a simple wineglass geometry in a cylinder and is reported here in Part 1. The results in a more realistic encapsulation geometry are reported in Part 2.

  2. Soil process modelling in CZO research: gains in data harmonisation and model validation

    Science.gov (United States)

    van Gaans, Pauline; Andrianaki, Maria; Kobierska, Florian; Kram, Pavel; Lamacova, Anna; Lair, Georg; Nikolaidis, Nikos; Duffy, Chris; Regelink, Inge; van Leeuwen, Jeroen P.; de Ruiter, Peter

    2014-05-01

Various soil process models were applied to four European Critical Zone Observatories (CZOs), the core research sites of the FP7 project SoilTrEC: the Damma glacier forefield (CH), a set of three forested catchments on geochemically contrasting bedrocks in the Slavkov Forest (CZ), a chronosequence of soils in the former floodplain of the Danube at Fuchsenbigl/Marchfeld (AT), and the Koiliaris catchments in the north-western part of Crete (GR). The aim of the modelling exercises was to apply and test soil process models with data from the CZOs for calibration/validation, identify potential limits to the application scope of the models, interpret soil state and soil functions at key stages of the soil life cycle, represented by the four SoilTrEC CZOs, and contribute towards harmonisation of data and data acquisition. The models identified as specifically relevant were: the Penn State Integrated Hydrologic Model (PIHM), a fully coupled, multiprocess, multi-scale hydrologic model, to get a better understanding of water flow and pathways; the Soil and Water Assessment Tool (SWAT), a deterministic, continuous-time (daily time step) basin-scale model, to evaluate the impact of soil management practices; the Rothamsted Carbon model (Roth-C) to simulate organic carbon turnover, and the Carbon, Aggregation, and Structure Turnover (CAST) model to include the role of soil aggregates in carbon dynamics; the Ligand Charge Distribution (LCD) model, to understand the interaction between organic matter and oxide surfaces in soil aggregate formation; and the Terrestrial Ecology Model (TEM) to obtain insight into the link between foodweb structure and carbon and nutrient turnover. With some exceptions, all models were applied to all four CZOs. The need for specific model input contributed largely to data harmonisation.
The comparisons between the CZOs turned out to be of great value for understanding the strength and limitations of the models, as well as the differences in soil conditions

  3. Coupled Michigan MHD - Rice Convection Model Results

    Science.gov (United States)

    de Zeeuw, D.; Sazykin, S.; Wolf, D.; Gombosi, T.; Powell, K.

    2002-12-01

    A new high performance Rice Convection Model (RCM) has been coupled to the adaptive-grid Michigan MHD model (BATSRUS). This fully coupled code allows us to self-consistently simulate the physics in the inner and middle magnetosphere. A study will be presented of the basic characteristics of the inner and middle magnetosphere in the context of a single coupled-code run for idealized storm inputs. The analysis will include region-2 currents, shielding of the inner magnetosphere, partial ring currents, pressure distribution, magnetic field inflation, and distribution of pV^gamma.

  4. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for the transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable to continuous-time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  5. Graphical interpretation of numerical model results

    International Nuclear Information System (INIS)

    Drewes, D.R.

    1979-01-01

    Computer software has been developed to produce high quality graphical displays of data from a numerical grid model. The code uses an existing graphical display package (DISSPLA) and overcomes some of the problems of both line-printer output and traditional graphics. The software has been designed to be flexible enough to handle arbitrarily placed computation grids and a variety of display requirements

  6. Some results on hyperscaling in the 3D Ising model

    Energy Technology Data Exchange (ETDEWEB)

    Baker, G.A. Jr. [Los Alamos National Lab., NM (United States). Theoretical Div.; Kawashima, Naoki [Univ. of Tokyo (Japan). Dept. of Physics

    1995-09-01

The authors review exact studies on finite-sized 2-dimensional Ising models and show that the point for an infinite-sized model at the critical temperature is a point of nonuniform approach in the temperature-size plane. They also illuminate some strong effects of finite size on quantities which do not diverge at the critical point. They then review Monte Carlo studies for 3-dimensional Ising models of various sizes (L = 2-100) at various temperatures. From these results they find that the data for the renormalized coupling constant collapse nicely when plotted against the correlation length, determined in a system of edge length L, divided by L. They also find that ζ_L/L ≥ 0.26 is definitely too large for reliable studies of the critical value, g*, of the renormalized coupling constant. They have reasonable evidence that ζ_L/L ≈ 0.1 is adequate for results that are within one percent of those for the infinite system size. On this basis, they have conducted a series of Monte Carlo calculations with this condition imposed. These calculations were made practical by the development of improved estimators for use in the Swendsen-Wang cluster method. The authors found from these results, coupled with a reversed limit computation (size increases with the temperature fixed at the critical temperature), that g* > 0, although there may well be a sharp downward drop in g as the critical temperature is approached, in accord with the predictions of series analysis. The results support the validity of hyperscaling in the 3-dimensional Ising model.
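As a much-simplified illustration of how such dimensionless couplings are estimated, the sketch below runs single-spin-flip Metropolis on a small 2D lattice and computes the Binder cumulant, a dimensionless quantity closely related to the renormalized coupling constant. This is a toy stand-in only: the study used 3D lattices and the Swendsen-Wang cluster method with improved estimators, and the lattice size, temperatures, and sweep counts here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def binder_cumulant(L=12, T=2.5, sweeps=1200, burn=300):
    """Single-spin-flip Metropolis on a 2D Ising lattice, returning the
    Binder cumulant U = 1 - <m^4>/(3<m^2>^2), which approaches 0 in the
    disordered phase and 2/3 deep in the ordered phase."""
    s = np.ones((L, L), dtype=int)      # cold start: all spins up
    m2 = m4 = 0.0
    n = 0
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(L, size=2)
            # Sum over the four nearest neighbours (periodic boundaries)
            nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] \
               + s[i, (j + 1) % L] + s[i, (j - 1) % L]
            dE = 2 * s[i, j] * nb       # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j] = -s[i, j]
        if sweep >= burn:               # accumulate after equilibration
            m = s.mean()
            m2 += m * m
            m4 += m ** 4
            n += 1
    m2, m4 = m2 / n, m4 / n
    return 1.0 - m4 / (3.0 * m2 * m2)

U_hot = binder_cumulant(T=5.0)   # well above the 2D Tc of ~2.269
U_cold = binder_cumulant(T=1.0)  # well below Tc
```

Collapsing such cumulants against ζ_L/L across lattice sizes is the kind of finite-size analysis the abstract describes.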

  7. Validation study of the magnetically self-consistent inner magnetosphere model RAM-SCB

    Science.gov (United States)

    Yu, Yiqun; Jordanova, Vania; Zaharia, Sorin; Koller, Josef; Zhang, Jichun; Kistler, Lynn M.

    2012-03-01

    The validation of the magnetically self-consistent inner magnetospheric model RAM-SCB developed at Los Alamos National Laboratory is presented here. The model consists of two codes: a kinetic ring current-atmosphere interaction model (RAM) and a 3-D equilibrium magnetic field code (SCB). The validation is conducted by simulating two magnetic storm events and then comparing the model results against a variety of satellite in situ observations, including the magnetic field from Cluster and Polar spacecraft, ion differential flux from the Cluster/CODIF (Composition and Distribution Function) analyzer, and the ground-based SYM-H index. The model prediction of the magnetic field is in good agreement with observations, which indicates the model's capability of representing well the inner magnetospheric field configuration. This provides confidence for the RAM-SCB model to be utilized for field line and drift shell tracing, which are needed in radiation belt studies. While the SYM-H index, which reflects the total ring current energy content, is generally reasonably reproduced by the model using the Weimer electric field model, the modeled ion differential flux clearly depends on the electric field strength, local time, and magnetic activity level. A self-consistent electric field approach may be needed to improve the model performance in this regard.

  8. Validation and uncertainty analysis of a pre-treatment 2D dose prediction model

    Science.gov (United States)

    Baeza, Jose A.; Wolfs, Cecile J. A.; Nijsten, Sebastiaan M. J. J. G.; Verhaegen, Frank

    2018-02-01

    Independent verification of complex treatment delivery with megavolt photon beam radiotherapy (RT) has been effectively used to detect and prevent errors. This work presents the validation and uncertainty analysis of a model that predicts 2D portal dose images (PDIs) without a patient or phantom in the beam. The prediction model is based on an exponential point dose model with separable primary and secondary photon fluence components. The model includes a scatter kernel, off-axis ratio map, transmission values and penumbra kernels for beam-delimiting components. These parameters were derived through a model fitting procedure supplied with point dose and dose profile measurements of radiation fields. The model was validated against a treatment planning system (TPS; Eclipse) and radiochromic film measurements for complex clinical scenarios, including volumetric modulated arc therapy (VMAT). Confidence limits on fitted model parameters were calculated based on simulated measurements. A sensitivity analysis was performed to evaluate the effect of the parameter uncertainties on the model output. For the maximum uncertainty, the maximum deviating measurement sets were propagated through the fitting procedure and the model. The overall uncertainty was assessed using all simulated measurements. The validation of the prediction model against the TPS and the film showed a good agreement, with on average 90.8% and 90.5% of pixels passing a (2%,2 mm) global gamma analysis respectively, with a low dose threshold of 10%. The maximum and overall uncertainty of the model is dependent on the type of clinical plan used as input. The results can be used to study the robustness of the model. A model for predicting accurate 2D pre-treatment PDIs in complex RT scenarios can be used clinically and its uncertainties can be taken into account.
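The (2%, 2 mm) global gamma analysis used in the validation can be sketched with a brute-force implementation; clinical tools use interpolation and optimized search, and the dose grids below are synthetic, not the study's PDIs.

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing=1.0, dd=0.02, dta=2.0, cutoff=0.10):
    """Brute-force global gamma analysis of an evaluated 2D dose grid
    against a reference grid; dd is the dose criterion (fraction of the
    global maximum), dta the distance-to-agreement in mm."""
    norm = ref.max()
    ys, xs = np.mgrid[0:ref.shape[0], 0:ref.shape[1]].astype(float) * spacing
    mask = ref > cutoff * norm                  # 10% low-dose threshold
    gamma = np.full(ref.shape, np.inf)
    r = int(np.ceil(2 * dta / spacing))         # search radius in pixels
    for i, j in zip(*np.where(mask)):
        y0, y1 = max(0, i - r), min(ref.shape[0], i + r + 1)
        x0, x1 = max(0, j - r), min(ref.shape[1], j + r + 1)
        dist2 = ((ys[y0:y1, x0:x1] - i * spacing) ** 2
                 + (xs[y0:y1, x0:x1] - j * spacing) ** 2) / dta ** 2
        dose2 = ((ev[y0:y1, x0:x1] - ref[i, j]) / (dd * norm)) ** 2
        gamma[i, j] = np.sqrt((dist2 + dose2).min())
    return (gamma[mask] <= 1).mean()

# A smooth synthetic dose distribution: identical grids pass everywhere,
# while a uniform 5% dose error fails the 2% criterion near the maximum.
ref = np.outer(np.hanning(40), np.hanning(40)) * 100.0
perfect = gamma_pass_rate(ref, ref)
scaled = gamma_pass_rate(ref, 1.05 * ref)
```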

  9. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1996-03-01

    During the 'Workshop on R and D needs' at the 3rd Meeting of the International Group on Research Reactors (IGORR-III), the participants agreed that it would be useful to compile a survey of the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods various organizations use to verify and validate their codes and libraries. Five organizations, Atomic Energy of Canada Limited (AECL, Canada), China Institute of Atomic Energy (CIAE, People's Republic of China), Japan Atomic Energy Research Institute (JAERI, Japan), Oak Ridge National Laboratories (ORNL, USA), and Siemens (Germany) responded to the survey. The results of the survey are compiled in this report. (author) 36 refs., 3 tabs

  10. Computational modeling and validation of human nasal airflow under various breathing conditions.

    Science.gov (United States)

    Li, Chengyu; Jiang, Jianbo; Dong, Haibo; Zhao, Kai

    2017-11-07

The human nose serves vital physiological functions, including warming, filtration, humidification, and olfaction. These functions are based on transport phenomena that depend on nasal airflow patterns and turbulence. Accurate prediction of these airflow properties requires careful selection of computational fluid dynamics models and rigorous validation. The validation studies in the past have been limited by poor representations of the complex nasal geometry, lack of detailed airflow comparisons, and restricted ranges of flow rate. The objective of this study is to validate various numerical methods based on an anatomically accurate nasal model against published experimentally measured data under breathing flow rates from 180 to 1100 ml/s. The numerical results of velocity profiles and turbulence intensities were obtained using the laminar model, four widely used Reynolds-averaged Navier-Stokes (RANS) turbulence models (i.e., k-ε, standard k-ω, Shear Stress Transport k-ω, and Reynolds Stress Model), large eddy simulation (LES) model, and direct numerical simulation (DNS). It was found that, despite certain irregularity in the flow field, the laminar model achieved good agreement with experimental results under restful breathing condition (180 ml/s) and performed better than the RANS models. As the breathing flow rate increased, the RANS models achieved more accurate predictions but still performed worse than LES and DNS. As expected, LES and DNS can provide accurate predictions of the nasal airflow under all flow conditions but have an approximately 100-fold higher computational cost. Among all the RANS models tested, the standard k-ω model agrees most closely with the experimental values in terms of velocity profile and turbulence intensity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Validation of the galactic cosmic ray and geomagnetic transmission models

    International Nuclear Information System (INIS)

    Badhwar, G.D.; Truong, A.G.; O'Neill, P.M.; Choutko, Vitaly

    2001-01-01

A very high-momentum resolution particle spectrometer called the Alpha Magnetic Spectrometer (AMS) was flown in the payload bay of the Space Shuttle in a 51.65 deg. x 380-km orbit during the last solar minimum. This spectrometer has provided the first high-statistics data set for galactic cosmic radiation protons and helium, as well as limited spectral data on carbon and oxygen nuclei, in the International Space Station orbit. First measurements of the albedo protons at this inclination were also made. Because of the high momentum resolution and high statistics, the data can be separated as a function of magnetic latitude. A related investigation, the balloon-borne experiment with a superconducting solenoid spectrometer (BESS), has been flown from Lynn Lake, Canada and has also provided excellent high-resolution data on protons and helium. These two data sets have been used here to study the validity of two galactic cosmic ray models and the geomagnetic transmission function developed from the 1990 geomagnetic reference field model. The predictions of both the CREME96 and NASA/JSC models are in good agreement with the AMS data. The shape of the AMS-measured albedo proton spectrum, up to 2 GeV, is in excellent agreement with the previous balloon and satellite observations. A new LIS spectrum was developed that is consistent with both previous and new BESS 3He observations. Because the astronaut radiation exposures onboard ISS will be highest around the time of the solar minimum, these AMS measurements and these models provide important benchmarks for future radiation studies. AMS-02, slated for launch in September 2003, will provide even better momentum resolution and higher statistics data.

  12. Prognostic models for locally advanced cervical cancer: external validation of the published models.

    Science.gov (United States)

    Lora, David; Gómez de la Cámara, Agustín; Fernández, Sara Pedraza; Enríquez de Salamanca, Rafael; Gómez, José Fermín Pérez Regadera

    2017-09-01

To externally validate the prognostic models for predicting the time-dependent outcome in patients with locally advanced cervical cancer (LACC) who were treated with concurrent chemoradiotherapy in an independent cohort. A historical cohort of 297 women with LACC who were treated with radical concurrent chemoradiotherapy from 1999 to 2014 at the 12 de Octubre University Hospital (H12O), Madrid, Spain. The external validity of prognostic models was quantified regarding discrimination, calibration, measures of overall performance, and decision curve analyses. The review identified 8 studies containing 13 prognostic models. These models were developed in cohorts that differed from the validation cohort (5-year overall survival [OS]=70%; 5-year disease-free survival [DFS]=64%; average age of 50; and over 79% squamous cell) in International Federation of Gynecology and Obstetrics (FIGO) stages, parametrium involvement, hydronephrosis, location of positive nodes, and race, but were otherwise related. The following models exhibited good external validity in terms of discrimination and calibration but limited clinical utility: the OS model at 3 years from Kidd et al.'s study (area under the receiver operating characteristic curve [AUROC]=0.69; threshold of clinical utility [TCU] between 36% and 50%), the models of DFS at 1 year from Kidd et al.'s study (AUROC=0.64; TCU between 24% and 32%) and at 2 years from Rose et al.'s study (AUROC=0.70; TCU between 19% and 58%), and the distant recurrence model at 5 years from Kang et al.'s study (AUROC=0.67; TCU between 12% and 36%). The external validation revealed the statistical and clinical usefulness of 4 prognostic models published in the literature. Copyright © 2017. Asian Society of Gynecologic Oncology, Korean Society of Gynecologic Oncology
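The discrimination, calibration, and decision-curve quantities used in such an external validation can be illustrated on simulated data; the cohort, predictions, and threshold below are synthetic, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

# Hypothetical external cohort: event probabilities predicted by a
# published model, with outcomes simulated to be consistent with them.
p_pred = rng.beta(2, 5, size=300)
y_obs = rng.binomial(1, p_pred)

# Discrimination: area under the ROC curve (AUROC).
auroc = roc_auc_score(y_obs, p_pred)

# Calibration-in-the-large: ratio of observed to mean predicted risk
# (1.0 indicates perfect average calibration).
calib = y_obs.mean() / p_pred.mean()

def net_benefit(y, p, pt):
    """Net benefit at threshold probability pt, the quantity plotted in
    a decision curve analysis."""
    tp = ((p >= pt) & (y == 1)).mean()
    fp = ((p >= pt) & (y == 0)).mean()
    return tp - fp * pt / (1 - pt)

nb_20 = net_benefit(y_obs, p_pred, 0.20)  # clinical utility at a 20% threshold
```

A model is clinically useful over the range of thresholds where its net benefit exceeds both the treat-all and treat-none strategies, which is how the TCU ranges in the abstract are defined.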

  13. Validation of individual and aggregate global flood hazard models for two major floods in Africa.

    Science.gov (United States)

    Trigg, M.; Bernhofen, M.; Whyman, C.

    2017-12-01

A recent intercomparison of global flood hazard models undertaken by the Global Flood Partnership shows that there is an urgent requirement to undertake more validation of the models against flood observations. As part of the intercomparison, the aggregated model dataset resulting from the project was provided as open access data. We compare the individual and aggregated flood extent output from the six global models and test these against two major floods on the African continent within the last decade, namely severe flooding on the Niger River in Nigeria in 2012, and on the Zambezi River in Mozambique in 2007. We test whether aggregating different numbers and combinations of models increases model fit to the observations compared with the individual model outputs. We present results that illustrate some of the challenges of comparing imperfect models with imperfect observations, and also that of defining the probability of a real event in order to test standard model output probabilities. Finally, we propose a collective set of open access validation flood events, with associated observational data and descriptions, that provide a standard set of tests across different climates and hydraulic conditions.
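Comparing individual and aggregated binary flood extents against an observed extent is commonly done with a fit statistic F = |model AND obs| / |model OR obs|; the sketch below, on toy rasters, shows how aggregating by vote count can change the fit. The statistic and aggregation rule are common conventions, not details taken from the study.

```python
import numpy as np

def flood_fit(model, obs):
    """Fit statistic F = |model AND obs| / |model OR obs| between binary
    flood-extent rasters (1 = wet)."""
    model, obs = model.astype(bool), obs.astype(bool)
    return (model & obs).sum() / (model | obs).sum()

def aggregate(models, k):
    """Aggregated extent: cells flagged wet by at least k of the models."""
    return np.sum(models, axis=0) >= k

# Toy rasters: one observed extent and three hypothetical model outputs
obs = np.zeros((10, 10), int); obs[2:8, 2:8] = 1
m1 = np.zeros_like(obs); m1[2:8, 2:9] = 1
m2 = np.zeros_like(obs); m2[1:8, 2:8] = 1
m3 = np.zeros_like(obs); m3[3:9, 3:9] = 1
models = np.stack([m1, m2, m3])

f_individual = [flood_fit(m, obs) for m in models]
f_any = flood_fit(aggregate(models, 1), obs)    # union of all models
f_major = flood_fit(aggregate(models, 2), obs)  # majority vote
```

In this toy case the majority-vote aggregate fits the observation better than the union, which over-predicts; the intercomparison tests exactly this kind of question on real events.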

  14. Validated Analytical Model of a Pressure Compensation Drip Irrigation Emitter

    Science.gov (United States)

    Shamshery, Pulkit; Wang, Ruo-Qian; Taylor, Katherine; Tran, Davis; Winter, Amos

    2015-11-01

This work is focused on analytically characterizing the behavior of pressure-compensating drip emitters in order to design low-cost, low-power irrigation solutions appropriate for off-grid communities in developing countries. There are 2.5 billion small-acreage farmers worldwide who rely solely on their land for sustenance. Compared with flood irrigation, drip irrigation reduces water consumption by up to 70% while increasing yields by 90%, which is important in countries like India that are quickly running out of water. To design a low-power drip system, there is a need to decrease the pumping pressure requirement at the emitters, as pumping power is the product of pressure and flow rate. To efficiently design such an emitter, the fluid-structure interactions that occur in an emitter need to be understood. In this study, a 2D analytical model that captures the behavior of a common drip emitter was developed and validated through experiments. The effects of independently changing the channel depth, channel width, channel length, and land height on the performance were studied. The model and the key parametric insights presented have the potential to be optimized in order to guide the design of low-pressure, clog-resistant, pressure-compensating emitters.
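The statement that pumping power is the product of pressure and flow rate can be made concrete with a small calculation; the pressures, flow rate, and emitter count below are illustrative assumptions, not the study's values.

```python
def pumping_power_w(pressure_kpa, flow_lph, n_emitters):
    """Hydraulic pumping power in watts: pressure times volumetric flow."""
    q_m3s = flow_lph / 3600.0 / 1000.0      # L/h -> m^3/s
    return pressure_kpa * 1e3 * q_m3s * n_emitters

# Illustrative numbers: 1000 emitters at 1 L/h each, comparing a
# conventional ~1 bar activation pressure with a hypothetical 0.15 bar design
p_conventional = pumping_power_w(100.0, 1.0, 1000)
p_low = pumping_power_w(15.0, 1.0, 1000)
```

Because flow per plant is fixed by agronomy, lowering the emitter activation pressure is the main lever for reducing pumping power, which is the design goal the abstract describes.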

  15. Validation of Swarm accelerometer data by modelled nongravitational forces

    Science.gov (United States)

    Bezděk, Aleš; Sebera, Josef; Klokočník, Jaroslav

    2017-05-01

Swarm is a three-satellite mission of the European Space Agency, in orbit since November 2013, whose main objective is the study of the Earth's magnetic field from space. As part of other scientific payload, each Swarm satellite is equipped with an accelerometer that measures the nongravitational forces (e.g. atmospheric drag and radiation pressure). Since the mission beginning, the Swarm onboard accelerometer observations have been facing a problem of much higher temperature influence than had been anticipated in the pre-launch tests. In our paper, we use the a posteriori computed models of physical nongravitational forces acting on each satellite for external validation of the accelerometer measurements. To reduce the high temperature dependence, we apply a simple and straightforward method of linear temperature correction. The most successful application of this approach is for the along-track component of the accelerometer data, where the signal magnitude is strongest. The best performing accelerometer is that of the Swarm C satellite; the accelerometer of Swarm A displays more temperature dependence and noise, and the noisiest accelerometer data set is provided by Swarm B. We analyzed the occurrence of anomalous periods in the along-track accelerometer component of Swarm A and Swarm C, when the number of accelerometer hardware anomalies peaks. Over the time interval from June 2014 to December 2015, we found a correlation between these anomalous periods and the minima in the time-varying part of the modelled nongravitational signal.
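A linear temperature correction of the kind described, regressing the measurement-minus-model residual on temperature and removing the fitted trend, can be sketched on synthetic data; all magnitudes below are illustrative, not Swarm values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic along-track series: modelled nongravitational acceleration
# plus a linear temperature effect and noise.
t = np.linspace(0, 10, 2000)                        # time, days
a_model = 1e-7 * (1 + 0.3 * np.sin(2 * np.pi * t))  # modelled signal, m/s^2
temp = 20 + 5 * np.sin(2 * np.pi * t / 3)           # sensor temperature, deg C
a_meas = a_model + 4e-8 * (temp - temp.mean()) + rng.normal(0, 5e-9, t.size)

# Linear temperature correction: regress the measurement-minus-model
# residual on temperature and subtract the fitted linear trend.
A = np.column_stack([temp, np.ones_like(temp)])
coef, *_ = np.linalg.lstsq(A, a_meas - a_model, rcond=None)
a_corrected = a_meas - A @ coef

rms_before = np.sqrt(np.mean((a_meas - a_model) ** 2))
rms_after = np.sqrt(np.mean((a_corrected - a_model) ** 2))
```

Note that this scheme uses the modelled forces as the reference, which is exactly why the external validation against a posteriori force models matters: the correction can only be as good as the model it is fitted against.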

  16. Ignalina NPP Safety Analysis: Models and Results

    International Nuclear Information System (INIS)

    Uspuras, E.

    1999-01-01

Research directions of the scientific safety analysis group, linked to safety assessment of the Ignalina NPP, are presented: thermal-hydraulic analysis of accidents and operational transients; thermal-hydraulic assessment of the Ignalina NPP Accident Localization System and other compartments; structural analysis of plant components, piping and other parts of the Main Circulation Circuit; and assessment of the RBMK-1500 reactor core, among others. Models and the main works carried out in the last year are described. (author)

  17. Modeling clicks beyond the first result page

    NARCIS (Netherlands)

    Chuklin, A.; Serdyukov, P.; de Rijke, M.

    2013-01-01

    Most modern web search engines yield a list of documents of a fixed length (usually 10) in response to a user query. The next ten search results are usually available in one click. These documents either replace the current result page or are appended to the end. Hence, in order to examine more

  18. Published diagnostic models safely excluded colorectal cancer in an independent primary care validation study

    NARCIS (Netherlands)

    Elias, Sjoerd G; Kok, Liselotte; Witteman, Ben J M; Goedhard, Jelle G; Romberg-Camps, Mariëlle J L; Muris, Jean W M; de Wit, Niek J; Moons, Karel G M

    OBJECTIVE: To validate published diagnostic models for their ability to safely reduce unnecessary endoscopy referrals in primary care patients suspected of significant colorectal disease. STUDY DESIGN AND SETTING: Following a systematic literature search, we independently validated the identified

  19. Benchmark validation of statistical models: Application to mediation analysis of imagery and memory.

    Science.gov (United States)

    MacKinnon, David P; Valente, Matthew J; Wurpts, Ingrid C

    2018-03-29

This article describes benchmark validation, an approach to validating a statistical model. According to benchmark validation, a valid model generates estimates and research conclusions consistent with a known substantive effect. Three types of benchmark validation, (a) benchmark value, (b) benchmark estimate, and (c) benchmark effect, are described and illustrated with examples. Benchmark validation methods are especially useful for statistical models with assumptions that are untestable or very difficult to test. Benchmark effect validation methods were applied to evaluate statistical mediation analysis in eight studies using the established effect that increasing mental imagery improves recall of words. Statistical mediation analysis led to conclusions about mediation that were consistent with established theory that increased imagery leads to increased word recall. Benchmark validation based on established substantive theory is discussed as a general way to investigate characteristics of statistical models and a complement to mathematical proof and statistical simulation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
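A regression-based estimate of the mediated (indirect) effect of the kind evaluated in the article can be sketched on simulated imagery-recall data; the sample and coefficients are illustrative, not estimates from the eight studies.

```python
import numpy as np

rng = np.random.default_rng(4)

def ols_slope(x, y):
    """Slope of y regressed on x (with intercept)."""
    X = np.column_stack([x, np.ones_like(x)])
    return np.linalg.lstsq(X, y, rcond=None)[0][0]

# Simulated benchmark data: imagery instructions (x) increase imagery
# use (m), which increases word recall (y).
n = 500
x = rng.integers(0, 2, n).astype(float)      # 0 = control, 1 = imagery group
m = 0.6 * x + rng.normal(0, 1, n)            # path a: x -> m
y = 0.5 * m + 0.1 * x + rng.normal(0, 1, n)  # path b plus direct effect c'

a = ols_slope(x, m)
# Path b: coefficient of m in the regression of y on m and x.
b = np.linalg.lstsq(np.column_stack([m, x, np.ones(n)]), y, rcond=None)[0][0]
indirect = a * b  # mediated (indirect) effect
```

Benchmark effect validation would then ask whether this indirect effect is positive and of plausible size, consistent with the established imagery-improves-recall effect.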

  20. Impact Flash Physics: Modeling and Comparisons With Experimental Results

    Science.gov (United States)

    Rainey, E.; Stickle, A. M.; Ernst, C. M.; Schultz, P. H.; Mehta, N. L.; Brown, R. C.; Swaminathan, P. K.; Michaelis, C. H.; Erlandson, R. E.

    2015-12-01

    horizontal. High-speed radiometer measurements were made of the time-dependent impact flash at wavelengths of 350-1100 nm. We will present comparisons between these measurements and the output of APL's model. The results of this validation allow us to determine basic relationships between observed optical signatures and impact conditions.

  1. Quality of life and hormone use: new validation results of MRS scale

    Directory of Open Access Journals (Sweden)

    Heinemann Lothar AJ

    2006-05-01

Full Text Available Abstract Background The Menopause Rating Scale (MRS) is a health-related quality-of-life scale developed in the early 1990s and validated step by step since then. Recently the MRS scale was validated as an outcomes measure for hormone therapy. The suspicion however was expressed that the data were too optimistic due to methodological problems of the study. A new study became available to check how well founded this suspicion was. Method An open post-marketing study of 3282 women with pre- and post-treatment data of the self-administered version of the MRS scale was analyzed to evaluate the capacity of the scale to detect hormone-treatment-related effects. The main results were then compared with the old study, where the interview-based version of the MRS scale was used. Results The hormone-therapy-related improvement of complaints relative to the baseline score was about 30% or less in the total and domain scores, whereas it exceeded 30% in the old study. Similarly, the relative improvement after therapy, stratified by the degree of severity at baseline, was lower in the new than in the old study, but had the same slope. Although we cannot exclude different treatment effects with the study method used, this supports our hypothesis that the individual MRS interviews performed by the physician biased the results towards over-estimation of the treatment effects. This hypothesis is underlined by the degree of concordance between the physician's assessment and the patient's perception of treatment success: sensitivity (correct prediction of a positive assessment by the treating physician) and specificity (correct prediction of a negative assessment by the physician) of the MRS were lower than the results obtained with the interview-based MRS scale in the previous publication.
Conclusion The study confirmed evidence for the capacity of the MRS scale to measure treatment effects on quality of life across the full range of severity of
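The concordance statistics cited above reduce to simple ratios over a 2×2 table of patient-perceived versus physician-assessed treatment success. A minimal sketch (the counts are illustrative, not the study's data):

```python
def sensitivity_specificity(tp, fp, tn, fn):
    """Concordance between the MRS result and the physician's assessment.
    tp/fp/tn/fn are illustrative confusion-table counts, not study data."""
    sensitivity = tp / (tp + fn)  # correct prediction of a positive assessment
    specificity = tn / (tn + fp)  # correct prediction of a negative assessment
    return sensitivity, specificity
```

With, say, 80 true positives, 20 false negatives, 90 true negatives and 10 false positives, this yields a sensitivity of 0.8 and a specificity of 0.9.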

  2. Thermal-Chemical Model Of Subduction: Results And Tests

    Science.gov (United States)

    Gorczyk, W.; Gerya, T. V.; Connolly, J. A.; Yuen, D. A.; Rudolph, M.

    2005-12-01

    Seismic structures with strong positive and negative velocity anomalies in the mantle wedge above subduction zones have been interpreted as thermally and/or chemically induced phenomena. We have developed a thermal-chemical model of subduction, which constrains the dynamics of seismic velocity structure beneath volcanic arcs. Our simulations have been calculated over a finite-difference grid with (201×101) to (201×401) regularly spaced Eulerian points, using 0.5 million to 10 billion markers. The model couples a numerical thermo-mechanical solution with Gibbs energy minimization to investigate the dynamic behavior of partially molten upwellings from slabs (cold plumes) and structures associated with their development. The model demonstrates two chemically distinct types of plumes (mixed and unmixed), and various rigid body rotation phenomena in the wedge (subduction wheel, fore-arc spin, wedge pin-ball). These thermal-chemical features strongly perturb seismic structure. Their occurrence is dependent on the age of the subducting slab and the rate of subduction. The model has been validated through a series of test cases and its results are consistent with a variety of geological and geophysical data. In contrast to models that attribute a purely thermal origin to mantle wedge seismic anomalies, the thermal-chemical model is able to simulate the strong variations of seismic velocity existing beneath volcanic arcs which are associated with the development of cold plumes. In particular, molten regions that form beneath volcanic arcs as a consequence of vigorous cold wet plumes are manifested by > 20% variations in the local Poisson ratio, as compared to variations of ~2% expected as a consequence of temperature variation within the mantle wedge.

  3. Empirical model development and validation with dynamic learning in the recurrent multilayer perception

    International Nuclear Information System (INIS)

    Parlos, A.G.; Chong, K.T.; Atiya, A.F.

    1994-01-01

    A nonlinear multivariable empirical model is developed for a U-tube steam generator using the recurrent multilayer perceptron network as the underlying model structure. The recurrent multilayer perceptron is a dynamic neural network, very effective in the input-output modeling of complex process systems. A dynamic gradient descent learning algorithm is used to train the recurrent multilayer perceptron, resulting in an order of magnitude improvement in convergence speed over static learning algorithms. In developing the U-tube steam generator empirical model, the effects of actuator, process, and sensor noise on the training and testing sets are investigated. Learning and prediction both appear very effective, despite the presence of training and testing set noise, respectively. The recurrent multilayer perceptron appears to learn the deterministic part of a stochastic training set, and it predicts approximately a moving average response. Extensive model validation studies indicate that the empirical model can substantially generalize (extrapolate), though online learning becomes necessary for tracking transients significantly different from the ones included in the training set and slowly varying U-tube steam generator dynamics. In view of the satisfactory modeling accuracy and the associated short development time, neural-network-based empirical models in some cases appear to provide a serious alternative to first-principles models. Caution, however, must be exercised because extensive on-line validation of these models is still warranted.
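The dynamic gradient descent idea, in which the gradient of the current output carries the recurrent dependence on past outputs, can be illustrated with a single recurrent neuron. This is a toy stand-in for the recurrent multilayer perceptron, not the authors' implementation:

```python
import math

def train_recurrent_neuron(xs, ys, epochs=200, lr=0.05):
    """Online (dynamic) gradient learning for y_t = tanh(w*y_{t-1} + u*x_t + b).
    The gradient traces dy_dw/dy_du/dy_db propagate the recurrent dependence
    on y_{t-1}, unlike a static learning rule that would ignore it."""
    w, u, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        y_prev, dy_dw, dy_du, dy_db = 0.0, 0.0, 0.0, 0.0
        for x, target in zip(xs, ys):
            y = math.tanh(w * y_prev + u * x + b)
            da = 1 - y * y  # derivative of tanh at the pre-activation
            # dynamic gradients: each carries a w * (previous trace) term
            dy_dw = da * (y_prev + w * dy_dw)
            dy_du = da * (x + w * dy_du)
            dy_db = da * (1 + w * dy_db)
            err = y - target
            w -= lr * err * dy_dw
            u -= lr * err * dy_du
            b -= lr * err * dy_db
            y_prev = y
    return w, u, b
```

Trained on a constant input/target pair, the neuron's steady-state output converges toward the target.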

  4. Development and validation of a new two-dimensional wake model for wind turbine wakes

    DEFF Research Database (Denmark)

    Tian, Linlin; Zhu, Wei Jun; Shen, Wen Zhong

    2015-01-01

    A new two-dimensional (2D) wake model is developed and validated in this article to predict the velocity and turbulence distribution in the wake of a wind turbine. Based on the classical Jensen wake model, this model further employs a cosine shape function to redistribute the spread of the wake deficit in the crosswind direction. Moreover, a variable wake decay rate is proposed to take into account both the ambient turbulence and the rotor-generated turbulence, different from the constant wake decay rate used in the Jensen model. The obtained results are compared to field measurements, wind tunnel experiments, and results of an advanced k-ω turbulence model as well as large eddy simulations. From the comparisons, it is found that the proposed new wake model gives a good prediction in terms of both shape and velocity amplitude of the wake deficit, especially in the far wake…
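A Jensen-type deficit with a cosine crosswind shape function, as described in this abstract, can be sketched as follows; the thrust coefficient, rotor diameter and wake decay rate below are assumed example values (the paper itself uses a variable decay rate rather than a constant one):

```python
import math

def jensen_cosine_deficit(x, y, D=80.0, Ct=0.8, k=0.075):
    """Normalized velocity deficit at downstream distance x and crosswind
    offset y (meters). D (rotor diameter), Ct (thrust coefficient) and
    k (wake decay rate) are assumed values, not the paper's."""
    rw = D / 2 + k * x  # linearly expanding wake radius, as in Jensen's model
    if abs(y) > rw:
        return 0.0      # outside the wake
    centre = (1 - math.sqrt(1 - Ct)) / (1 + 2 * k * x / D) ** 2
    # cosine redistribution: 1 at the centreline, 0 at the wake edge
    return centre * math.cos(math.pi * y / (2 * rw))
```

The centreline value recovers the classical top-hat Jensen deficit, while the cosine factor smooths it to zero at the wake edge.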

  5. Calibrating and validating a FE model for long-term behavior of RC beams

    Directory of Open Access Journals (Sweden)

    Tošić Nikola D.

    2014-01-01

    Full Text Available This study presents the research carried out in finding an optimal finite element (FE) model for calculating the long-term behavior of reinforced concrete (RC) beams. The multi-purpose finite element software DIANA was used. A benchmark test in the form of a simply supported beam loaded in four-point bending was selected for model calibration. The result was the choice of 3-node beam elements, a multi-directional fixed crack model with constant stress cut-off, nonlinear tension softening and constant shear retention, and a creep and shrinkage model according to CEB-FIP Model Code 1990. The model was then validated on 14 simply supported beams and 6 continuous beams. Good agreement was found with experimental results (within ±15%).

  6. A Component-Based Modeling and Validation Method for PLC Systems

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-05-01

    Full Text Available Programmable logic controllers (PLCs) are complex embedded systems that are widely used in industry. This paper presents a component-based modeling and validation method for PLC systems using the behavior-interaction-priority (BIP) framework. We designed a general system architecture and a component library for a type of device control system. The control software and the hardware environment were all modeled as BIP components. System requirements were formalized as monitors. Simulation was carried out to validate the system model. A realistic gates control system from industry was employed to illustrate our strategies. We found a couple of design errors during the simulation, which helped us to improve the dependability of the original systems. The experimental results demonstrated the effectiveness of our approach.
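The idea of formalizing requirements as monitors checked during simulation can be sketched generically; the components and the gate scenario below are invented for illustration and are not the paper's BIP models:

```python
def simulate(steps, plant, controller, monitor):
    """Toy component-based simulation loop: an environment component, a
    control component, and a requirement formalized as a monitor that is
    checked at every step. All names and behaviors are invented."""
    state = {"train_near": False, "gate_open": True}
    violations = 0
    for t in range(steps):
        plant(t, state)         # environment component updates the world
        controller(state)       # control component reacts to the state
        if not monitor(state):  # the formalized requirement
            violations += 1
    return violations

def plant(t, state):            # a train approaches periodically
    state["train_near"] = t % 5 < 2

def controller(state):          # close the gate whenever a train is near
    state["gate_open"] = not state["train_near"]

def monitor(state):             # requirement: never open while a train is near
    return not (state["train_near"] and state["gate_open"])
```

Running the correct controller yields zero violations; substituting a faulty controller (e.g. one that leaves the gate open) makes the monitor flag every unsafe step, which is how simulation surfaces design errors.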

  7. Numerical modeling and preliminary validation of drag-based vertical axis wind turbine

    Directory of Open Access Journals (Sweden)

    Krysiński Tomasz

    2015-03-01

    Full Text Available The main purpose of this article is to verify and validate the mathematical description of the airflow around a wind turbine with a vertical axis of rotation, which could be considered representative for this type of device. Mathematical modeling of the airflow around wind turbines, in particular those with a vertical axis, is problematic due to the complex nature of this highly swirled flow. Moreover, the flow is turbulent and accompanied by rotor rotation and dynamic boundary-layer separation. In such conditions, the key aspects of the mathematical model are an accurate turbulence description, the definition of circular motion together with accompanying effects such as the centrifugal and Coriolis forces, and the parameters of spatial and temporal discretization. The paper presents the impact of different simulation parameters on the obtained results of the wind turbine simulation. The analysed models have been validated against experimental data published in the literature.

  8. Validation of the USAWC Student War Gaming Model.

    Science.gov (United States)

    1983-05-18

    The validity of the model was assessed by comparing the model results with the actual historical events. Selected battles of the 1973 Arab-Israeli War were used in the study. The battle period was 15-18 October, during which time the Israelis moved to cross the Suez…

  9. Validity of "Hi_Science" as instructional media based-android refer to experiential learning model

    Science.gov (United States)

    Qamariah, Jumadi, Senam, Wilujeng, Insih

    2017-08-01

    Hi_Science is an Android-based instructional medium for learning science on the topics of environmental pollution and global warming. This study aims: (a) to show the display of Hi_Science as it will be applied in junior high school, and (b) to describe the validity of Hi_Science. Hi_Science was created by combining an innovative learning model with current technology: the Android platform was selected and combined with the experiential learning model as the innovative learning model. Hi_Science adapted the student worksheet by Taufiq (2015), which had been rated very good by two expert lecturers and two science teachers. This worksheet was refined and redeveloped as an Android-based instructional medium that students can use for learning science not only in the classroom but also at home. Therefore, the worksheet, having become Android-based instructional media, had to be validated again. Hi_Science was validated by two experts, based on an assessment of material aspects and media aspects; data were collected with a media assessment instrument. The results showed that the material aspects obtained an average value of 4.72 with a percentage of agreement of 96.47%, meaning that Hi_Science is in the excellent (very valid) category on the material aspects. The media aspects obtained an average value of 4.53 with a percentage of agreement of 98.70%, meaning that Hi_Science is in the excellent (very valid) category on the media aspects. It was concluded that Hi_Science can be applied as an instructional medium in junior high school.

  10. A comparison of internal model validation methods for multifactor dimensionality reduction in the case of genetic heterogeneity.

    Science.gov (United States)

    Gory, Jeffrey J; Sweeney, Holly C; Reif, David M; Motsinger-Reif, Alison A

    2012-11-05

    Determining the genes responsible for certain human traits can be challenging when the underlying genetic model takes a complicated form such as heterogeneity (in which different genetic models can result in the same trait) or epistasis (in which genes interact with other genes and the environment). Multifactor Dimensionality Reduction (MDR) is a widely used method that effectively detects epistasis; however, it does not perform well in the presence of heterogeneity partly due to its reliance on cross-validation for internal model validation. Cross-validation allows for only one "best" model and is therefore inadequate when more than one model could cause the same trait. We hypothesize that another internal model validation method known as a three-way split will be better at detecting heterogeneity models. In this study, we test this hypothesis by performing a simulation study to compare the performance of MDR to detect models of heterogeneity with the two different internal model validation techniques. We simulated a range of disease models with both main effects and gene-gene interactions with a range of effect sizes. We assessed the performance of each method using a range of definitions of power. Overall, the power of MDR to detect heterogeneity models was relatively poor, especially under more conservative (strict) definitions of power. While the overall power was low, our results show that the cross-validation approach greatly outperformed the three-way split approach in detecting heterogeneity. This would motivate using cross-validation with MDR in studies where heterogeneity might be present. These results also emphasize the challenge of detecting heterogeneity models and the need for further methods development.
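Cross-validation consistency, the internal validation step that forces MDR to report a single "best" model, can be sketched as follows; the fold logic is generic, and `fit` is a hypothetical stand-in for MDR's per-fold model search:

```python
import random

def cv_best_model(data, fit, k=10, seed=0):
    """Sketch of cross-validation-consistency internal validation: fit a
    model on each of k training splits and report the model chosen most
    often, with its consistency count. Reporting only one winner is
    exactly why this struggles when heterogeneity implies two true models.
    `fit` is a placeholder for the per-fold model search."""
    rng = random.Random(seed)
    data = data[:]
    rng.shuffle(data)
    folds = [data[i::k] for i in range(k)]
    picks = []
    for i in range(k):
        train = [row for j, fold in enumerate(folds) if j != i for row in fold]
        picks.append(fit(train))  # best model found on this training split
    best = max(set(picks), key=picks.count)
    return best, picks.count(best)
```

A three-way split would instead reserve a third partition to choose among candidate models before final testing, rather than relying on the consistency count alone.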

  11. MODELING OF MTJ AND ITS VALIDATION USING NANOSCALE MRAM BITCELL

    Directory of Open Access Journals (Sweden)

    CHANDRAMAULESHWAR ROY

    2017-06-01

    Full Text Available Magnetic Tunnel Junction (MTJ) is a promising candidate for nonvolatile and low-power memory design, and is the basic building block of the STT-MRAM bitcell. We develop a Verilog-A-based behavioral model of the MTJ which effectively exhibits the electrical characteristics of an MTJ with a very low switching current (27.2 µA for parallel-to-antiparallel and 19.2 µA for antiparallel-to-parallel switching). To verify the versatility of the proposed model, we have employed it to design an MTJ-based MRAM bitcell. Simulation results (read margin, write margin and variability analysis) of the MTJ-MRAM bitcell demonstrate the importance of our proposed model.
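The switching behavior quoted in the abstract (27.2 µA and 19.2 µA critical currents) can be mimicked with a minimal behavioral sketch. The paper's model is written in Verilog-A, so this Python version only illustrates the two-state switching logic, not the actual electrical model:

```python
def mtj_step(state, current_uA, ic_p2ap=27.2, ic_ap2p=19.2):
    """Toy behavioral MTJ: 'P' (parallel) switches to 'AP' when the drive
    current exceeds ic_p2ap; 'AP' switches back to 'P' when the reverse
    current exceeds ic_ap2p. Thresholds are the abstract's quoted values;
    the sign convention (positive = P-to-AP direction) is assumed."""
    if state == "P" and current_uA >= ic_p2ap:
        return "AP"
    if state == "AP" and -current_uA >= ic_ap2p:
        return "P"
    return state  # below threshold: no switching
```

Driving the junction above either threshold flips the state; sub-threshold currents leave it unchanged, which is what a read operation relies on.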

  12. Comparative Validation of Realtime Solar Wind Forecasting Using the UCSD Heliospheric Tomography Model

    Science.gov (United States)

    MacNeice, Peter; Taktakishvili, Alexandra; Jackson, Bernard; Clover, John; Bisi, Mario; Odstrcil, Dusan

    2011-01-01

    The University of California, San Diego 3D Heliospheric Tomography Model reconstructs the evolution of heliospheric structures, and can make forecasts of solar wind density and velocity up to 72 hours in the future. The latest model version, installed and running in realtime at the Community Coordinated Modeling Center (CCMC), analyzes scintillations of meter-wavelength radio point sources recorded by the Solar-Terrestrial Environment Laboratory (STELab) together with realtime measurements of solar wind speed and density recorded by the Advanced Composition Explorer (ACE) Solar Wind Electron Proton Alpha Monitor (SWEPAM). The solution is reconstructed using tomographic techniques and a simple kinematic wind model. Since installation, the CCMC has been recording the model forecasts and comparing them with ACE measurements, and with forecasts made using other heliospheric models hosted by the CCMC. We report the preliminary results of this validation work and comparison with alternative models.
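A constant-speed kinematic propagation step, in the spirit of the simple kinematic wind model mentioned in the abstract, can be sketched as follows; the starting radius and speed are example inputs, not model output:

```python
AU_KM = 1.496e8  # one astronomical unit in kilometers

def ballistic_arrival(r0_au, v_kms, t0_hours=0.0):
    """Illustrative constant-speed (ballistic) propagation: the time at
    which a solar wind parcel starting at heliocentric distance r0_au,
    moving radially at v_kms, reaches 1 AU. A real kinematic model also
    handles stream interactions; this is only the simplest limit."""
    travel_s = (1.0 - r0_au) * AU_KM / v_kms
    return t0_hours + travel_s / 3600.0
```

A 400 km/s parcel launched from the Sun takes roughly 104 hours to reach Earth, which is why such models can forecast up to about 72 hours ahead from mid-heliosphere reconstructions.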

  13. Analytical Performance Modeling and Validation of Intel’s Xeon Phi Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali; Vishwanath, Venkatram; Kumaran, Kalyan

    2017-01-01

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel’s second-generation Xeon Phi architecture code-named Knights Landing (KNL) for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project the performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.
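An analytical hardware model of this kind bounds runtime by compute and memory-traffic terms; a minimal roofline-style sketch, with assumed peak numbers rather than Intel's published KNL figures:

```python
def roofline_time(flops, bytes_moved, peak_gflops=3000.0, bw_gbs=400.0):
    """Roofline-style analytical projection: execution time is bounded by
    the slower of the compute term and the memory-traffic term. The peak
    compute and bandwidth figures are assumed, not KNL specifications."""
    t_compute = flops / (peak_gflops * 1e9)   # seconds if compute-bound
    t_memory = bytes_moved / (bw_gbs * 1e9)   # seconds if memory-bound
    return max(t_compute, t_memory)
```

Comparing such projections against measured runtimes is the kind of validation that yields the 10% to 20% prediction errors reported above, and the binding term (compute vs. memory) suggests which code transformation to try.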

  14. Microplasticity of MMC. Experimental results and modelling

    Energy Technology Data Exchange (ETDEWEB)

    Maire, E. (Groupe d' Etude de Metallurgie Physique et de Physique des Materiaux, INSA, 69 Villeurbanne (France)); Lormand, G. (Groupe d' Etude de Metallurgie Physique et de Physique des Materiaux, INSA, 69 Villeurbanne (France)); Gobin, P.F. (Groupe d' Etude de Metallurgie Physique et de Physique des Materiaux, INSA, 69 Villeurbanne (France)); Fougeres, R. (Groupe d' Etude de Metallurgie Physique et de Physique des Materiaux, INSA, 69 Villeurbanne (France))

    1993-11-01

    The microplastic behavior of several MMCs is investigated by means of tension and compression tests. This behavior is asymmetric: the proportional limit is higher in tension than in compression, but the work hardening rate is higher in compression. These differences are analysed in terms of the maximum of the Tresca shear stress at the interface (proportional limit) and of the emission of dislocation loops during cooling (work hardening rate). On the other hand, a model is proposed to calculate the value of the yield stress, describing the composite as a material composed of three phases: the inclusion, the unaffected matrix, and the matrix surrounding the inclusion, which has a gradient in the density of thermally induced dislocations. (orig.)

  15. The development and validation of a thermal model for the cabin of a vehicle

    International Nuclear Information System (INIS)

    Marcos, David; Pino, Francisco J.; Bordons, Carlos; Guerra, José J.

    2014-01-01

    Energy management in modern vehicles is a crucial issue, especially in the case of electric vehicles (EV) or hybrid vehicles (HV), in which different energy sources and loads must be considered for the operation of a vehicle. Air conditioning is an important load that must be thoroughly analysed because it can constitute a considerable percentage of the energy demand. In this paper, a simplified and dynamic thermal model for the cabin of a vehicle is proposed and validated. The developed model can be used for the design and testing of the heating, ventilation, and air conditioning (HVAC) system of a vehicle and for the study of its effects on the performance and fuel consumption of vehicles, such as EVs or HVs. The model is based on theoretical heat transfer, thermal inertia, and radiation treatment equations. The model results obtained from simulations are compared with the cabin air temperature of a vehicle under different conditions. This comparison demonstrates the agreement between the simulation results and the actual measurements. - Highlights: •A thermal model of a vehicle cabin with two thermal inertias is developed. •The model is validated with experimental data. •The simulation results and the experimental data fit

  16. Hydroforming Of Patchwork Blanks — Numerical Modeling And Experimental Validation

    Science.gov (United States)

    Lamprecht, Klaus; Merklein, Marion; Geiger, Manfred

    2005-08-01

    In comparison to the commonly applied technology of tailored blanks the concept of patchwork blanks offers a number of additional advantages. Potential application areas for patchwork blanks in automotive industry are e.g. local reinforcements of automotive closures, structural reinforcements of rails and pillars as well as shock towers. But even if there is a significant application potential for patchwork blanks in automobile production, industrial realization of this innovative technique is decelerated due to a lack of knowledge regarding the forming behavior and the numerical modeling of patchwork blanks. Especially for the numerical simulation of hydroforming processes, where one part of the forming tool is replaced by a fluid under pressure, advanced modeling techniques are required to ensure an accurate prediction of the blanks' forming behavior. The objective of this contribution is to provide an appropriate model for the numerical simulation of patchwork blanks' forming processes. Therefore, different finite element modeling techniques for patchwork blanks are presented. In addition to basic shell element models a combined finite element model consisting of shell and solid elements is defined. Special emphasis is placed on the modeling of the weld seam. For this purpose the local mechanical properties of the weld metal, which have been determined by means of Martens-hardness measurements and uniaxial tensile tests, are integrated in the finite element models. The results obtained from the numerical simulations are compared to experimental data from a hydraulic bulge test. In this context the focus is laid on laser- and spot-welded patchwork blanks.

  17. Cultural adaptation and validation of an instrument on barriers for the use of research results.

    Science.gov (United States)

    Ferreira, Maria Beatriz Guimarães; Haas, Vanderlei José; Dantas, Rosana Aparecida Spadoti; Felix, Márcia Marques Dos Santos; Galvão, Cristina Maria

    2017-03-02

    The aim was to culturally adapt The Barriers to Research Utilization Scale and to analyze the metric validity and reliability properties of its Brazilian Portuguese version. This methodological research was conducted by means of the cultural adaptation process (translation and back-translation), face and content validity, construct validity (dimensionality and known groups) and reliability analysis (internal consistency and test-retest). The sample consisted of 335 nurses, of whom 43 participated in the retest phase. The validity of the adapted version of the instrument was confirmed. The scale investigates the barriers to the use of research results in clinical practice. Confirmatory factor analysis demonstrated that the Brazilian Portuguese version of the instrument adequately fits the dimensional structure the scale authors originally proposed. Statistically significant differences were observed among nurses holding a Master's or Doctoral degree, with characteristics favorable to Evidence-Based Practice, and working at an institution with an organizational culture that targets this approach. The reliability showed a strong correlation (r ranging between 0.77 and 0.84, p…

  18. Validity of Self-Reported Birth Weight: Results from a Norwegian Twin Sample.

    Science.gov (United States)

    Nilsen, Thomas S; Kutschke, Julia; Brandt, Ingunn; Harris, Jennifer R

    2017-10-01

    The association between birth weight and later-life outcomes is of considerable interest in life-course epidemiology. Research often relies on self-reported measures of birth weight, and its validity is consequently of importance. We assessed agreement between self-reported birth weight and official birth records for Norwegian twins born 1967-1974. The intraclass correlation between self-reported birth weight and register-based birth weight was 0.91 in our final sample of 363 twins. It could be expected that 95% of self-reported birth-weight values will deviate from official records by between +446 grams and -478 grams around a mean deviation of 16 grams. Self-reported birth weight had a sensitivity of 0.78-0.89, a positive predictive value of 0.59-0.85, and an overall weighted kappa of 0.71. We further assessed agreement by conducting two linear regression models in which we respectively regressed self-reported birth weight and register-based birth weight on adult body mass index, a known association. The two models were not significantly different; however, there were different levels of significance in parameter estimates that warrant some caution in using self-reported birth weight. Reliability of self-reported birth weight was also assessed, based on self-reports in another sample of twins born 1935-1960 who had reported their birth weight in two questionnaires 34 years apart. The intraclass correlation was 0.86, which indicates a high degree of reliability. In conclusion, self-reported birth weight, depending on context and age when birth weight was reported, can be cautiously used.
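The "95% of values within +446/-478 grams around a mean deviation of 16 grams" statement is a Bland-Altman-style limits-of-agreement calculation, which can be sketched as follows (the birth-weight values below are invented, not the study's data):

```python
import statistics

def limits_of_agreement(self_report, register):
    """Bland-Altman-style 95% limits of agreement: the mean of the paired
    differences plus/minus 1.96 standard deviations of those differences.
    Input lists hold paired measurements (e.g. grams)."""
    diffs = [s - r for s, r in zip(self_report, register)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample standard deviation
    return mean_d - 1.96 * sd_d, mean_d, mean_d + 1.96 * sd_d
```

The asymmetry of the reported limits (+446 vs. -478) around the small mean deviation reflects the spread of the individual differences, not a systematic bias.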

  19. Experimental investigations and validation of two dimensional model for multistream plate fin heat exchangers

    Science.gov (United States)

    Goyal, Mukesh; Chakravarty, Anindya; Atrey, M. D.

    2017-03-01

    Experimental investigations are carried out using a specially developed three-layer plate fin heat exchanger (PFHE), with helium as the working fluid cooled to cryogenic temperatures using liquid nitrogen (LN2) as a coolant. These results are used for validation of a previously proposed and reported numerical model based on finite volume analysis of multistream (MS) PFHEs for cryogenic applications (Goyal et al., 2014). The results from the experiments are presented, and reasonable agreement is observed with the reported numerical model.
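While the reported model is a finite-volume multistream solver, the two-stream limit that any such model must reproduce is the classic ε-NTU counterflow relation, sketched here as a sanity check rather than as the paper's method:

```python
import math

def counterflow_effectiveness(NTU, Cr):
    """Classic effectiveness-NTU relation for a two-stream counterflow heat
    exchanger, with Cr the heat-capacity-rate ratio (Cmin/Cmax). Used here
    only as a simplified stand-in for the multistream finite-volume model."""
    if abs(Cr - 1.0) < 1e-12:
        return NTU / (1 + NTU)  # balanced-flow limit
    e = math.exp(-NTU * (1 - Cr))
    return (1 - e) / (1 - Cr * e)
```

At large NTU the effectiveness approaches 1, the counterflow ideal that makes PFHEs attractive for cryogenic duty.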

  20. Empirical Validation of a Thermal Model of a Complex Roof Including Phase Change Materials

    Directory of Open Access Journals (Sweden)

    Stéphane Guichard

    2015-12-01

    Full Text Available This paper deals with the empirical validation of a building thermal model of a complex roof including a phase change material (PCM). A mathematical model dedicated to PCMs, based on the apparent heat capacity method, was implemented in a multi-zone building simulation code, the aim being to increase the understanding of the thermal behavior of the whole building with PCM technologies. In order to empirically validate the model, the methodology is based on both numerical and experimental studies. A parametric sensitivity analysis was performed and a set of parameters of the thermal model was identified for optimization. The use of the generic optimization program GenOpt®, coupled to the building simulation code, made it possible to determine the set of adequate parameters. We first present the empirical validation methodology and the main results of previous work. We then give an overview of GenOpt® and its coupling with the building simulation code. Finally, once the optimization results are obtained, comparisons of the thermal predictions with measurements are found to be acceptable and are presented.
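The apparent heat capacity method referenced above folds the latent heat of the PCM into an effective, temperature-dependent heat capacity over the melting range; a minimal sketch with assumed material parameters (not the paper's roof PCM):

```python
def apparent_heat_capacity(T, cp_solid=2000.0, cp_liquid=2000.0,
                           latent=200e3, T_melt=25.0, dT=1.0):
    """Apparent heat capacity in J/(kg*K): the latent heat `latent` (J/kg)
    is spread uniformly over the melting range [T_melt - dT, T_melt + dT].
    All material parameter values here are assumed for illustration."""
    if T < T_melt - dT:
        return cp_solid
    if T > T_melt + dT:
        return cp_liquid
    # inside the melting range: interpolated sensible part + latent spike
    frac = (T - (T_melt - dT)) / (2 * dT)
    return cp_solid + frac * (cp_liquid - cp_solid) + latent / (2 * dT)
```

A conduction solver can then use this cp(T) directly, so melting appears as a large capacity spike around T_melt instead of requiring explicit phase-front tracking.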

  1. Description and Validation of a MATLAB - Simulink Single Family House Energy Model with Furniture and Phase Change Materials (Update)

    DEFF Research Database (Denmark)

    Johra, Hicham; Heiselberg, Per Kvols

    This report aims to present in detail the numerical building model and each of its elements. In the second part, the results of different validation tests are presented to certify the reliability of the model and thus of the results of numerical analyses using it.

  2. Validation of Code ASTEC with LIVE-L1 Experimental Results

    International Nuclear Information System (INIS)

    Bachrata, Andrea

    2008-01-01

    Severe accidents with core melting are considered at the design stage of Generation 3+ Nuclear Power Plants (NPP). Moreover, there is an effort to apply severe accident management to operating NPPs. One of the main goals of severe accident mitigation is corium localization and stabilization. The two strategies that fulfil this requirement are in-vessel retention (e.g. AP-600, AP-1000) and ex-vessel retention (e.g. EPR). To study the scenario of in-vessel retention, a large experimental program and integrated codes have been developed. The LIVE-L1 experimental facility studied the formation of melt pools and the melt accumulation in the lower head under different cooling conditions. A new European computer code, ASTEC, is being developed jointly in France and Germany. One of the important steps in ASTEC development in the area of in-vessel retention of corium is its validation against the LIVE-L1 experimental results. Details of the experiment are reported. Results of the application of ASTEC (module DIVA) to the analysis of the test are presented. (author)

  3. Assessing Religious Orientations: Replication and Validation of the Commitment-Reflectivity Circumplex (CRC Model

    Directory of Open Access Journals (Sweden)

    Steven L. Isaak

    2017-09-01

    Full Text Available The Commitment-Reflectivity Circumplex (CRC) model is a structural model of religious orientation that was designed to help organize and clarify measurement of foundational aspects of religiousness. The current study successfully replicated the CRC model using multidimensional scaling, and further evaluated the reliability, structure, and validity of its measures in both a university student sample (Study 1) and a nationally representative sample (Study 2). All 10 subscales of the Circumplex Religious Orientation Inventory (CROI) demonstrated good reliability across both samples. A two-week test-retest of the CROI showed that the subscales are stable over time. A confirmatory factor analysis of the CROI in the representative adult sample demonstrated good model fit. Finally, the CROI’s validity was examined in relation to the Intrinsic, Extrinsic and Quest measures. Overall, the CROI appears to clarify much of the ambiguity inherent in the established scales by breaking down what were very broad orientations into very specific suborientations. The results suggest that the CRC model is applicable to diverse populations of adults. In addition, the CROI appears to be construct valid with good structural and psychometric properties across all 10 subscales.

  4. The European Integrated Tokamak Modelling (ITM) effort: achievements and first physics results

    NARCIS (Netherlands)

    G.L. Falchetto,; Coster, D.; Coelho, R.; Scott, B. D.; Figini, L.; Kalupin, D.; Nardon, E.; Nowak, S.; L.L. Alves,; Artaud, J. F.; Basiuk, V.; João P.S. Bizarro,; C. Boulbe,; Dinklage, A.; Farina, D.; B. Faugeras,; Ferreira, J.; Figueiredo, A.; Huynh, P.; Imbeaux, F.; Ivanova-Stanik, I.; Jonsson, T.; H.-J. Klingshirn,; Konz, C.; Kus, A.; Marushchenko, N. B.; Pereverzev, G.; M. Owsiak,; Poli, E.; Peysson, Y.; R. Reimer,; Signoret, J.; Sauter, O.; Stankiewicz, R.; Strand, P.; Voitsekhovitch, I.; Westerhof, E.; T. Zok,; Zwingmann, W.; ITM-TF contributors,; ASDEX Upgrade team,; JET-EFDA Contributors,

    2014-01-01

    A selection of achievements and first physics results are presented of the European Integrated Tokamak Modelling Task Force (EFDA ITM-TF) simulation framework, which aims to provide a standardized platform and an integrated modelling suite of validated numerical codes for the simulation and

  5. On the verification and validation of detonation models

    Science.gov (United States)

    Quirk, James

    2013-06-01

    This talk will consider the verification and validation of detonation models, such as Wescott-Stewart-Davis (Journal of Applied Physics, 2005), from the perspective of the American Institute of Aeronautics and Astronautics policy on numerical accuracy (AIAA J., Vol. 36, No. 1, 1998). A key aspect of the policy is that accepted documentation procedures must be used for journal articles, with the aim of allowing the reported work to be reproduced by the interested reader. With the rise of electronic documents since the policy was formulated, it is now possible to satisfy this mandate in its strictest sense: that is, it is now possible to run a computational verification study directly in a PDF, thereby allowing a technical author to report numerical subtleties that traditionally have been ignored. The motivation for this document-centric approach is discussed elsewhere (Quirk 2003, Adaptive Mesh Refinement: Theory and Practice, Springer), leaving the talk to concentrate on specific detonation examples that should be of broad interest to the shock-compression community.

  6. Validation of an employee satisfaction model: A structural equation model approach

    OpenAIRE

    Ophillia Ledimo; Nico Martins

    2015-01-01

    The purpose of this study was to validate an employee satisfaction model and to determine the relationships between the different dimensions of the concept, using the structural equation modelling approach (SEM). A cross-sectional quantitative survey design was used to collect data from a random sample of (n=759) permanent employees of a parastatal organisation. Data was collected using the Employee Satisfaction Survey (ESS) to measure employee satisfaction dimensions. Following the steps of ...

  7. Predictive validity of a non-induced mouse model of compulsive-like behavior.

    Science.gov (United States)

    Greene-Schloesser, D M; Van der Zee, E A; Sheppard, D K; Castillo, M R; Gregg, K A; Burrow, T; Foltz, H; Slater, M; Bult-Ito, A

    2011-08-01

    A key to advancing the understanding of obsessive-compulsive disorder (OCD)-like symptoms is the development of spontaneous animal models. Over 55 generations of bidirectional selection for nest-building behavior in house mice, Mus musculus, resulted in a 40-fold difference in the amount of cotton used for a nest in high (BIG) and low (SMALL) selected lines. The nesting behavior of BIG mice appears to be compulsive-like and has initial face validity as an animal model for OCD in humans. Compulsive-like digging behavior was assessed; BIG male mice buried about three times as many marbles as SMALL male mice, strengthening face validity. Using the open field and elevated plus maze, SMALL male mice showed higher levels of anxiety/fear-like behavior than BIG male mice, indicating that compulsive-like and not anxiety-like behavior was measured. To establish predictive validity, chronic (4 weeks) oral administration of fluoxetine (30, 50 and 100 mg/kg/day) and clomipramine (80 mg/kg/day), both effective in treating OCD, significantly reduced compulsive-like nest-building behavior in BIG male mice. Compulsive-like digging behavior was also significantly reduced by chronic oral fluoxetine (30 and 80 mg/kg/day) treatment in BIG male mice. General locomotor activity was not affected by chronic oral fluoxetine (30 and 80 mg/kg/day) treatment; chronic oral treatment with desipramine (30 mg/kg/day), an antidepressant not effective in treating OCD, had no effect on nesting behavior of BIG male mice, strengthening predictive validity. Together, the results indicate that these mice have good face and predictive validity as a non-induced mouse model of compulsive-like behavior relevant to OCD. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Predictive Validity of a Non-Induced Mouse Model of Compulsive-Like Behavior

    Science.gov (United States)

    Greene-Schloesser, DM; Van der Zee, EA; Sheppard, DK; Castillo, MR; Gregg, KA; Burrow, T; Foltz, H; Slater, M; Bult-Ito, A

    2011-01-01

    A key to advancing the understanding of obsessive-compulsive disorder (OCD)-like symptoms is the development of spontaneous animal models. Over 55 generations of bidirectional selection for nest-building behavior in house mice, Mus musculus, resulted in a 40-fold difference in the amount of cotton used for a nest in high (BIG) and low (SMALL) selected lines. The nesting behavior of BIG mice appears to be compulsive-like and has initial face validity as an animal model for OCD in humans. Compulsive-like digging behavior was assessed; BIG male mice buried about three times as many marbles as SMALL male mice, strengthening face validity. Using the open field and elevated plus maze, SMALL male mice showed higher levels of anxiety/fear-like behavior than BIG male mice, indicating that compulsive-like and not anxiety-like behavior was measured. To establish predictive validity, chronic (4 weeks) oral administration of fluoxetine (30, 50 and 100 mg/kg/day) and clomipramine (80 mg/kg/day), both effective in treating OCD, significantly reduced compulsive-like nest-building behavior in BIG male mice. Compulsive-like digging behavior was also significantly reduced by chronic oral fluoxetine (30 and 80 mg/kg/day) treatment in BIG male mice. General locomotor activity was not affected by chronic oral fluoxetine (30 and 80 mg/kg/day) treatment; chronic oral treatment with desipramine (30 mg/kg/day), an antidepressant not effective in treating OCD, had no effect on nesting behavior of BIG male mice, strengthening predictive validity. Together, the results indicate that these mice have good face and predictive validity as a non-induced mouse model of compulsive-like behavior relevant to OCD. PMID:21316394

  9. IMPLEMENTATION AND VALIDATION OF A FULLY IMPLICIT ACCUMULATOR MODEL IN RELAP-7

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Haihua [Idaho National Laboratory; Zou, Ling [Idaho National Laboratory; Zhang, Hongbin [Idaho National Laboratory; Martineau, Richard Charles [Idaho National Laboratory

    2016-01-01

    This paper presents the implementation and validation of an accumulator model in RELAP-7 under the framework of the preconditioned Jacobian-free Newton-Krylov (JFNK) method, based on the similar model used in RELAP5. RELAP-7 is a new nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). RELAP-7 is a fully implicit system code. The JFNK and preconditioning methods used in RELAP-7 are briefly discussed. The slightly modified accumulator model is summarized for completeness. The implemented model was validated against the LOFT L3-1 test and benchmarked against RELAP5 results. RELAP-7 and RELAP5 produced almost identical results for the accumulator gas pressure and water level, although there were some minor differences in other parameters such as accumulator gas temperature and tank wall temperature. One advantage of the JFNK method is the ease of maintaining and modifying models, owing to the full separation of numerical methods from physical models. It would be straightforward to extend the current RELAP-7 accumulator model to simulate advanced accumulator designs.
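The Jacobian-free core of the JFNK method is compact enough to sketch: Newton's method solves J(u)δ = −F(u), but the Krylov solver only ever needs matrix-vector products J·v, which can be approximated by one extra residual evaluation instead of forming J. A minimal sketch with a toy residual (the system and step size are illustrative, not the RELAP-7 equations):

```python
import numpy as np

def residual(u):
    # Toy nonlinear system F(u) = 0, a stand-in for discretized balance equations
    return np.array([u[0]**2 + u[1] - 3.0,
                     u[0] + u[1]**2 - 5.0])

def jfnk_matvec(F, u, v, eps=1e-7):
    # Jacobian-free product: J(u) @ v ~= (F(u + eps*v) - F(u)) / eps
    return (F(u + eps * v) - F(u)) / eps

u = np.array([1.0, 2.0])
v = np.array([1.0, -1.0])

# Analytic Jacobian at u, for comparison only
J = np.array([[2.0 * u[0], 1.0],
              [1.0, 2.0 * u[1]]])

approx = jfnk_matvec(residual, u, v)
exact = J @ v
```

Because only residual evaluations are needed, the physical models stay cleanly separated from the nonlinear solver, which is the maintainability advantage the abstract points out.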

  10. PIV-validated numerical modeling of pulsatile flows in distal coronary end-to-side anastomoses.

    Science.gov (United States)

    Xiong, F L; Chong, C K

    2007-01-01

    This study employed particle image velocimetry (PIV) to validate a numerical model in a complementary approach to quantify hemodynamic factors in distal coronary anastomoses and to gain more insights on their relationship with anastomotic geometry. Instantaneous flow fields and wall shear stresses (WSS) were obtained from PIV measurement in a modified life-size silastic anastomosis model adapted from a conventional geometry by incorporating a smooth graft-artery transition. The results were compared with those predicted by a concurrent numerical model. The numerical method was then used to calculate cycle-averaged WSS (WSS(cyc)) and spatial wall shear stress gradient (SWSSG), two critical hemodynamic factors in the pathogenesis of intimal thickening (IT), to compare the conventional and modified geometries. Excellent qualitative agreement and satisfactory quantitative agreement with averaged normalized error in WSS between 0.8% and 8.9% were achieved between the PIV experiment and numerical model. Compared to the conventional geometry, the modified geometry produces a more uniform WSS(cyc) distribution eliminating both high and low WSS(cyc) around the toe, critical in avoiding IT. Peak SWSSG on the artery floor of the modified model is less than one-half that in the conventional case, and high SWSSG at the toe is eliminated. The validated numerical model is useful for modeling unsteady coronary anastomotic flows and elucidating the significance of geometry regulated hemodynamics. The results suggest the clinical relevance of constructing smooth graft-artery transition in distal coronary anastomoses to improve their hemodynamic performance.
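The cycle-averaged wall shear stress used as a hemodynamic factor here is the time average of the instantaneous WSS magnitude over one cardiac cycle, WSS_cyc = (1/T) ∫ |τ_w(t)| dt. A minimal numerical sketch for a single wall location (the waveform is synthetic, not a measured coronary pulse):

```python
import numpy as np

T = 0.8                                   # cardiac cycle period, s (illustrative)
t = np.linspace(0.0, T, 201)
# Synthetic pulsatile WSS signal at one wall point, Pa
tau_w = 1.5 + 0.8 * np.sin(2.0 * np.pi * t / T)

# Cycle-averaged WSS: (1/T) * integral of |tau_w| dt, trapezoidal rule
mag = np.abs(tau_w)
wss_cyc = float(np.sum(0.5 * (mag[1:] + mag[:-1]) * np.diff(t)) / T)
```

In a CFD post-processing step the same average would be taken at every wall node, giving the WSS_cyc distribution compared between the two anastomosis geometries.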

  11. Development, Validation and Parametric study of a 3-Year-Old Child Head Finite Element Model

    Science.gov (United States)

    Cui, Shihai; Chen, Yue; Li, Haiyan; Ruan, ShiJie

    2015-12-01

    Traumatic brain injury caused by drop and traffic accidents is an important reason for children's death and disability. Recently, the computer finite element (FE) head model has been developed to investigate brain injury mechanism and biomechanical responses. Based on CT data of a healthy 3-year-old child head, the FE head model with detailed anatomical structure was developed. The deep brain structures such as white matter, gray matter, cerebral ventricle, hippocampus, were firstly created in this FE model. The FE model was validated by comparing the simulation results with that of cadaver experiments based on reconstructing the child and adult cadaver experiments. In addition, the effects of skull stiffness on the child head dynamic responses were further investigated. All the simulation results confirmed the good biofidelity of the FE model.

  12. Prediction model and experimental validation for the thermal deformation of motorized spindle

    Science.gov (United States)

    Zhang, Lixiu; Li, Jinpeng; Wu, Yuhou; Zhang, Ke; Wang, Yawen

    2018-02-01

    The thermal deformation of a motorized spindle has a great influence on the precision of numerical control (NC) machine tools. Thus, it is crucial to predict the thermal deformation in the design and operation control phases by numerical simulation, and thereby improve the precision of NC machine tools. To achieve this, an accurate thermal deformation prediction model for the motorized spindle is required. In this paper, a model for predicting the thermal error of a motorized spindle, based on the finite element method and parameter optimization, is proposed. The Levenberg-Marquardt (LM) method is applied to optimize the heat transfer coefficient of the motorized spindle using measured surface temperature data. The optimized heat transfer coefficient is then taken as one of the boundary conditions of the finite element model. The thermal boundary conditions of the finite element model are obtained from an energy loss experiment. The proposed model is validated by experimental results, and the results show good correlation.
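The parameter-identification step can be illustrated in miniature: a Levenberg-Marquardt loop adjusts the heat transfer coefficient h until a thermal model reproduces measured surface temperatures. The lumped-capacitance cooling model and all constants below are illustrative stand-ins for the spindle finite element model:

```python
import numpy as np

def model_temps(h, t, T0=60.0, T_inf=20.0, mc_over_A=500.0):
    # Lumped-capacitance cooling: T(t) = T_inf + (T0 - T_inf) * exp(-h*t / (m*c/A))
    return T_inf + (T0 - T_inf) * np.exp(-h * t / mc_over_A)

t = np.linspace(0.0, 600.0, 30)
h_true = 25.0
T_meas = model_temps(h_true, t)          # synthetic "measured" temperatures

# Levenberg-Marquardt iteration for the single parameter h
h, lam = 5.0, 1e-3
for _ in range(50):
    r = T_meas - model_temps(h, t)       # residuals at current h
    dh = 1e-6 * max(abs(h), 1.0)
    J = (model_temps(h + dh, t) - model_temps(h, t)) / dh   # d(model)/dh
    step = (J @ r) / ((1.0 + lam) * (J @ J))                # damped Gauss-Newton step
    if np.sum((T_meas - model_temps(h + step, t))**2) < np.sum(r**2):
        h, lam = h + step, lam * 0.5     # accept step, reduce damping
    else:
        lam *= 10.0                      # reject step, increase damping
```

The recovered h would then be fed back into the FE model as a convection boundary condition, exactly the role the optimized coefficient plays in the paper.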

  13. Medical biomodelling in surgical applications: results of a multicentric European validation of 466 cases.

    Science.gov (United States)

    Wulf, J; Vitt, K D; Erben, C M; Bill, J S; Busch, L C

    2003-01-01

    The study started in September 1999 and ended in April 2002. It is based on a questionnaire [www.phidias.org] assessing case-related questions on the application of stereolithographic models. Each questionnaire contains over 50 items. These variables take into account diagnosis, indications and benefits of stereolithographic models with a view on different steps of the surgical procedure: preoperative planning, intraoperative application and overall outcome after surgical intervention. The questionnaires were completed by the surgeons who performed the operations. Over the time course of our multicentric study (30 months), we evaluated 466 cases. The study population consists of n=231 male and n=235 female patients. 54 surgeons from 9 European countries were involved. There are main groups of diagnoses that related to the use of a model. Most models were used in maxillofacial surgery. The operative planning may help to determine the resection line of a tumor and optimize reconstructive procedures. Correction of large calvarial defects can be simulated and implants can be produced preoperatively. Overall, in 58% of all cases a time-saving effect was reported. The study strongly suggests that medical modeling has utility in surgical specialities, especially in the craniofacial and maxillofacial area, and increasingly in the orthopedic field. According to our results, medical modeling optimizes preoperative surgical planning. Surgeons are enabled to perform realistic and interactive simulations. The fabrication of implants, their design and fit on the model, allows operation time to be reduced and, in consequence, the risk and cost of the operation. In addition, the understanding of volumetric data is improved, especially if medical models are combined with standard imaging modalities. Finally, surgeons are able to improve communication with their patients and colleagues.

  14. VALIDATION OF THEORETICAL MODEL FOR DECISION MAKING ABOUT E-LEARNING IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    Nina Begičević

    2006-12-01

    Full Text Available In the paper the possibility to use mathematical models and statistical techniques in strategic planning and decision making about e-learning is presented. Strategic planning and decision making has been covered as consisting of four phases: (1) intelligence, (2) design, (3) choice and (4) implementation. Each of the phases will be described in this paper, but the accent will be put on the statistical evaluation of the results of the questionnaire which was based on the developed theoretical model for decision making about e-learning implementation in the higher education sector. In general, the main objectives of this paper are: (1) validation of the first theoretical model for decision making about e-learning implementation in the higher education sector, by means of factor analysis, and (2) reduction of a large number of variables to a smaller number of factors, i.e. designing the improved theoretical model, for modelling purposes (developing AHP & ANP models).
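The variable-reduction objective (2) can be sketched in miniature: the correlation matrix of the questionnaire items is eigendecomposed and the number of retained factors chosen by the Kaiser criterion (eigenvalue > 1). The responses below are synthetic, and principal components stand in as a simple proxy for a full factor analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic questionnaire: 200 respondents x 6 items driven by 2 latent factors
latent = rng.normal(size=(200, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.85, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.85]])
items = latent @ loadings.T + 0.3 * rng.normal(size=(200, 6))

# Eigendecompose the correlation matrix of the items
R = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigh(R)[0][::-1]     # eigenvalues, sorted descending

# Kaiser criterion: retain components with eigenvalue > 1
n_factors = int(np.sum(eigvals > 1.0))
```

With two strong latent factors behind the six items, the criterion recovers two factors, mirroring how the paper collapses many questionnaire variables into the smaller factor set used for the AHP/ANP models.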

  15. Off-take Model of the SPACE Code and Its Validation

    International Nuclear Information System (INIS)

    Oh, Myung Taek; Park, Chan Eok; Sohn, Jong Joo

    2011-01-01

    Liquid entrainment and vapor pull-through models of a horizontal pipe have been implemented in the SPACE code. The model of SPACE accounts for the phase separation phenomena and computes the flux of mass and energy through an off-take attached to a horizontal pipe when stratified conditions occur in the horizontal pipe. This model is referred to as the off-take model. The importance of predicting the fluid conditions through an off-take in a small-break LOCA is well known. In this case, the occurrence of stratification can affect the break node void fraction and thus the break flow discharged from the primary system. In order to validate the off-take model newly developed for the SPACE code, a simulation of the HDU experiments has been performed. The main features of the off-take model and its application results are presented in this paper.

  16. An advanced simulation model for membrane bioreactors: development, calibration and validation.

    Science.gov (United States)

    Ludwig, T; Gaida, D; Keysers, C; Pinnekamp, J; Bongards, M; Kern, P; Wolf, C; Sousa Brito, A L

    2012-01-01

    Membrane wastewater treatment plants (WWTPs) have several advantages compared with conventionally designed WWTPs with classical purification techniques. The filtration process is the key to their commercial success in Germany with respect to energy consumption and effectiveness, enabled by the optimization of filtration using a dynamic simulation model. This work is focused on the development of a robust, flexible and practically applicable membrane simulation model for submerged hollow-fibre and flat-sheet membrane modules. The model is based on standard parameters usually measured on membrane WWTPs. The performance of the model is demonstrated by successful calibration and validation for three different full-scale membrane WWTPs achieving good results. Furthermore, the model is combinable with Activated Sludge Models.

  17. The PROMIS model to highlight the importance of the foetus to the validation of a pregnant woman model

    OpenAIRE

    AURIAULT, Florent; THOLLON, Lionel; PERES, Jérémie; DELOTTE, J; KAYVANTASH, K; BRUNET, Christian; BEHR, Michel

    2013-01-01

    Between 50% and 75% of trauma during pregnancy is related to road accidents. This type of trauma can result in premature birth or even foetal loss. To analyse and understand the injury mechanisms in pregnant women involved in a car accident, several studies have proposed computational or physical tools to simulate accidents. Specific dummy and numerical models have been proposed and validated using experimental data from post-mortem human surrogates (PMHS) scaled with the equal-stress...

  18. Estimating the predictive validity of diabetic animal models in rosiglitazone studies.

    Science.gov (United States)

    Varga, O E; Zsíros, N; Olsson, I A S

    2015-06-01

    For therapeutic studies, predictive validity of animal models - arguably the most important feature of animal models in terms of human relevance - can be calculated retrospectively by obtaining data on treatment efficacy from human and animal trials. Using rosiglitazone as a case study, we aim to determine the predictive validity of animal models of diabetes, by analysing which models perform most similarly to humans during rosiglitazone treatment in terms of changes in standard diabetes diagnosis parameters (glycosylated haemoglobin [HbA1c] and fasting glucose levels). A further objective of this paper was to explore the impact of four covariates on the predictive capacity: (i) diabetes induction method; (ii) drug administration route; (iii) sex of animals and (iv) diet during the experiments. Despite the variable consistency of animal species-based models with the human reference for glucose and HbA1c treatment effects, our results show that glucose and HbA1c treatment effects in rats agreed better with the expected values based on human data than in other species. Induction method was also found to be a substantial factor affecting animal model performance. The study concluded that regular reassessment of animal models can help to identify human relevance of each model and adapt research design for actual research goals. © 2015 World Obesity.

  19. Local Validation of Global Estimates of Biosphere Properties: Synthesis of Scaling Methods and Results Across Several Major Biomes

    Science.gov (United States)

    Cohen, Warren B.; Wessman, Carol A.; Aber, John D.; Vande Castle, John R.; Running, Steven W.

    1998-01-01

    To assist in validating future MODIS land cover, LAI, IPAR, and NPP products, this project conducted a series of prototyping exercises that resulted in enhanced understanding of the issues regarding such validation. As a result, we have several papers to appear as a special issue of Remote Sensing of Environment in 1999. Also, we have been successful at obtaining a follow-on grant to pursue actual validation of these products over the next several years. This document consists of a delivery letter, including a listing of published papers.

  20. Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems

    Science.gov (United States)

    2015-12-01

    engineering (SE) and systems architecture (SA) methods during the model development process (MDP). An MDP is used to ensure that the models are...validated and represent the real world as accurately as possible. There are several varieties of MDPs presented in the literature, but all share the...early in the MDP for face validation. A well-constructed CoM supports model exploration of NOS when operational validation is not feasible. This

  1. Methods and Results for a Global Precipitation Measurement (GPM) Validation Network Prototype

    Science.gov (United States)

    Morris, Kenneth R.; Schwaller, Mathew R.

    2010-01-01

    As one component of a ground validation system to meet requirements for the upcoming Global Precipitation Measurement (GPM) mission, a quasi-operational prototype of a system to compare satellite- and ground-based radar measurements has been developed. This prototype, the GPM Validation Network (VN), acquires data from the Precipitation Radar (PR) on the Tropical Rainfall Measuring Mission (TRMM) satellite and from ground radar (GR) networks in the continental U.S. and participating international sites. PR data serve as a surrogate for similar observations from the Dual-frequency Precipitation Radar (DPR) to be present on GPM. Primary goals of the VN prototype are to understand and characterize the variability and bias of precipitation retrievals between the PR and GR in various precipitation regimes at large scales, and to improve precipitation retrieval algorithms for the GPM instruments. The current VN capabilities concentrate on comparisons of the base reflectivity observations between the PR and GR, and include support for rain rate comparisons. The VN algorithm resamples PR and GR reflectivity and other 2-D and 3-D data fields to irregular common volumes defined by the geometric intersection of the instrument observations, and performs statistical comparisons of PR and GR reflectivity and estimated rain rates. Algorithmic biases and uncertainties introduced by traditional data analysis techniques are minimized by not performing interpolation or extrapolation of data to a fixed grid. The core VN dataset consists of WSR-88D GR data and matching PR orbit subset data covering 21 sites in the southeastern U.S., from August 2006 to the present. On average, about 3.5 overpass events per month for these WSR-88D sites meet VN criteria for significant precipitation and have matching PR and GR data available. This large statistical sample has allowed the relative calibration accuracy and stability of the individual ground radars, and the quality of the PR reflectivity

  2. Material model validation for laser shock peening process simulation

    International Nuclear Information System (INIS)

    Amarchinta, H K; Grandhi, R V; Langer, K; Stargel, D S

    2009-01-01

    Advanced mechanical surface enhancement techniques have been used successfully to increase the fatigue life of metallic components. These techniques impart deep compressive residual stresses into the component to counter potentially damage-inducing tensile stresses generated under service loading. Laser shock peening (LSP) is an advanced mechanical surface enhancement technique used predominantly in the aircraft industry. To reduce costs and make the technique available on a large-scale basis for industrial applications, simulation of the LSP process is required. Accurate simulation of the LSP process is a challenging task, because the process has many parameters such as laser spot size, pressure profile and material model that must be precisely determined. This work focuses on investigating the appropriate material model that could be used in simulation and design. In the LSP process material is subjected to strain rates of 10⁶ s⁻¹, which is very high compared with conventional strain rates. The importance of an accurate material model increases because the material behaves significantly differently at such high strain rates. This work investigates the effect of multiple nonlinear material models for representing the elastic–plastic behavior of materials. Elastic perfectly plastic, Johnson–Cook and Zerilli–Armstrong models are used, and the performance of each model is compared with available experimental results.
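Of the models compared, Johnson-Cook is the most common in practice; its flow stress multiplies a strain-hardening term, a strain-rate term and a thermal-softening term: sigma = (A + B*eps^n) * (1 + C*ln(eps_rate/eps_rate_ref)) * (1 - T*^m). A minimal sketch showing why the strain-rate term matters at LSP-like rates (the constants are illustrative, not calibrated values for any LSP alloy):

```python
import math

def johnson_cook_stress(eps_p, eps_rate, T,
                        A=350e6, B=280e6, n=0.3, C=0.02, m=1.0,
                        eps_rate_ref=1.0, T_room=293.0, T_melt=1800.0):
    """Johnson-Cook flow stress in Pa; all constants are illustrative."""
    strain_term = A + B * eps_p**n                            # strain hardening
    rate_term = 1.0 + C * math.log(eps_rate / eps_rate_ref)   # strain-rate hardening
    T_star = (T - T_room) / (T_melt - T_room)                 # homologous temperature
    thermal_term = 1.0 - T_star**m                            # thermal softening
    return strain_term * rate_term * thermal_term

# At LSP-like strain rates (~1e6 1/s) the rate term raises the flow stress noticeably
quasi_static = johnson_cook_stress(0.05, 1.0, 293.0)
lsp_rate = johnson_cook_stress(0.05, 1e6, 293.0)
```

The logarithmic rate term is precisely what an elastic perfectly plastic model omits, which is why the model choice matters so much at 10⁶ s⁻¹.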

  3. Validation of Superelement Modelling of Complex Offshore Support Structures

    DEFF Research Database (Denmark)

    Wang, Shaofeng; Larsen, Torben J.; Hansen, Anders Melchior

    2016-01-01

    calculations consisting of up to thousands of design load cases need to be evaluated. However, even the simplest aero-elastic model of such structures has many more DOFs than a monopile, resulting in an excessive computation burden. In order to deal with this problem, the superelement method has been introduced...... for modelling such structures. One superelement method proved very promising in the previous Wave Loads project [1], and a fundamental question in such DOF-reduction methods is which modes are essential and which modes can be neglected. For the jacket structure, the introduction of a gravity
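The DOF-reduction idea behind such superelements can be shown in its simplest, static form: slave DOFs are condensed out of the stiffness matrix through the coupling terms (Guyan condensation), which Craig-Bampton-style superelements extend with a set of retained interior modes, the very modes whose selection the abstract identifies as the open question. A minimal sketch on a two-spring chain (the numbers are illustrative):

```python
import numpy as np

# Stiffness of a 2-spring chain: ground -- k1 -- (slave DOF 0) -- k2 -- (master DOF 1)
k1, k2 = 1000.0, 4000.0
K = np.array([[k1 + k2, -k2],
              [-k2,      k2]])

# Partition: DOF 0 is condensed out (slave), DOF 1 is retained (master)
K_ss = K[:1, :1]
K_sm = K[:1, 1:]
K_ms = K[1:, :1]
K_mm = K[1:, 1:]

# Guyan condensation: K_red = K_mm - K_ms @ K_ss^-1 @ K_sm
K_red = K_mm - K_ms @ np.linalg.solve(K_ss, K_sm)
# For springs in series the reduced stiffness should equal k1*k2/(k1+k2)
```

The reduced matrix reproduces the static stiffness exactly; it is the dynamic content that depends on which interior modes are kept, hence the mode-selection question for jackets.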

  4. Regionalization of climate model results for the North Sea

    Energy Technology Data Exchange (ETDEWEB)

    Kauker, F.

    1999-07-01

    A dynamical downscaling is presented that allows an estimation of potential effects of climate change on the North Sea. Therefore, the ocean general circulation model OPYC is adapted for application on a shelf by adding a lateral boundary formulation and a tide model. In this set-up the model is forced, first, with data from the ECMWF reanalysis for model validation and the study of the natural variability, and, second, with data from climate change experiments to estimate the effects of climate change on the North Sea. (orig.)

  5. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  6. Prediction Models for Prolonged Intensive Care Unit Stay after Cardiac Surgery: Systematic Review and Validation Study

    NARCIS (Netherlands)

    Linda Peelen; Karel Moons; Cor Kalkman; Prof. Dr. Marieke J. Schuurmans; Roelof G.A. Ettema; Arno Nierich

    2010-01-01

    Several models have been developed to predict prolonged stay in the intensive care unit (ICU) after cardiac surgery. However, no extensive quantitative validation of these models has yet been conducted. This study sought to identify and validate existing prediction models for prolonged ICU length of

  7. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    Science.gov (United States)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, in order to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those which could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as a part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, by use of macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of the country-specific vulnerability modifiers, by use of past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by use of "as-if" historical scenario simulations of three past earthquake events in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for client market portfolio align with the

  8. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the Core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D Core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the Customer and the Core Data suppliers. (author)

  9. Validation of the United States Marine Corps Qualified Candidate Population Model

    National Research Council Canada - National Science Library

    Hallahan, William

    2003-01-01

    .... This research conducted a survey of colleges and officer recruiters. It determined that the model cannot be validated, for the post secondary education system cannot provide precise measurements...

  10. Model of Market Orientation of High-Tech Firms in Germany: Validation Study

    Directory of Open Access Journals (Sweden)

    Patrik Jangl

    2016-09-01

    Full Text Available The aim of this validation study was to verify a four-factor model of market orientation on a sample of German high-tech companies in the manufacturing industry. A modified version of the measuring scale (MMOS), composed of twelve items, was used for measurement. The target group consisted of business and marketing managers (N = 374) who recorded their answers on a seven-point Likert scale. The main methods used to achieve the objectives were exploratory and confirmatory factor analyses. Having verified the psychometric properties of the instrument, I proceeded to confirm the hypothesis of a four-factor solution model. The reliability and validity of the measuring instrument were also examined, which allowed a confirmation of the second hypothesis. The research may serve as evidence of the instrument's suitability for measuring market orientation in European cultural conditions. To obtain results from other sectors, the author recommends further research replication using the MMOS instrument.

  11. Assessing the validity of two indirect questioning techniques: A Stochastic Lie Detector versus the Crosswise Model.

    Science.gov (United States)

    Hoffmann, Adrian; Musch, Jochen

    2016-09-01

    Estimates of the prevalence of sensitive attributes obtained through direct questions are prone to being distorted by untruthful responding. Indirect questioning procedures such as the Randomized Response Technique (RRT) aim to control for the influence of social desirability bias. However, even on RRT surveys, some participants may disobey the instructions in an attempt to conceal their true status. In the present study, we experimentally compared the validity of two competing indirect questioning techniques that presumably offer a solution to the problem of nonadherent respondents: the Stochastic Lie Detector and the Crosswise Model. For two sensitive attributes, both techniques met the "more is better" criterion. Their
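For context, the Crosswise Model estimator referred to above is a one-liner: each respondent reports only whether their answers to the sensitive item and to a nonsensitive item of known prevalence p are the same, so no self-protecting "no" answer exists, and the sensitive prevalence follows from the observed proportion of "same" answers as pi_hat = (lambda_hat + p - 1) / (2p - 1). A minimal sketch with illustrative survey numbers:

```python
def crosswise_estimate(same_count, n, p):
    """Prevalence estimate from the Crosswise Model.

    same_count: respondents answering "both yes" or "both no"
    n: sample size
    p: known prevalence of the nonsensitive item (must differ from 0.5)
    """
    lam = same_count / n                  # observed proportion of "same" answers
    return (lam + p - 1.0) / (2.0 * p - 1.0)

# Illustrative survey: 325 of 500 respondents answered "same",
# with a birthday-based nonsensitive item of known prevalence p = 0.25
pi_hat = crosswise_estimate(325, 500, 0.25)
```

With these numbers the estimator returns a sensitive-attribute prevalence of 0.20; the "more is better" criterion then asks whether such indirect estimates exceed the (presumably understated) direct-question estimates.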