WorldWideScience

Sample records for model validation results

  1. Challenges in validating model results for first year ice

    Science.gov (United States)

    Melsom, Arne; Eastwood, Steinar; Xie, Jiping; Aaboe, Signe; Bertino, Laurent

    2017-04-01

    In order to assess the quality of model results for the distribution of first-year ice, a comparison with a product based on observations from satellite-borne instruments has been performed. Such a comparison is not straightforward due to the contrasting algorithms that are used in the model product and the remote sensing product. The implementation of the validation is discussed in light of the differences between this set of products, and validation results are presented. The model product is the daily updated 10-day forecast from the Arctic Monitoring and Forecasting Centre in CMEMS. The forecasts are produced with the assimilative ocean prediction system TOPAZ. Presently, observations of sea ice concentration and sea ice drift are introduced in the assimilation step, but data for sea ice thickness and ice age (or roughness) are not included. The model computes the age of the ice by recording and updating the time passed since ice formation as sea ice grows and deteriorates while it is advected inside the model domain. Ice that is younger than 365 days is classified as first-year ice, and the fraction of first-year ice is recorded as a tracer in each grid cell. The Ocean and Sea Ice Thematic Assembly Centre in CMEMS redistributes a daily product from the EUMETSAT OSI SAF of gridded sea ice conditions which includes "ice type", a representation of the separation between regions covered by first-year ice and those covered by multi-year ice. The ice type is parameterized based on data for the gradient ratio GR(19,37) from SSMIS observations, and on the ASCAT backscatter parameter. This product also includes information on ambiguity in the processing of the remote sensing data, and on the product's confidence level, both of which have a strong seasonal dependency.
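
    The two classification rules contrasted above lend themselves to a compact illustration. The sketch below (not from the paper) implements both separations in Python: the model-side age-tracer rule and a brightness-temperature gradient-ratio rule. The GR threshold of -0.02 is an illustrative placeholder, not the operational OSI SAF value.

      import numpy as np

      def gradient_ratio(tb19v, tb37v):
          """Spectral gradient ratio GR(19,37) from SSMIS brightness
          temperatures at 19 and 37 GHz (vertical polarization)."""
          return (tb37v - tb19v) / (tb37v + tb19v)

      def ice_type_from_gr(tb19v, tb37v, threshold=-0.02):
          """Remote-sensing rule: multi-year ice has a more negative GR
          because its emissivity drops with frequency. Threshold is
          illustrative only."""
          gr = gradient_ratio(np.asarray(tb19v, float), np.asarray(tb37v, float))
          return np.where(gr < threshold, "multi-year", "first-year")

      def ice_type_from_age(age_days):
          """Model-side rule quoted above: ice younger than 365 days is
          first-year ice."""
          return np.where(np.asarray(age_days) < 365, "first-year", "multi-year")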

  2. Satellite data for systematic validation of wave model results in the Black Sea

    Science.gov (United States)

    Behrens, Arno; Staneva, Joanna

    2017-04-01

    With regard to the availability of traditional in situ wave measurements from waverider buoys, the Black Sea is a data-sparse, semi-enclosed sea. The only possibility for systematic validation of wave model results in such a regional area is the use of satellite data. In the frame of the Copernicus Marine Environment Monitoring Service for the Black Sea, which requires wave predictions, the third-generation spectral wave model WAM is used. The operational system is demonstrated based on four years of systematic comparisons with satellite data. The aim of this investigation was to answer two questions: is the wave model able to provide a reliable description of the wave conditions in the Black Sea, and are the satellite measurements suitable for validation purposes on such a regional scale? Detailed comparisons between measured data and computed model results for the Black Sea, including yearly statistics, have been made for about 300 satellite overflights per year. The results are discussed for the different verification schemes needed to assess the forecasting skill of the operational system. The good agreement between measured and modeled data supports the expectation that the wave model provides reasonable results and that the satellite data are of good quality and offer an appropriate validation alternative to buoy measurements. This is the required step towards further use of those satellite data for assimilation into the wave fields to improve the wave predictions. Additional support for the good quality of the wave predictions is provided by comparisons between ADCP measurements that are available for a short time period in February 2012 and the corresponding model results at a location near the Bulgarian coast in the western Black Sea. Sensitivity tests with different wave model options and different driving wind fields have been done, identifying the model configuration that provides the best wave predictions. In addition to the comparisons between measured
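
    The yearly statistics mentioned above usually reduce to a handful of collocation metrics. Below is a minimal sketch of the conventional definitions (bias, RMSE, scatter index, correlation), assuming collocated arrays of observed and modeled significant wave height; none of the numbers come from the paper.

      import numpy as np

      def collocation_stats(obs, mod):
          """Standard validation statistics for collocated significant
          wave heights (satellite vs. model)."""
          obs, mod = np.asarray(obs, float), np.asarray(mod, float)
          diff = mod - obs
          bias = diff.mean()
          rmse = np.sqrt((diff ** 2).mean())
          si = np.sqrt(((diff - bias) ** 2).mean()) / obs.mean()  # scatter index
          r = np.corrcoef(obs, mod)[0, 1]
          return {"n": obs.size, "bias": bias, "rmse": rmse, "si": si, "r": r}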

  3. V-SUIT Model Validation Using PLSS 1.0 Test Results

    Science.gov (United States)

    Olthoff, Claas

    2015-01-01

    The dynamic portable life support system (PLSS) simulation software Virtual Space Suit (V-SUIT) has been under development at the Technische Universität München since 2011 as a spin-off from the Virtual Habitat (V-HAB) project. The MATLAB®-based V-SUIT simulates space suit portable life support systems and their interaction with a detailed and also dynamic human model, as well as the dynamic external environment of a space suit moving on a planetary surface. To demonstrate the feasibility of a large, system-level simulation like V-SUIT, a model of NASA's PLSS 1.0 prototype was created. This prototype was run through an extensive series of tests in 2011. Since the test setup was heavily instrumented, it produced a wealth of data, making it ideal for model validation. The implemented model includes all components of the PLSS in both the ventilation and thermal loops. The major components are modeled in greater detail, while smaller and ancillary components are low-fidelity black box models. The major components include the Rapid Cycle Amine (RCA) CO2 removal system, the Primary and Secondary Oxygen Assembly (POS/SOA), the Pressure Garment System Volume Simulator (PGSVS), the Human Metabolic Simulator (HMS), the heat exchanger between the ventilation and thermal loops, the Space Suit Water Membrane Evaporator (SWME) and finally the Liquid Cooling Garment Simulator (LCGS). Using the created model, dynamic simulations were performed using the same test points used during PLSS 1.0 testing. The results of the simulation were then compared to the test data, with special focus on absolute values during the steady state phases and dynamic behavior during the transition between test points. Quantified simulation results are presented that demonstrate which areas of the V-SUIT model are in need of further refinement and which are sufficiently close to the test results. Finally, lessons learned from the modelling and validation process are given in combination

  4. Results comparison and model validation for flood loss functions in Australian geographical conditions

    Science.gov (United States)

    Hasanzadeh Nafari, R.; Ngo, T.; Lehman, W.

    2015-06-01

    Rapid urbanisation, climate change and unsustainable development are increasing flood risk, in terms of both flood frequency and intensity. Flooding is a frequent natural hazard that has significant financial consequences for Australia. The emergency response system in Australia is very successful and has saved many lives over the years. However, preparedness for natural disaster impacts in terms of loss reduction and damage mitigation has been less successful. This study aims to quantify the direct physical damage to residential structures that are prone to flooding in Australia. In this paper, the physical consequences of two floods in Queensland have been simulated, and the results have been compared with the performance of two selected methodologies and one newly derived model. Based on this analysis, the adaptability and applicability of the selected methodologies are assessed for Australian geographical conditions. Results obtained from the new empirically based function and the non-adapted methodologies indicate that the precision of flood damage models is strongly dependent on the selected stage-damage curves, and that flood damage estimation without model validation results in inaccurate prediction of losses. It is therefore very important to be aware of the associated uncertainties in flood risk assessment, especially if models have not been adapted with real damage data.
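
    A stage-damage curve, on which the abstract says model precision strongly depends, maps inundation depth to a damage fraction of the building value. Below is a minimal sketch; the curve knots and building value are hypothetical, not the Queensland calibration.

      import numpy as np

      # illustrative stage-damage curve: depth above floor (m) vs damage ratio
      depth_m = np.array([0.0, 0.3, 0.6, 1.0, 2.0, 3.0])
      damage_ratio = np.array([0.0, 0.12, 0.22, 0.35, 0.55, 0.65])

      def direct_loss(depths, building_value):
          """Direct structural loss via linear interpolation on the curve."""
          return np.interp(depths, depth_m, damage_ratio) * building_value

      # losses for three dwellings flooded to 0.4, 1.2 and 2.5 m:
      print(direct_loss([0.4, 1.2, 2.5], building_value=250_000.0))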

  5. Model validation: Issues regarding comparisons of point measurements and high-resolution modeling results

    Science.gov (United States)

    Sandvik, Anne D.; Skagseth, Øystein; Skogen, Morten D.

    2016-10-01

    In this study we compare a high-resolution model of waters on the Norwegian Shelf with hydrographic observations obtained during 2009 at Ingøy, a fixed coastal station off northwestern Norway operated by the Institute of Marine Research. The observations comprise snapshots from Ingøy every two weeks, whereas the model represents an average over a certain volume and is continuous in time. We suggest that bias is the best way to compare the modeled and observed time series; to acknowledge the short-term (within-day) variability, we recommend using the modeled range to estimate an acceptable deviation between single points in the series. Using the suggested method we conclude that an acceptable deviation between the modeled and observed surface temperatures at Ingøy is 0.6 °C. With such an acceptance level the model is correct in 27 out of 33 points for the time series considered.
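
    The proposed comparison reduces to two numbers per series: the overall bias, and the count of points whose deviation stays within the acceptance level. A minimal sketch of that check, with the paper's 0.6 °C level as the default:

      import numpy as np

      def bias_and_acceptance(model, obs, acceptable_dev=0.6):
          """Bias of model vs. observed snapshots, plus how many points
          fall within the acceptance level (0.6 degC for surface
          temperature at Ingøy in the study above)."""
          model, obs = np.asarray(model, float), np.asarray(obs, float)
          dev = model - obs
          within = np.abs(dev) <= acceptable_dev
          return dev.mean(), int(within.sum()), within.size

      # with the paper's numbers, the count would come out as 27 of 33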

  6. Hadronic Shower Models in GEANT4: Validation Strategy and Results.

    Institute of Scientific and Technical Information of China (English)

    Wellisch, Johannes Peter

    2001-01-01

    Optimal exploitation of hadronic final states played a key role in the successes of all recent hadron collider experiments in HEP, and the ability to use hadronic final states will continue to be one of the decisive issues during the analysis phase of the LHC experiments. Monte Carlo implementations of hadronic shower models provided with GEANT4 facilitate the use of hadronic final states, and have been developed for many years. We will give an overview of the physics underlying hadronic shower simulation, discussing the three basic types of modelling (data driven, parametrisation driven, and theory driven) and their respective implementation status in GEANT4. We will confront the different types of modelling with a validation suite for hadronic generators based on cross-section measurements from thin-target experiments, and expose the strengths and weaknesses of the individual approaches.

  7. Initial Content Validation Results of a New Simulation Model for Flexible Ureteroscopy: The Key-Box.

    Science.gov (United States)

    Villa, Luca; Şener, Tarik Emre; Somani, Bhaskar K; Cloutier, Jonathan; Butticè, Salvatore; Marson, Francesco; Doizi, Steeve; Proietti, Silvia; Traxer, Olivier

    2017-01-01

    We sought to test the content validity of a new training model for flexible ureteroscopy: the Key-Box. Sixteen medical students were randomized to undergo a 10-day training consisting of performing 10 different exercises aimed at learning specific movements with the flexible ureteroscope, and how to catch and release stones with a nitinol basket using the Key-Box (n = 8 students in the training group, n = 8 students in the nontraining control group). Subsequently, an expert endourologist (O.T.) blindly assessed the skills acquired by the whole cohort of students through two exercises on ureteroscope manipulation and one exercise on stone capture, selected from those used for the training. A performance scale (1-5) assessing different steps of the procedure was used to evaluate each student. Time to complete the exercises was measured. The Mann-Whitney rank-sum test was used for comparisons between the two groups. Mean scores obtained by trained students were significantly higher than those obtained by nontrained students (all p < 0.05). Six (75%) nontrained students were not able to finish one of the two exercises on ureteroscope manipulation or the exercise on stone capture. The mean times to complete the three exercises were 76.3, 69.9, and 107 seconds in the trained group and 172.5, 137.9, and 168 seconds in the nontrained group (all p < 0.05). The Key-Box® seems to be a valid, easy-to-use training model for introducing novice endoscopists to flexible ureteroscopy.
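
    The group comparison described above is a standard two-sample rank test. A minimal SciPy sketch with hypothetical scores (the per-student data are not given in the abstract):

      from scipy.stats import mannwhitneyu

      # hypothetical performance scores (1-5 scale) for one exercise
      trained    = [4.5, 4.0, 4.8, 4.2, 4.6, 4.1, 4.7, 4.3]
      nontrained = [2.1, 2.8, 3.0, 2.5, 1.9, 2.7, 2.2, 3.1]

      stat, p = mannwhitneyu(trained, nontrained, alternative="two-sided")
      print(f"U = {stat}, p = {p:.4f}")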

  8. ASTER Global Digital Elevation Model Version 2 - summary of validation results

    Science.gov (United States)

    Tachikawa, Tetushi; Kaku, Manabu; Iwasaki, Akira; Gesch, Dean B.; Oimoen, Michael J.; Zhang, Z.; Danielson, Jeffrey J.; Krieger, Tabatha; Curtis, Bill; Haase, Jeff; Abrams, Michael; Carabajal, C.; Meyer, Dave

    2011-01-01

    On June 29, 2009, NASA and the Ministry of Economy, Trade and Industry (METI) of Japan released a Global Digital Elevation Model (GDEM) to users worldwide at no charge as a contribution to the Global Earth Observing System of Systems (GEOSS). This “version 1” ASTER GDEM (GDEM1) was compiled from over 1.2 million scene-based DEMs covering land surfaces between 83°N and 83°S latitudes. A joint U.S.-Japan validation team assessed the accuracy of the GDEM1, augmented by a team of 20 cooperators. The GDEM1 was found to have an overall accuracy of around 20 meters at the 95% confidence level. The team also noted several artifacts associated with poor stereo coverage at high latitudes, cloud contamination, water masking issues and the stacking process used to produce the GDEM1 from individual scene-based DEMs (ASTER GDEM Validation Team, 2009). Two independent horizontal resolution studies estimated the effective spatial resolution of the GDEM1 to be on the order of 120 meters.
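
    Vertical accuracy "at the 95% confidence level" is commonly reported either as the empirical 95th percentile of absolute elevation errors or, assuming zero-mean normal errors, as 1.96 times the RMSE. A minimal sketch of both conventions (the GDEM1 error samples themselves are not reproduced here):

      import numpy as np

      def vertical_accuracy_95(dem_heights, reference_heights):
          """Linear error at 95% confidence (LE95) for a DEM."""
          err = np.asarray(dem_heights, float) - np.asarray(reference_heights, float)
          le95_empirical = np.percentile(np.abs(err), 95.0)
          le95_normal = 1.96 * np.sqrt((err ** 2).mean())  # zero-mean normal approx.
          return le95_empirical, le95_normal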

  9. The Dutch Rhine-Meuse delta in 3D: A validation of model results

    NARCIS (Netherlands)

    Maljers, D.; Stafleu, J.; Busschers, F.; Gunnink, J.L.

    2010-01-01

    The Geological Survey of the Netherlands aims at building a 3D geological property model of the upper 30 meters of the Dutch subsurface. This model, called GeoTOP, provides a basis for answering subsurface related questions on, amongst others, sand and gravel resources. Modelling is carried out per

  10. Validation Techniques of network harmonic models based on switching of a series linear component and measuring resultant harmonic increments

    DEFF Research Database (Denmark)

    Wiechowski, Wojciech Tomasz; Lykkegaard, Jan; Bak, Claus Leth

    2007-01-01

    In this paper two methods of validation of transmission network harmonic models are introduced. The methods were developed as a result of the work presented in [1]. The first method allows calculating the transfer harmonic impedance between two nodes of a network. Switching a linear, series network...... are used for calculation of the transfer harmonic impedance between the nodes. The determined transfer harmonic impedance can be used to validate a computer model of the network. The second method is an extension of the first one. It allows switching a series element that contains a shunt branch......, as for example a transmission line. Both methods require that harmonic measurements performed at two ends of the disconnected element are precisely synchronized....

  11. Validating management simulation models and implications for communicating results to stakeholders

    NARCIS (Netherlands)

    Pastoors, M.A.; Poos, J.J.; Kraak, S.B.M.; Machiels, M.A.M.

    2007-01-01

    Simulations of management plans generally aim to demonstrate the robustness of the plans to assumptions about population dynamics and fleet dynamics. Such modelling is characterized by specification of an operating model (OM) representing the underlying truth and a management procedure that mimics t

  12. The composite reinforcement production in digital manufacturing: experimental validation of the heat transfer and cure modeling results

    Science.gov (United States)

    Kazakov, I.; Krasnovskii, A.; Kutin, A.

    2017-02-01

    The experimental validation of the heat transfer and cure modeling results for an 8-mm fiber-reinforced thermosetting composite reinforcement is reported in this article. The temperature and degree of cure of the composite reinforcement are predicted using a two-dimensional heat transfer and curing model. The model uses infrared radiant heating theory and takes into account the heat transfer between the composite rod and the surrounding air. The implicit finite difference method was used to solve the system of governing equations. The results obtained using the mathematical model were compared to experimental data: the temperature field inside the composite reinforcement was measured by means of a naked thermocouple, and Differential Scanning Calorimetry (DSC) was used to measure the degree of cure of the final product. Calculated and measured temperature and degree-of-cure fields were in good agreement.
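
    A minimal 1-D sketch of the kind of scheme described: backward-Euler (implicit) finite differences for conduction, coupled to an n-th order Arrhenius cure rate. All material constants are placeholders, and the paper's model is two-dimensional with radiant-heating boundary conditions, so this is only the numerical skeleton.

      import numpy as np

      R_GAS = 8.314  # J/(mol K)

      def step(T, a, dt, dx, alpha=1e-7, A=1e5, Ea=60e3, n=1.5, dT_ad=150.0):
          """Advance temperature T [K] and degree of cure a by one time step
          on a uniform grid with fixed-temperature (Dirichlet) boundaries."""
          N = T.size
          r = alpha * dt / dx**2
          # cure kinetics, explicit: da/dt = A exp(-Ea/RT) (1 - a)^n
          a_new = np.clip(a + dt * A * np.exp(-Ea / (R_GAS * T)) * (1 - a)**n, 0, 1)
          # implicit conduction with exothermic source dT_ad * da
          M = np.eye(N) * (1 + 2*r) - np.eye(N, k=1) * r - np.eye(N, k=-1) * r
          M[0, :], M[-1, :] = 0.0, 0.0
          M[0, 0] = M[-1, -1] = 1.0
          rhs = T + dT_ad * (a_new - a)
          rhs[0], rhs[-1] = T[0], T[-1]  # hold wall temperatures
          return np.linalg.solve(M, rhs), a_new

      # e.g. an 8-mm rod on 51 nodes, walls held at 450 K:
      T = np.full(51, 300.0); T[0] = T[-1] = 450.0; a = np.zeros(51)
      for _ in range(100):
          T, a = step(T, a, dt=0.1, dx=8e-3 / 50)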

  13. Model validation, science and application

    NARCIS (Netherlands)

    Builtjes, P.J.H.; Flossmann, A.

    1998-01-01

    Over the last years there has been growing interest in trying to establish a proper validation of atmospheric chemistry-transport (ATC) models. Model validation deals with the comparison of model results with experimental data, and in this way addresses both model uncertainty and uncertainty in, and adequac

  14. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...... of models has been somewhat narrow-minded reducing the notion of validation to establishment of truth. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation....... of models with regards to their purpose, character, field of application and time dimension inherently calls for a similar diversity in validation approaches. A classification of models in terms of the mentioned elements is presented and used to shed light on possible types of validation leading...

  15. Validation of model-based brain shift correction in neurosurgery via intraoperative magnetic resonance imaging: preliminary results

    Science.gov (United States)

    Luo, Ma; Frisken, Sarah F.; Weis, Jared A.; Clements, Logan W.; Unadkat, Prashin; Thompson, Reid C.; Golby, Alexandra J.; Miga, Michael I.

    2017-03-01

    The quality of brain tumor resection surgery is dependent on the spatial agreement between the preoperative image and the intraoperative anatomy. However, brain shift compromises the aforementioned alignment. Currently, the clinical standard to monitor brain shift is intraoperative magnetic resonance (iMR). While iMR provides a better understanding of brain shift, its cost and encumbrance are a consideration for medical centers. Hence, we are developing a model-based method that can be a complementary technology to address brain shift in standard resections, with resource-intensive cases as referrals for iMR facilities. Our strategy constructs a deformation `atlas' containing potential deformation solutions derived from a biomechanical model that account for variables such as cerebrospinal fluid drainage and mannitol effects. Volumetric deformation is estimated with an inverse approach that determines the optimal combinatory `atlas' solution fit to best match measured surface deformation. Accordingly, the preoperative image is updated based on the computed deformation field. This study is the latest development to validate our methodology with iMR. Briefly, preoperative and intraoperative MR images of 2 patients were acquired. Homologous surface points were selected on preoperative and intraoperative scans as a measurement of surface deformation and used to drive the inverse problem. To assess the model accuracy, subsurface shift of targets between preoperative and intraoperative states was measured and compared to model prediction. Considering subsurface shift above 3 mm, the proposed strategy provides an average shift correction of 59% across the 2 cases. While further improvements in both the model and the ability to validate with iMR are desired, the results reported are encouraging.
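
    The inverse "atlas" step amounts to fitting a combination of precomputed deformation solutions to the measured surface shift. A minimal sketch using nonnegative least squares; the actual method's regularization and convexity constraints on the weights are omitted.

      import numpy as np
      from scipy.optimize import nnls

      def fit_atlas(A, d):
          """A: (n_surface_dofs x n_atlas_solutions) matrix whose columns
          are the surface displacements of each precomputed atlas solution;
          d: measured surface shift at homologous points. Returns
          nonnegative weights w minimizing ||A w - d||."""
          w, residual = nnls(A, d)
          return w, residual

      # the volumetric update is then the same combination of the atlas
      # volume fields: u_volume = sum_k w[k] * atlas_volume_fields[k]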

  16. Presal36: a high resolution ocean current model for Brazilian pre-salt area: implementation and validation results

    Energy Technology Data Exchange (ETDEWEB)

    Schoellkopf, Jacques P. [Advanced Subsea do Brasil Ltda., Rio de Janeiro, RJ (Brazil)

    2012-07-01

    The PRESAL 36 JIP is a project for the development of a powerful ocean current model of 1/36-of-a-degree resolution, nested in an existing global ocean model, Mercator PSY4 (1/12-of-a-degree resolution), with tide corrections, improved bathymetry accuracy and high-frequency atmospheric forcing (every 3 hours). The simulation outputs are the three-dimensional structure of the velocity fields (u, v, w) at 50 vertical levels over the water column, including geostrophic, Ekman and tidal currents, together with temperature, salinity and sea surface height at a sub-mesoscale spatial resolution. Simulations run in hindcast, nowcast and forecast modes, with a temporal resolution of 3 hours. This ocean current model allows detailed statistical studies of various areas using conditions analysed in hindcast mode, and short-term operational condition prediction for various surface and subsea operations using real-time and forecast modes. The paper presents significant results of the project, in terms of the pre-salt zoomed model implementation and the high-resolution model validation. It demonstrates the capability to properly describe ocean current phenomena beyond the mesoscale frontier. The project demonstrates the feasibility of obtaining accurate information for engineering studies and operational conditions, based on a 'zoom technique' starting from global ocean models. (author)

  17. Validation of satellite SAR offshore wind speed maps to in-situ data, microscala and mesoscale model results

    Energy Technology Data Exchange (ETDEWEB)

    Hasager, C.B.; Astrup, P.; Barthelmie, R.; Dellwik, E.; Hoffmann Joergensen, B.; Gylling Mortensen, N.; Nielsen, M.; Pryor, S.; Rathmann, O.

    2002-05-01

    A validation study has been performed in order to investigate the precision and accuracy of the satellite-derived ERS-2 SAR wind products in offshore regions. The overall project goal is to develop a method for utilizing the satellite wind speed maps for offshore wind resources, e.g. in future planning of offshore wind farms. The report describes the validation analysis in detail for three sites in Denmark, Italy and Egypt. The site in Norway is analyzed by the Nansen Environmental and Remote Sensing Centre (NERSC). Wind speed maps and wind direction maps from Earth Observation data recorded by the ERS-2 SAR satellite have been obtained from the NERSC. For the Danish site the wind speed and wind direction maps have been compared to in-situ observations from a met mast at Horns Rev in the North Sea located 14 km offshore. The SAR wind speeds have been area-averaged by simple and advanced footprint modelling, i.e. the upwind conditions relative to the meteorological mast are explicitly averaged in the SAR wind speed maps before comparison. The comparison results are very promising, with a standard error of ±0.61 m s⁻¹, a bias of ≈2 m s⁻¹ and R² ≈ 0.88 between in-situ wind speed observations and SAR footprint-averaged values at the 10 m level. Wind speeds predicted by the local-scale model LINCOM and the mesoscale model KAMM2 have been compared to the spatial variations in the SAR wind speed maps. The finding is a good correspondence between SAR observations and model results. Near the coast is an 800 m wide band in which the SAR wind speed observations have a strong negative bias. The bathymetry of Horns Rev combined with tidal currents gives rise to bias in the SAR wind speed maps near areas of shallow, complex bottom topography in some cases. A total of 16 cases were analyzed for Horns Rev. For Maddalena in Italy five cases were analyzed. At the Italian site the SAR wind speed maps were compared to WAsP and KAMM2 model results. The WAsP model
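
    Footprint averaging means averaging the SAR pixels upwind of the mast rather than a fixed box around it. A simple sector-based sketch follows; the 5 km fetch and the 15 degree half-width are illustrative choices, not the report's footprint model.

      import numpy as np

      def footprint_average(wind_map, x, y, mast_xy, wind_dir_deg,
                            fetch=5000.0, half_width_deg=15.0):
          """Mean SAR wind speed over pixels in an upwind sector of the mast.
          x, y: pixel easting/northing grids; wind_dir_deg: direction the
          wind comes from, degrees clockwise from north."""
          dx, dy = x - mast_xy[0], y - mast_xy[1]
          dist = np.hypot(dx, dy)
          bearing = np.degrees(np.arctan2(dx, dy)) % 360.0  # mast -> pixel
          ang_off = np.abs((bearing - wind_dir_deg + 180.0) % 360.0 - 180.0)
          sel = (ang_off <= half_width_deg) & (dist <= fetch) & np.isfinite(wind_map)
          return wind_map[sel].mean()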

  18. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  19. Application of regional physically-based landslide early warning model: tuning of the input parameters and validation of the results

    Science.gov (United States)

    D'Ambrosio, Michele; Tofani, Veronica; Rossi, Guglielmo; Salvatici, Teresa; Tacconi Stefanelli, Carlo; Rosi, Ascanio; Benedetta Masi, Elena; Pazzi, Veronica; Vannocci, Pietro; Catani, Filippo; Casagli, Nicola

    2017-04-01

    runs in real-time by assimilating weather data and uses Monte Carlo simulation techniques to manage the geotechnical and hydrological input parameters. In this context, an assessment of the factors controlling the geotechnical and hydrological features is crucial in order to understand the occurrence of slope instability mechanisms and to provide reliable forecasting of hydrogeological hazard occurrence, especially in relation to weather events. In particular, the model and the soil characterization were applied in back analysis, in order to assess the reliability of the model through validation of the results against landslide events that occurred during the period. The validation was performed on four past events of intense rainfall that affected the Valle d'Aosta region between 2008 and 2010, triggering fast shallow landslides. The simulations show a substantial improvement in the reliability of the results compared to the use of literature parameters. A statistical analysis of the HIRESSS outputs in terms of failure probability has been carried out in order to define reliable alert levels for regional landslide early warning systems.
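
    The Monte Carlo treatment of geotechnical uncertainty can be illustrated with the classic infinite-slope stability model: sample cohesion and friction angle, compute the factor of safety, and report the fraction of realizations that fail. The parameter ranges below are illustrative, not the Valle d'Aosta calibration, and HIRESSS itself couples this with a hydrological model for the pore-pressure term.

      import numpy as np

      rng = np.random.default_rng(0)

      def failure_probability(n=10_000, slope_deg=35.0, z=1.5, m=0.8,
                              gamma=19e3, gamma_w=9.81e3):
          """P(FS < 1) for one cell: z soil depth [m], m water table ratio,
          gamma/gamma_w unit weights [N/m3]; c and phi are sampled."""
          c = rng.uniform(2e3, 8e3, n)                  # cohesion [Pa]
          phi = np.radians(rng.uniform(28.0, 38.0, n))  # friction angle
          beta = np.radians(slope_deg)
          fs = ((c + (gamma - gamma_w * m) * z * np.cos(beta)**2 * np.tan(phi))
                / (gamma * z * np.sin(beta) * np.cos(beta)))
          return (fs < 1.0).mean()

      print(failure_probability())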

  20. Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of this deliverable 2.5 is to use fresh experimental data for validation and selection of a flow model to be used for control design in WP3-4. Initially the idea was to investigate the models developed in WP2. However, in the project it was agreed to include and focus on an additive...... model turns out not to be useful for prediction of the flow. Moreover, standard Box-Jenkins model structures and multiple-output autoregressive models prove to be superior, as they can give useful predictions of the flow....

  1. Validation of remote sensed precipitation with the use of hydrological models - methodology and first results achieved in the frame of EUMETSAT H-SAF

    Science.gov (United States)

    Lapeta, B.; Niedbala, J. M.; Niedbala, J. S.; Struzik, P.

    2009-04-01

    High variability of precipitation in space and time causes difficulties in the proper validation of remotely sensed rain rates against conventional ground measurements and observations. The insufficient number and spatial resolution of ground data and their questionable quality make this task even more difficult. Therefore, the idea of independently assessing the quality of satellite-derived data with the use of operational hydrological models has been implemented in the frame of EUMETSAT H-SAF. In the paper, the assumptions and methodology of H-SAF hydrological validation will be described. Additionally, the preliminary hydrological validation results obtained for a six-month time series of H-SAF precipitation rain rates will be presented. The quality of the rain rates was analyzed using two hydrological models, MIKE 11 and a modelling platform run in the Hydrological Forecasting Office in Krakow, Poland. The differences between the outcomes from these models will be discussed as well.

  2. Physical control oriented model of large scale refrigerators to synthesize advanced control schemes. Design, validation, and first control results

    Science.gov (United States)

    Bonne, François; Alamir, Mazen; Bonnay, Patrick

    2014-01-01

    In this paper, a physical method to obtain control-oriented dynamical models of large-scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace the classical approaches designed from user experience, usually based on many independent PI controllers. This is particularly useful in the case where cryoplants are subjected to large pulsed thermal loads, expected to take place in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes lead to better perturbation immunity and rejection, offering a safer utilization of cryoplants. The paper gives details on how basic components used in the field of large-scale helium refrigeration (especially those present on the 400 W @ 1.8 K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of controllable subsystems of the refrigerator (the controllable subsystems are namely the Joule-Thomson Cycle, the Brayton Cycle, the Liquid Nitrogen Precooling Unit and the Warm Compression Station). The complete 400 W @ 1.8 K (in the 400 W @ 4.4 K configuration) helium test facility model is then validated against experimental data, and the optimal control of both the Joule-Thomson valve and the turbine valve is proposed, to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.

  3. Assimilation of Geosat Altimetric Data in a Nonlinear Shallow-Water Model of the Indian Ocean by Adjoint Approach. Part II: Some Validation and Interpretation of the Assimilated Results

    Science.gov (United States)

    Greiner, Eric; Perigaud, Claire

    1996-01-01

    This paper examines the results of assimilating Geosat sea level variations, relative to the November 1986-November 1988 mean reference, in a nonlinear reduced-gravity model of the Indian Ocean. Data have been assimilated during one year starting in November 1986 with the objective of optimizing the initial conditions and the yearly averaged reference surface. The thermocline slope simulated by the model with or without assimilation is validated by comparison with the signal that can be derived from expendable bathythermograph measurements performed in the Indian Ocean at that time. The topography simulated with assimilation on November 1986 is in very good agreement with the hydrographic data. The slopes corresponding to the South Equatorial Current and to the South Equatorial Countercurrent are better reproduced with assimilation than without during the first nine months. The whole circulation of the cyclonic gyre south of the equator is then strongly intensified by assimilation. Another assimilation experiment is run over the following year starting in November 1987. The difference between the two yearly mean surfaces simulated with assimilation is in excellent agreement with Geosat. In the southeastern Indian Ocean, the correction to the yearly mean dynamic topography due to assimilation over the second year is negatively correlated to that of the year before. This correction is also in agreement with hydrographic data. It is likely that the signal corrected by assimilation is not only due to wind error, because simulations driven by various wind forcings present the same features over the two years. Model simulations run with a prescribed throughflow transport anomaly indicate that assimilation is instead correcting, in the interior of the model domain, for inadequate boundary conditions with the Pacific.

  4. Voltage dips ride-through capability. Model validation of a resistance-commutated rotor wind turbine generator from in-field testing results

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Guillen, Miguel A.; Paz Comech, M.; Ruiz Guillen, Javier; Giraut Ruso, Elizabeth; Garcia-Gracia, Miguel

    2009-07-01

    The present wind energy penetration into the electrical network has forced system operators to adapt their Grid Codes to this new generation, to prevent unacceptable effects on system safety and reliability. There are several wind turbine models that can be used to study the effects of voltage dips and the corresponding wind turbine responses, but these models need to be validated by comparing their results with the data obtained during field tests. This paper describes the process followed for the validation of a resistance-commutated rotor wind turbine generator from in-field testing results, according to the Spanish procedure for verification, validation and certification of the requirements of the P.O. 12.3 on the response of wind farms in the event of voltage dips. (orig.)

  5. Three dimensional stereolithography models of cancellous bone structures from µCT data: testing and validation of finite element results.

    Science.gov (United States)

    Dobson, C A; Sisias, G; Phillips, R; Fagan, M J; Langton, C M

    2006-04-01

    Stereolithography (STL) models of complex cancellous bone structures have been produced from three-dimensional micro-computed tomography data sets of human cancellous bone histological samples from four skeletal sites. The STL models have been mechanically tested and the derived stiffness compared with that predicted by finite element analysis. The results show a strong correlation (R² = 0.941) between the predicted and calculated stiffnesses of the structures and show promise for the use of STL as an additional technique to complement the use of finite element models, for the assessment of the mechanical properties of complex cancellous bone structures.

  6. Roll-up of validation results to a target application.

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard Guy

    2013-09-01

    Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at different conditions and configurations than those for an intended application, with each experiment testing only part of the physics relevant for the application. The purpose of the present work is to develop methodology to roll up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.
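
    One way to read the roll-up idea: attribute the observed model-experiment discrepancies to per-physics bias terms via each experiment's sensitivities, then propagate those terms through the application's own sensitivities. A toy sketch with hypothetical numbers; the paper's meta-model is built from the simulation models themselves and is not reproduced here.

      import numpy as np

      # rows: experiments, columns: physics components exercised
      S = np.array([[1.0, 0.0, 0.2],    # expt 1: mostly physics A
                    [0.1, 1.0, 0.0],    # expt 2: mostly physics B
                    [0.5, 0.5, 0.1]])   # expt 3: coupled A+B
      d = np.array([0.8, -0.3, 0.4])    # measured minus predicted, per expt

      bias, *_ = np.linalg.lstsq(S, d, rcond=None)  # per-physics bias terms
      s_app = np.array([0.7, 0.9, 0.6])             # application sensitivities
      print("rolled-up application discrepancy:", s_app @ bias)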

  7. Validation of satellite SAR offshore wind speed maps to in-situ data, microscale and mesoscale model results

    DEFF Research Database (Denmark)

    Hasager, C.B.; Astrup, Poul; Barthelmie, R.J.

    2002-01-01

    planning of offshore wind farms. The report describes the validation analysis in detail for three sites in Denmark, Italy and Egypt. The site in Norway is analyzed by the Nansen Environmental and Remote Sensing Centre (NERSC). Wind speed maps and wind direction maps from Earth Observation data recorded...... band in which the SAR wind speed observations have a strong negative bias. The bathymetry of Horns Rev combined with tidal currents give rise to bias in the SAR wind speed maps near areas of shallow, complex bottom topography in some cases. A total of 16 cases were analyzed for Horns Rev. For Maddalena...

  8. Bed slope effects on turbulent wave boundary layers: 1. Model validation and quantification of rough-turbulent results

    DEFF Research Database (Denmark)

    Fuhrman, David R.; Fredsøe, Jørgen; Sumer, B. Mutlu

    2009-01-01

    A numerical model solving incompressible Reynolds-averaged Navier-Stokes equations, combined with a two-equation k-omega turbulence closure, is used to study converging-diverging effects from a sloping bed on turbulent (oscillatory) wave boundary layers. Bed shear stresses from the numerical model...

  9. Validating a Modified Gagnean Concept-Acquisition Model: The Results of an Experimental Study Using Art-Related Content.

    Science.gov (United States)

    Stahl, Robert J.

    This paper describes a study to determine whether students can, with appropriate instructional materials, develop and apply a knowledge of art concepts. An opening section reviews research on concept acquisition. This study utilized a concept acquisition model developed in 1970 by R. Gagne. Gagne's model proposed that concepts are learned through…

  10. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because the multiple experimental protocols employed in different laboratories presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis of experimental neurobiology.

  11. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels, and for temperature gradients with respect to space in the flow direction, which are investigated by direct infrared imaging, showing that even at low-current operation such gradients are present in fuel cell operation and therefore should be considered by a PEMFC model, since large coolant flow rates are limited by the high pressure drops they induce in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones, with good qualitative and quantitative agreement. The combination of accuracy and low computational time allows for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)

  12. Testing and validating environmental models

    Science.gov (United States)

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series
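
    The "explicit comparison against the available alternatives" argued for here has a standard quantitative form: a skill score against a benchmark model. A minimal sketch:

      import numpy as np

      def skill_vs_benchmark(obs, model, benchmark):
          """1 = perfect, 0 = no better than the benchmark (e.g. climatology
          or persistence), negative = worse than the benchmark."""
          obs = np.asarray(obs, float)
          mse = lambda pred: ((np.asarray(pred, float) - obs) ** 2).mean()
          return 1.0 - mse(model) / mse(benchmark)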

  13. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  14. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta

    2014-12-01

    The primary objective of the study was to quantitatively test the DART model, which, despite being one of the most popular representations of the co-creation concept, has so far been studied almost solely with qualitative methods. To this end, the researchers developed a multiple-measurement scale and employed it in interviewing managers. The statistical evidence for the adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much conceptual overlap to be an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model where co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  15. Results of the AVATAR project for the validation of 2D aerodynamic models with experimental data of the DU95W180 airfoil with unsteady flap

    Science.gov (United States)

    Ferreira, C.; Gonzalez, A.; Baldacchino, D.; Aparicio, M.; Gómez, S.; Munduate, X.; Garcia, N. R.; Sørensen, J. N.; Jost, E.; Knecht, S.; Lutz, T.; Chassapogiannis, P.; Diakakis, K.; Papadakis, G.; Voutsinas, S.; Prospathopoulos, J.; Gillebaart, T.; van Zuijlen, A.

    2016-09-01

    The FP7 AdVanced Aerodynamic Tools for lArge Rotors - Avatar project aims to develop and validate advanced aerodynamic models, to be used in integral design codes for the next generation of large-scale wind turbines (10-20 MW). One of the approaches towards reaching rotors of 10-20 MW size is the application of flow control devices, such as flaps. In Task 3.2: Development of aerodynamic codes for modelling of flow devices on aerofoils and rotors of the Avatar project, aerodynamic codes are benchmarked and validated against the experimental data of a DU95W180 airfoil in steady and unsteady flow, for different angle-of-attack and flap settings, including unsteady oscillatory trailing-edge-flap motion, carried out within the framework of WP3: Models for Flow Devices and Flow Control, Task 3.1: CFD and Experimental Database. The aerodynamic codes are: AdaptFoil2D, Foil2W, FLOWer, MaPFlow, OpenFOAM, Q3UIC, ATEFlap. The codes include unsteady Eulerian CFD simulations with grid deformation, panel models and indicial engineering models. The validation cases correspond to 18 steady flow cases and 42 unsteady flow cases, for varying angle of attack, flap deflection and reduced frequency, with free and forced transition. The validation of the models shows varying degrees of agreement across models and flow cases.

  16. Comparison and validation of HEU and LEU modeling results to HEU experimental benchmark data for the Massachusetts Institute of Technology MITR reactor.

    Energy Technology Data Exchange (ETDEWEB)

    Newton, T. H.; Wilson, E. H; Bergeron, A.; Horelik, N.; Stevens, J. (Nuclear Engineering Division); (MIT Nuclear Reactor Lab.)

    2011-03-02

    The Massachusetts Institute of Technology Reactor (MITR-II) is a research reactor in Cambridge, Massachusetts designed primarily for experiments using neutron beam and in-core irradiation facilities. It delivers a neutron flux comparable to current LWR power reactors in a compact 6 MW core using Highly Enriched Uranium (HEU) fuel. In the framework of its non-proliferation policies, the international community presently aims to minimize the amount of nuclear material available that could be used for nuclear weapons. In this geopolitical context, most research and test reactors both domestic and international have started a program of conversion to the use of Low Enriched Uranium (LEU) fuel. A new type of LEU fuel based on an alloy of uranium and molybdenum (UMo) is expected to allow the conversion of U.S. domestic high performance reactors like the MITR-II reactor. Towards this goal, comparisons of MCNP5 Monte Carlo neutronic modeling results for HEU and LEU cores have been performed. Validation of the model has been based upon comparison to HEU experimental benchmark data for the MITR-II. The objective of this work was to demonstrate a model which could represent the experimental HEU data, and therefore could provide a basis to demonstrate LEU core performance. This report presents an overview of MITR-II model geometry and material definitions which have been verified, and updated as required during the course of validation to represent the specifications of the MITR-II reactor. Results of calculations are presented for comparisons to historical HEU start-up data from 1975-1976, and to other experimental benchmark data available for the MITR-II Reactor through 2009. This report also presents results of steady state neutronic analysis of an all-fresh LEU fueled core. Where possible, HEU and LEU calculations were performed for conditions equivalent to HEU experiments, which serves as a starting point for safety analyses for conversion of MITR-II from the use of HEU

  17. Validation of Magnetospheric Magnetohydrodynamic Models

    Science.gov (United States)

    Curtis, Brian

    Magnetospheric magnetohydrodynamic (MHD) models are commonly used for both prediction and modeling of Earth's magnetosphere. To date, very little validation has been performed to determine their limits, uncertainties, and differences. In this work, we performed a comprehensive analysis, applying several validation techniques commonly used in the atmospheric sciences to MHD-based models of Earth's magnetosphere for the first time. The validation techniques of parameter variability/sensitivity analysis and comparison to other models were used on the OpenGGCM, BATS-R-US, and SWMF magnetospheric MHD models to answer several questions about how these models compare. The questions include: (1) the difference between the models' predictions prior to and following a reversal of Bz in the upstream interplanetary magnetic field (IMF) from positive to negative, (2) the influence of the preconditioning duration, and (3) the differences between models under extreme solar wind conditions. A differencing visualization tool was developed and used to address these three questions. We find: (1) For a reversal in IMF Bz from positive to negative, the OpenGGCM magnetopause is closest to Earth, as it has the weakest magnetic pressure near Earth. The differences in magnetopause positions between BATS-R-US and SWMF are explained by the influence of the ring current, which is included in SWMF. Densities are highest for SWMF and lowest for OpenGGCM. The OpenGGCM tail currents differ significantly from BATS-R-US and SWMF; (2) A longer preconditioning time allowed the magnetosphere to relax more, giving different positions for the magnetopause with all three models before the IMF Bz reversal. There were differences greater than 100% for all three models before the IMF Bz reversal. The differences in the current sheet region for the OpenGGCM were small after the IMF Bz reversal. The BATS-R-US and SWMF differences decreased after the IMF Bz reversal to near zero; (3) For extreme conditions in the solar

  18. Validity Examination of EFQM’s Results by DEA Models

    Directory of Open Access Journals (Sweden)

    Ben Mustafa, Adli

    2008-01-01

    The European Foundation for Quality Management (EFQM) model is one of the models that deal with assessing the functioning of an organization, using self-assessment to measure concepts, some of which are increasingly qualitative. Consequently, complete understanding and correct usage of this model in an organization depend on a comprehensive recognition of the model and of the different strategies of self-assessment. The process of self-assessment on the basis of this model in an organization requires experienced auditors, which helps reduce the likelihood of scores being wrongly assigned to the criteria and subcriteria. In this paper, some of the weaknesses of the EFQM model are studied first; then, using the input-output structure governing the model together with Data Envelopment Analysis, a method is offered to recognize a lack of proportion between the Enablers and the Results of an organization, which may occur due to problems and obstacles hidden in the heart of the organization.

  19. Quantitative model validation techniques: new insights

    CERN Document Server

    Ling, You

    2012-01-01

    This paper develops new insights into quantitative methods for the validation of computational model prediction. Four types of methods are investigated, namely classical and Bayesian hypothesis testing, a reliability-based method, and an area metric-based method. Traditional Bayesian hypothesis testing is extended based on interval hypotheses on distribution parameters and equality hypotheses on probability distributions, in order to validate models with deterministic/stochastic output for given inputs. Two types of validation experiments are considered - fully characterized (all the model/experimental inputs are measured and reported as point values) and partially characterized (some of the model/experimental inputs are not measured or are reported as intervals). Bayesian hypothesis testing can minimize the risk in model selection by properly choosing the model acceptance threshold, and its results can be used in model averaging to avoid Type I/II errors. It is shown that Bayesian interval hypothesis testing...
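
    Of the four method types, the area metric is particularly compact to implement: it is the area between the model's output CDF and the empirical CDF of the data, which for two samples coincides with the 1-Wasserstein distance. A minimal sketch with hypothetical samples:

      import numpy as np
      from scipy.stats import wasserstein_distance

      model_samples = np.random.default_rng(1).normal(10.0, 1.0, 5000)
      test_data = [9.2, 10.5, 11.1, 9.8, 10.9]  # hypothetical measurements
      print("area metric:", wasserstein_distance(model_samples, test_data))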

  20. Results of the AVATAR project for the validation of 2D aerodynamic models with experimental data of the DU95W180 airfoil with unsteady flap

    DEFF Research Database (Denmark)

    Ferreira, C.; Gonzalez, A.; Baldacchino, D.;

    2016-01-01

    The FP7 AdVanced Aerodynamic Tools for lArge Rotors - Avatar project aims to develop and validate advanced aerodynamic models, to be used in integral design codes for the next generation of large scale wind turbines (10-20MW). One of the approaches towards reaching rotors for 10-20MW size...... is the application of flow control devices, such as flaps. In Task 3.2: Development of aerodynamic codes for modelling of flow devices on aerofoils and rotors of the Avatar project, aerodynamic codes are benchmarked and validated against the experimental data of a DU95W180 airfoil in steady and unsteady flow......, for different angle of attack and flap settings, including unsteady oscillatory trailing-edge-flap motion, carried out within the framework of WP3: Models for Flow Devices and Flow Control, Task 3.1: CFD and Experimental Database. The aerodynamics codes are: AdaptFoil2D, Foil2W, FLOWer, MaPFlow, OpenFOAM, Q3UIC...

  1. Validation of the Hot Strip Mill Model

    Energy Technology Data Exchange (ETDEWEB)

    Richard Shulkosky; David Rosberg; Jerrud Chapman

    2005-03-30

    The Hot Strip Mill Model (HSMM) is off-line, PC-based software originally developed by the University of British Columbia (UBC) and the National Institute of Standards and Technology (NIST) under the AISI/DOE Advanced Process Control Program. The HSMM was developed to predict the temperatures, deformations, microstructure evolution and mechanical properties of steel strip or plate rolled in a hot mill. INTEG process group inc. undertook the current task of enhancing and validating the technology. With the support of 5 North American steel producers, INTEG process group tested and validated the model using actual operating data from the steel plants and enhanced the model to improve prediction results.

  2. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... Metro became operational in autumn 2002. We observed that forecasts from the demand sub-models agree well with the data from the 2000 national travel survey, with the mode choice forecasts in particular being a good match with the observed modal split. The results of the 2000 car assignment model...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  3. Validation for a recirculation model.

    Science.gov (United States)

    LaPuma, P T

    2001-04-01

    Recent Clean Air Act regulations designed to reduce volatile organic compound (VOC) emissions have placed new restrictions on painting operations. Treating large volumes of air which contain dilute quantities of VOCs can be expensive. Recirculating some fraction of the air allows an operator to comply with environmental regulations at reduced cost. However, there is a potential impact on employee safety because indoor pollutants will inevitably increase when air is recirculated. A computer model was developed, written in Microsoft Excel 97, to predict compliance costs and indoor air concentration changes with respect to changes in the level of recirculation for a given facility. The model predicts indoor air concentrations based on product usage and mass balance equations. This article validates the recirculation model using data collected from a C-130 aircraft painting facility at Hill Air Force Base, Utah. Air sampling data and air control cost quotes from vendors were collected for the Hill AFB painting facility and compared to the model's predictions. The model's predictions for strontium chromate and isocyanate air concentrations were generally between the maximum and minimum air sampling points with a tendency to predict near the maximum sampling points. The model's capital cost predictions for a thermal VOC control device ranged from a 14 percent underestimate to a 50 percent overestimate of the average cost quotes. A sensitivity analysis of the variables is also included. The model is demonstrated to be a good evaluation tool in understanding the impact of recirculation.
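
    The mass-balance reasoning in this abstract can be made concrete with a very small steady-state sketch. The emission rate, airflow, recirculation fraction and control-device efficiency below are invented numbers, not values from the Hill AFB study, and a single well-mixed zone is a simplification of the article's model.

    ```python
    # Steady-state, well-mixed mass balance with recirculation (illustrative only):
    # a fraction R of the supply air Q is returned through a control device with
    # removal efficiency eta, so Q*C = G + Q*R*(1-eta)*C at steady state.
    def booth_concentration(G, Q, R, eta):
        """Concentration (mg/m^3) for emission G (mg/min) and supply Q (m^3/min)."""
        return G / (Q * (1.0 - R * (1.0 - eta)))

    G, Q, eta = 500.0, 2000.0, 0.95       # assumed emission, airflow, efficiency
    for R in (0.0, 0.5, 0.8):             # increasing recirculation fraction
        print(f"R = {R:.1f}: C = {booth_concentration(G, Q, R, eta):.3f} mg/m^3")
    ```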

  4. Design and validation of a specialized training model for tissue bank personnel as a result of the European Quality System for Tissue Banking (EQSTB) project.

    Science.gov (United States)

    Kaminski, A; Uhrynowska-Tyszkiewicz, I; Miranda, B; Navarro, A; Manyalich, M

    2007-11-01

    The main objective of the European Quality System for Tissue Banking (EQSTB) project was to analyze, across different working areas, the factors that may influence final tissue quality and safety for transplantation, providing greater benefit to recipients. Fifteen national organizations and tissue establishments from 12 European countries took part in this project. The Sanco-EQSTB project was organized in four Working Groups, each focused on a specific area. The Standards Working Group analyzed the different standards and guides used in various European tissue banks as quality and safety systems. The Registry Working Group created a Tissue Registry through a multinational European network database. The Education Working Group created a specialized training model for tissue bank personnel. The Audit Working Group created a European model of auditing for tissue establishments. The aim of this article is to describe the activities of Working Group 3 in designing and validating a specialized training model for tissue bank personnel that could become the approved education system recommended by European Union members.

  5. Methods for Validation and Intercomparison of Remote Sensing and In situ Ice Water Measurements: Case Studies from CRYSTAL-FACE and Model Results

    Science.gov (United States)

    Sayres, D.S.; Pittman, J. V.; Smith, J. B.; Weinstock, E. M.; Anderson, J. G.; Heymsfield, G.; Li, L.; Fridlind, A.; Ackerman, A. S.

    2004-01-01

    Remote sensing observations, such as those from AURA, are necessary to understand the role of cirrus in determining the radiative and humidity budgets of the upper troposphere. Using these measurements quantitatively requires comparisons with in situ measurements that have previously been validated. However, a direct comparison of remote and in situ measurements is difficult due to the requirement that the spatial and temporal overlap be sufficient to guarantee that both instruments are measuring the same air parcel. As difficult as this might be for gas phase intercomparisons, cloud inhomogeneities significantly exacerbate the problem for cloud ice water content measurements. The CRYSTAL-FACE mission provided an opportunity to assess how well such intercomparisons can be performed and to establish flight plans that will be necessary for validation of future satellite instruments. During CRYSTAL-FACE, remote and in situ instruments were placed on different aircraft (NASA's ER-2 and WB-57), and the two planes flew in tandem so that the in situ payload flew in the field of view of the remote instruments. We show here that, even with this type of careful flight planning, it is not always possible to guarantee that remote and in situ instruments are viewing the same air parcel. We use ice water data derived from the in situ Harvard Total Water (HV-TW) instrument and the remote Goddard Cloud Radar System (CRS) and show that agreement between HV-TW and CRS is a strong function of the horizontal separation and the time delay between the aircraft transects. We also use a cloud model to simulate possible trajectories through a cloud and evaluate the use of statistical analysis in determining the agreement between the two instruments. This type of analysis should guide flight planning for future intercomparison efforts, whether for aircraft or satellite-borne instrumentation.
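
    The kind of separation-dependent agreement analysis described above can be sketched as follows. The pairing rule, time window, separation bins and synthetic data are all assumptions for illustration; this is not the CRYSTAL-FACE analysis itself.

    ```python
    import numpy as np

    # Illustrative sketch: pair in situ and remote ice water content (IWC)
    # samples that are close in time, then summarize their agreement as a
    # function of horizontal separation. Inputs are time (s), along-track
    # position (km) and IWC for each platform.
    def agreement_vs_separation(t_in, x_in, iwc_in, t_re, x_re, iwc_re,
                                dt_max=60.0, bins=(0, 1, 5, 10, 50)):
        pairs = []
        for t, x, w in zip(t_in, x_in, iwc_in):
            j = int(np.argmin(np.abs(t_re - t)))   # nearest remote sample in time
            if abs(t_re[j] - t) <= dt_max and iwc_re[j] > 0:
                pairs.append((abs(x_re[j] - x), w, iwc_re[j]))
        pairs = np.array(pairs)
        for lo, hi in zip(bins[:-1], bins[1:]):
            sel = (pairs[:, 0] >= lo) & (pairs[:, 0] < hi)
            if sel.any():
                rel = (pairs[sel, 1] - pairs[sel, 2]) / pairs[sel, 2]
                print(f"{lo}-{hi} km: n={sel.sum()}, "
                      f"RMS rel. diff = {np.sqrt(np.mean(rel ** 2)):.2f}")

    rng = np.random.default_rng(0)
    t = np.arange(0.0, 600.0, 10.0)                       # synthetic demo data
    iwc = 10.0 + rng.normal(0, 1, t.size)
    x_re = t * 0.2 + rng.uniform(0, 20, t.size)           # remote track offset
    iwc_re = iwc * (1 + 0.2 * rng.normal(0, 1, t.size))   # noisy remote retrieval
    agreement_vs_separation(t, t * 0.2, iwc, t + 3.0, x_re, iwc_re)
    ```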

  6. Feature extraction for structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois [Los Alamos National Laboratory; Farrar, Charles [Los Alamos National Laboratory; Park, Gyuhae [Los Alamos National Laboratory; Nishio, Mayuko [UNIV OF TOKYO; Worden, Keith [UNIV OF SHEFFIELD; Takeda, Nobuo [UNIV OF TOKYO

    2010-11-08

    This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing response features extracted from experimental data and numerical outputs, validation and uncertainty quantification of a numerical model containing uncertain parameters can be realized. In this study, the applicability of some response features to model validation is first discussed using measured data from a simple test-bed structure and the associated numerical simulations of these experiments. Issues that must be considered include sensitivity, dimensionality, type of response, and the presence or absence of measurement noise in the response. Furthermore, we illustrate a method for comparing multivariate feature vectors for statistical model validation. Results show that the outlier detection technique using the Mahalanobis distance metric can be used as an effective and quantifiable technique for selecting appropriate model parameters. However, in this process, one must consider not only the sensitivity of the features being used, but also the correlation of the parameters being compared.
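
    A minimal sketch of the Mahalanobis-distance check mentioned in this abstract: feature vectors from candidate model parameterizations are scored against the distribution of experimentally derived feature vectors, and large distances flag implausible parameter sets. The feature dimensions and data here are synthetic stand-ins, not the test-bed measurements.

    ```python
    import numpy as np

    # Score candidate model feature vectors against the experimental feature
    # distribution; a small Mahalanobis distance means the candidate's features
    # are statistically consistent with the measured responses.
    def mahalanobis_distance(experiment, candidates):
        mu = experiment.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(experiment, rowvar=False))
        d = candidates - mu
        return np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))

    rng = np.random.default_rng(0)
    experiment = rng.normal(0.0, 1.0, size=(50, 3))        # measured feature vectors
    candidates = np.array([[0.1, -0.2, 0.3],               # plausible parameter set
                           [4.0, 4.0, 4.0]])               # outlier parameter set
    print(mahalanobis_distance(experiment, candidates))    # small = consistent
    ```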

  7. Software Validation via Model Animation

    Science.gov (United States)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
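
    The comparison step can be sketched in a few lines. Here `spec_f` stands in for an output evaluated from the formal specification (e.g., via PVSio) and `impl_f` for the software implementation; both, and the tolerance, are placeholders rather than the actual PVS tooling.

    ```python
    import math
    import random

    # Run the formal-model stand-in and the implementation stand-in on the same
    # inputs and report every input where they disagree beyond a tolerance.
    def animate_and_compare(spec_f, impl_f, inputs, tol=1e-9):
        """Return the inputs on which spec and implementation disagree."""
        return [x for x in inputs
                if not math.isclose(spec_f(x), impl_f(x), rel_tol=tol, abs_tol=tol)]

    inputs = [random.uniform(-1e3, 1e3) for _ in range(10_000)]
    print(animate_and_compare(lambda x: x * x, lambda x: x ** 2, inputs))  # expect []
    ```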

  8. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  9. Obstructive lung disease models: what is valid?

    Science.gov (United States)

    Ferdinands, Jill M; Mannino, David M

    2008-12-01

    Use of disease simulation models has led to scrutiny of model methods and demand for evidence that models credibly simulate health outcomes. We sought to describe recent obstructive lung disease simulation models and their validation. Medline and EMBASE were used to identify obstructive lung disease simulation models published from January 2000 to June 2006. Publications were reviewed to assess model attributes and four types of validation: first-order (verification/debugging), second-order (comparison with studies used in model development), third-order (comparison with studies not used in model development), and predictive validity. Six asthma and seven chronic obstructive pulmonary disease models were identified. Seven (54%) models included second-order validation, typically by comparing observed outcomes to simulations of source study cohorts. Seven (54%) models included third-order validation, in which modeled outcomes were usually compared qualitatively for agreement with studies independent of the model. Validation endpoints included disease prevalence, exacerbation, and all-cause mortality. Validation was typically described as acceptable, despite near-universal absence of criteria for judging adequacy of validation. Although over half of recent obstructive lung disease simulation models report validation, inconsistencies in validation methods and lack of detailed reporting make assessing adequacy of validation difficult. For simulation modeling to be accepted as a tool for evaluating clinical and public health programs, models must be validated to credibly simulate health outcomes of interest. Defining the required level of validation and providing guidance for quantitative assessment and reporting of validation are important future steps in promoting simulation models as practical decision tools.

  10. Cognitive Enhancement in Infants Associated with Increased Maternal Fruit Intake During Pregnancy: Results from a Birth Cohort Study with Validation in an Animal Model.

    Science.gov (United States)

    Bolduc, Francois V; Lau, Amanda; Rosenfelt, Cory S; Langer, Steven; Wang, Nan; Smithson, Lisa; Lefebvre, Diana; Alexander, R Todd; Dickson, Clayton T; Li, Liang; Becker, Allan B; Subbarao, Padmaja; Turvey, Stuart E; Pei, Jacqueline; Sears, Malcolm R; Mandhane, Piush J

    2016-06-01

    In-utero nutrition is an under-studied aspect of cognitive development. Fruit has been an important dietary constituent for early hominins and humans. Among 808 eligible CHILD-Edmonton sub-cohort subjects, 688 (85%) had 1-year cognitive outcome data. We found that each maternal daily serving of fruit (sum of fruit plus 100% fruit juice) consumed during pregnancy was associated with a 2.38 point increase in 1-year cognitive development (95% CI 0.39, 4.37; p<0.05). Consistent with this, we found 30% higher learning Performance Index (PI) scores in Drosophila offspring from parents who received 30% fruit juice supplementation prenatally (PI: 85.7; SE 1.8; p<0.05) compared to the offspring of standard-diet parents (PI: 65.0; SE 3.4). Using the Drosophila model, we also show that the cyclic adenosine monophosphate (cAMP) pathway may be a major regulator of this effect, as prenatal fruit-associated cognitive enhancement was blocked in Drosophila rutabaga mutants with reduced Ca(2+)-calmodulin-dependent adenylyl cyclase. Moreover, gestation is a critical time for this effect, as postnatal fruit intake did not enhance cognitive performance in either humans or Drosophila. Our study associates increased fruit consumption during pregnancy with significant increases in infant cognitive performance. Validation in Drosophila helps control for potential participant bias or unmeasured confounders.

  11. Cognitive Enhancement in Infants Associated with Increased Maternal Fruit Intake During Pregnancy: Results from a Birth Cohort Study with Validation in an Animal Model

    Directory of Open Access Journals (Sweden)

    Francois V. Bolduc

    2016-06-01

    Full Text Available In-utero nutrition is an under-studied aspect of cognitive development. Fruit has been an important dietary constituent for early hominins and humans. Among 808 eligible CHILD-Edmonton sub-cohort subjects, 688 (85%) had 1-year cognitive outcome data. We found that each maternal daily serving of fruit (sum of fruit plus 100% fruit juice) consumed during pregnancy was associated with a 2.38 point increase in 1-year cognitive development (95% CI 0.39, 4.37; p < 0.05). Consistent with this, we found 30% higher learning Performance Index (PI) scores in Drosophila offspring from parents who received 30% fruit juice supplementation prenatally (PI: 85.7; SE 1.8; p < 0.05) compared to the offspring of standard-diet parents (PI: 65.0; SE 3.4). Using the Drosophila model, we also show that the cyclic adenosine monophosphate (cAMP) pathway may be a major regulator of this effect, as prenatal fruit-associated cognitive enhancement was blocked in Drosophila rutabaga mutants with reduced Ca2+-calmodulin-dependent adenylyl cyclase. Moreover, gestation is a critical time for this effect, as postnatal fruit intake did not enhance cognitive performance in either humans or Drosophila. Our study associates increased fruit consumption during pregnancy with significant increases in infant cognitive performance. Validation in Drosophila helps control for potential participant bias or unmeasured confounders.

  12. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.

  13. On the validation of an Eulerian model with the Fukushima Nuclear Power Plant (FNPP) accident. Global and local results for Europe and Japan

    Energy Technology Data Exchange (ETDEWEB)

    Evangeliou, Nikolaos; Balkanski, Yves; Cozic, Anne [CEA-CNRS-UVSQ UMR 8212, IPSL/LSCE - Laboratoire des Sciences du Climat et de l' Environnement, L' Orme des Merisiers, 91191 Gif-sur-Yvette Cedex (France); Florou, Heleni; Eleftheriadis, Konstantinos; Kritidis, Panayotis [NCSR ' Demokritos' , Institute of Nuclear and Radiological Sciences and Technology, Energy and Safety (INRASTES), Environmental Radioactivity Laboratory, 15310 Athens (Greece)

    2014-07-01

    A large debate about the exact emissions after the Fukushima NPP accident is still ongoing more than 3 years after the original disaster. Terada et al. (2012) reported the total release of 137Cs to be 13 PBq (1 PBq = 10^15 Bq), based on inverse modelling using Japanese data only, whereas the IRSN reported releases of 137Cs to be 20.6 PBq (IRSN, 2011). In the present study, we used the emission inventories for 137Cs and 133Xe reported by Stohl et al. (2012), estimated by inverse modelling using the CTBTO (Comprehensive Nuclear Test Ban Treaty Organisation) and Japanese networks (36.7 PBq of 137Cs and 15.3 EBq of 133Xe). For the simulations of the accident, three different versions of the LMDZORINCA model were used: a regular one with a grid resolution of 2.50 deg. x 1.27 deg. for the global comparison with the CTBTO network (19 and 39 vertical layers), and a zoom version over Europe and Asia (0.45 deg. x 0.51 deg. for 19 levels), obtained by 'stretching' the grid using the same number of grid points, to assess what happened in Greece and Japan. Cesium isotopes were treated as sub-micronic aerosols and 133Xe as a passive tracer within the model, whereas several other radionuclides were estimated from reported isotopic ratios. Our results for the global assessment fit the observations well. They differ by about 0.04% from the measurements for 137Cs, and by around 40% for xenon. The most significant deviations were observed for the northernmost stations, due to both scavenging processes and transport over the Arctic. Scattered measurements of several radionuclides from Japan were adopted from the literature (Shinonaga et al., 2014). The comparison showed good model performance, although some isotopes are miscalculated, which suggests that the reported isotopic ratios might be somewhat biased. Finally, in Greece, a few measurements of 131I, 134Cs and 137Cs were adopted from Potiriadis et al

  14. Validity of covariance models for the analysis of geographical variation

    DEFF Research Database (Denmark)

    Guillot, Gilles; Schilling, Rene L.; Porcu, Emilio

    2014-01-01

    attention lately and show that the conditions under which they are valid mathematical models have been overlooked so far. 3. We provide rigorous results for the construction of valid covariance models in this family. 4. We also outline how to construct alternative covariance models for the analysis...

  15. Mechatronic Design, Dynamic Modeling and Results of a Satellite Flight Simulator for Experimental Validation of Satellite Attitude Determination and Control Schemes in 3-Axis

    Directory of Open Access Journals (Sweden)

    M.A. Mendoza-Bárcenas

    2014-06-01

    Full Text Available This paper describes the integration and implementation of a satellite flight simulator based on an air bearing system, which was designed and instrumented in our laboratory to evaluate and perform research in the field of Attitude Determination and Control Systems for satellites, using the hardware-in-the-loop technique. The satellite flight simulator comprises two main blocks: an instrumented mobile platform and an external computer executing custom-made Matlab® software. The first block is an air bearing system containing an FPGA-based on-board computer with capabilities to integrate digital architectures for data acquisition from inertial navigation sensors, control of actuators, and communications data handling. The second block is an external personal computer, which runs in parallel Matlab®-based algorithms for attitude determination and control. Both blocks are linked by means of radio modems. The paper also presents an analysis of the satellite flight simulator dynamics in order to obtain its equation of motion, which allows a better understanding of the simulator's behavior. In addition, the paper shows experimental results on the automated tracking of the satellite flight simulator based on a virtual reality model developed in Matlab®. It also depicts two different versions of FPGA-based on-board computers developed in-house to integrate embedded and polymorphic digital architectures for spacecraft applications. Finally, the paper shows successful experimental results for an attitude control test using the satellite flight simulator based on a linear control law.

  16. On validation of multibody musculoskeletal models

    DEFF Research Database (Denmark)

    Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper;

    2012-01-01

    This paper reviews the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need...... for improvement of the validation of multibody musculoskeletal models are pointed out and directions for future research in the field are proposed. It is our hope that a more structured approach to model validation can help to improve the credibility of musculoskeletal models....

  17. Model validation of channel zapping quality

    OpenAIRE

    Kooij, R.; Nicolai, F.; Ahmed, K.; Brunnström, K.

    2009-01-01

    In an earlier paper we showed that perceived quality of channel zapping is related to the perceived quality of download time of web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call a lean-forward experiment and gave the rule-of-thumb result that the zapping time must be less than 0.43 sec to be good (> 3.5 on the MOS scale). To validate the model we have done new subjective ...

  18. Optimal Data Split Methodology for Model Validation

    CERN Document Server

    Morrison, Rebecca; Terejanu, Gabriel; Miki, Kenji; Prudhomme, Serge

    2011-01-01

    The decision to incorporate cross-validation into validation processes of mathematical models raises an immediate question - how should one partition the data into calibration and validation sets? We answer this question systematically: we present an algorithm to find the optimal partition of the data subject to certain constraints. While doing this, we address two critical issues: 1) that the model be evaluated with respect to predictions of a given quantity of interest and its ability to reproduce the data, and 2) that the model be highly challenged by the validation set, assuming it is properly informed by the calibration set. This framework also relies on the interaction between the experimentalist and/or modeler, who understand the physical system and the limitations of the model; the decision-maker, who understands and can quantify the cost of model failure; and the computational scientists, who strive to determine if the model satisfies both the modeler's and decision maker's requirements. We also note...
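
    A toy sketch of the underlying idea, under assumptions that are ours rather than the paper's: among random candidate partitions, keep the one whose validation points lie farthest from the calibration set in input space, so that the validation set "highly challenges" the model. The scoring rule below is a simple heuristic, not the constrained optimization algorithm of the paper.

    ```python
    import numpy as np

    # Choose a calibration/validation split deliberately rather than at random:
    # score each candidate split by the mean distance of every validation point
    # to its nearest calibration point, and keep the most "challenging" one.
    def challenging_split(X, n_val, n_candidates=200, seed=0):
        rng = np.random.default_rng(seed)
        best_score, best_idx = -np.inf, None
        for _ in range(n_candidates):
            idx = rng.permutation(len(X))
            val, cal = idx[:n_val], idx[n_val:]
            dists = np.linalg.norm(X[val][:, None, :] - X[cal][None, :, :], axis=-1)
            score = dists.min(axis=1).mean()
            if score > best_score:
                best_score, best_idx = score, (cal, val)
        return best_idx

    X = np.random.default_rng(1).uniform(size=(100, 2))   # synthetic input points
    cal, val = challenging_split(X, n_val=20)
    print(len(cal), len(val))
    ```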

  19. Information systems validation using formal models

    Directory of Open Access Journals (Sweden)

    Azadeh Sarram

    2014-03-01

    Full Text Available During the past few years, there has been growing interest in using the Unified Modeling Language (UML) to capture functional requirements. However, the lack of a tool to check the accuracy and logic of diagrams in this language makes a formal model indispensable. In this study, the primary UML model of a system is converted to a colored Petri net in order to examine the precision of the model. For this purpose, first, definitions of priority and implementation tags for the UML activity diagram are provided; the diagram is then turned into a colored Petri net. Second, the proposed model translates the tags into net transitions, and monitoring is used to control the system characteristics. Finally, an executable model of the UML activity diagram is provided so that the designer can simulate the model, using the simulation results to detect and refine problems in the model. By checking the results, we find that the proposed method enhances the authenticity and accuracy of early models and increases the rate of system validation compared with previous methods.

  20. [Catalonia's primary healthcare accreditation model: a valid model].

    Science.gov (United States)

    Davins, Josep; Gens, Montserrat; Pareja, Clara; Guzmán, Ramón; Marquet, Roser; Vallès, Roser

    2014-07-01

    There are few experiences of accreditation models validated by primary care teams (EAP). The aim of this study was to detail the process of design, development, and subsequent validation of the consensus EAP accreditation model of Catalonia. An Operating Committee of the Health Department of Catalonia revised the models proposed by the European Foundation for Quality Management, the Joint Commission International and the Institut Català de la Salut, and proposed 628 essential standards to the technical group (25 experts in primary care and quality of care) in order to establish consensus standards. The consensus document was piloted in 30 EAPs for the purpose of validating the contents, testing the standards and identifying evidence. Finally, we carried out a survey to assess acceptance and validation of the document. The technical group agreed on a total of 414 essential standards. The pilot selected a total of 379. Mean compliance with the standards of the final document in the 30 EAPs was 70.4%. The standards on results showed the worst fulfilment percentage. The survey showed that 83% of the EAPs found the model useful and 78% found the content of the accreditation manual suitable as a tool to assess the quality of the EAP and identify opportunities for improvement. On the downside, they highlighted its complexity and laboriousness. We have a model that fits the reality of the EAP and covers all issues relevant to the functioning of an excellent EAP. The model developed in Catalonia is easy to understand.

  1. Validation of systems biology models

    NARCIS (Netherlands)

    Hasdemir, D.

    2015-01-01

    The paradigm shift from qualitative to quantitative analysis of biological systems brought a substantial number of modeling approaches to the stage of molecular biology research. These include but certainly are not limited to nonlinear kinetic models, static network models and models obtained by the

  2. Results from the First Validation Phase of CAP code

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul [FNC Tech., SNU, Seoul (Korea, Republic of); Ha, Sang Jun; Choi, Hoon [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    The second stage of the Safety Analysis Code Development for Nuclear Power Plants project was launched in April 2010 and is scheduled to run through 2012, with a scope of work covering code validation through licensing preparation. As part of this project, CAP (Containment Analysis Package) will follow the same procedures. CAP's validation works are organized hierarchically into four validation steps using: 1) fundamental phenomena; 2) principal phenomena (mixing and transport) and components in containment; 3) demonstration tests in small, medium and large facilities and International Standard Problems; 4) comparison with other containment codes such as GOTHIC or CONTEMPT. In addition, collecting the experimental data related to containment phenomena and constructing the corresponding database is one of the major works during the second stage of this project. From the validation of fundamental phenomena, the current capability and the future improvements of the CAP code can be identified. For this purpose, simple but significant problems, which have exact analytical solutions, were selected and calculated for validation of fundamental phenomena. In this paper, some results of validation problems for the selected fundamental phenomena are summarized and discussed briefly.

  3. SDG-based Model Validation in Chemical Process Simulation

    Institute of Scientific and Technical Information of China (English)

    张贝克; 许欣; 马昕; 吴重光

    2013-01-01

    Signed directed graph (SDG) theory provides algorithms and methods that can be applied directly to chemical process modeling and analysis to validate simulation models, and is a basis for the development of a software environment that can automate the validation activity. This paper concentrates on the pretreatment stage of model validation. We use the validation scenarios and standard sequences generated by a well-established SDG model to validate the trends fitted from the simulation model. The results are helpful for finding potential problems, assessing possible bugs in the simulation model, and solving them effectively. A case study on a simulation model of a boiler is presented to demonstrate the effectiveness of this method.

  4. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam University; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  5. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling gives a syntactic structure to source and target models; however, semantic requirements also have to be imposed on them. A given transformation is sound when the source and target models fulfill both the syntactic and the semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  6. Dynamic FE simulation of four-story steel frame modeled by solid elements and its validation using results of full-scale shake-table test

    OpenAIRE

    Miyamura, Tomoshi; Yamashita, Takuzo; AKIBA,HIROSHI; Ohsaki, Makoto

    2015-01-01

    Dynamic finite element analyses of a four-story steel building frame modeled as a fine mesh of solid elements are performed using E-Simulator, which is a parallel finite element analysis software package for precisely simulating collapse behaviors of civil and building structures. E-Simulator is under development at the National Research Institute for Earth Science and Disaster Prevention (NIED), Japan. A full-scale shake-table test for a four-story frame was conducted using E-Defense at NIED...

  7. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
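
    The permutation-test step recommended above can be sketched generically: refitting the model on outcome labels that have been randomly permuted breaks any real association, so the permuted scores approximate the null distribution of the performance metric. The scorer below is a stand-in, not the LASSO/NTCP pipeline of the study.

    ```python
    import numpy as np

    # Permutation test for a validation metric: `fit_and_score` fits the model
    # and returns a performance score (e.g., AUC); the p-value is the fraction
    # of permuted-label scores at least as good as the observed score.
    def permutation_p_value(X, y, fit_and_score, n_perm=1000, seed=0):
        rng = np.random.default_rng(seed)
        observed = fit_and_score(X, y)
        null = np.array([fit_and_score(X, rng.permutation(y)) for _ in range(n_perm)])
        return (1 + np.sum(null >= observed)) / (1 + n_perm)

    X = np.random.default_rng(1).normal(size=200)     # toy predictor
    y = X + np.random.default_rng(2).normal(size=200) # correlated outcome
    score = lambda X, y: abs(np.corrcoef(X, y)[0, 1]) # stand-in performance score
    print(permutation_p_value(X, y, score))           # small p = real association
    ```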

  8. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    Many statistical models in ecology follow the state space paradigm. For such models, the important step of model validation rarely receives as much attention as estimation or hypothesis testing, perhaps due to lack of available algorithms and software. Model validation is often based on a naive...... for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and whether observations...... are continuous or discrete. With both simulated data, and a real data set related to geolocation of seals, we demonstrate both the potential and the limitations of the techniques. Our results fill a need for convenient methods for validating a state space model, or alternatively, rejecting it while indicating...
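
    A minimal sketch of validation by one-step-ahead prediction, in the spirit described above but for the simplest possible case: a scalar linear Gaussian state space model (random walk plus noise) with parameters assumed known. If the model is adequate, the standardized innovations should resemble independent N(0,1) draws; the abstract's TMB-based approach generalizes this idea to non-linear, multivariate models.

    ```python
    import numpy as np

    # Kalman filter for a local-level model: state x_t = x_{t-1} + w, w ~ N(0,q);
    # observation y_t = x_t + e, e ~ N(0,r). Returns standardized innovations.
    def one_step_residuals(y, q=0.1, r=0.5):
        m, p = 0.0, 10.0              # diffuse-ish initial state mean and variance
        resid = []
        for obs in y:
            p += q                    # predict step
            s = p + r                 # innovation variance
            resid.append((obs - m) / np.sqrt(s))
            k = p / s                 # Kalman gain; update step
            m += k * (obs - m)
            p *= (1 - k)
        return np.array(resid)

    rng = np.random.default_rng(2)
    x = np.cumsum(rng.normal(0, np.sqrt(0.1), 200))         # simulated true state
    res = one_step_residuals(x + rng.normal(0, np.sqrt(0.5), 200))
    print(res.mean(), res.std())      # should be near 0 and 1 if the model fits
    ```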

  9. The Mistra experiment for field containment code validation first results

    Energy Technology Data Exchange (ETDEWEB)

    Caron-Charles, M.; Blumenfeld, L. [CEA Saclay, 91 - Gif sur Yvette (France)

    2001-07-01

    The MISTRA facility is a large-scale experiment designed for the validation of multi-dimensional thermal-hydraulics codes. A short description of the facility, the set-up of the instrumentation and the test program are presented. Then the first experimental results, studying helium injection into the containment, and the corresponding calculations are detailed. (author)

  10. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  11. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    D J Ewins

    2000-06-01

    In this paper, a review is presented of the various methods which are available for the purpose of performing a systematic comparison and correlation between two sets of vibration data. In the present case, the application of interest is in conducting this correlation process as a prelude to model correlation or updating activity.

  12. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the issue of application of security policy models, we are concerned with the implementation of a certain policy, taking into consideration the standards required, through attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy in an information technology environment is a condition sine qua non for the strategic management process of any organization. Regarding this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer. The Chief Security Officer, in doing so, strives to keep technologies up to date in order to meet all-inclusive business continuity planning policies. Therefore, for such a policy to be effective, it has to be entirely embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo City. This sampling was based on the representativeness of such industries, particularly with regard to each one's implementation of information technology in the region. The study concludes by presenting evidence of the discriminant validity of four key dimensions of the security policy: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. On analyzing the Cronbach's alpha structure of these security items, the results not only attest that the capacity of those industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.

  13. Assessment model validity document FARF31

    Energy Technology Data Exchange (ETDEWEB)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria [Kemakta Konsult AB, Stockholm (Sweden)

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to model flow and radionuclide transport in the near field and far field of a deep repository, including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far-field code FARF31 is kept relatively simple and calculates transport using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among its advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spreads in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31, it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual-porosity continuum approach, where it is assumed that the rock can be divided into two distinct domains with different types of porosity
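
    As an illustrative aside (not FARF31 itself, which uses a dual-porosity formulation with matrix diffusion), a stream-tube transport calculation of this general kind can be sketched as 1-D advection-dispersion with first-order radioactive decay; all parameter values below are invented.

    ```python
    import numpy as np

    # Explicit finite-difference solution of 1-D advection-dispersion-decay
    # along a stream tube: dc/dt = -v dc/dx + D d2c/dx2 - lam*c.
    L_len, n, v, D, lam = 100.0, 200, 1.0, 0.5, 1e-3   # m, -, m/yr, m2/yr, 1/yr
    dx = L_len / n
    dt = 0.4 * min(dx / v, dx**2 / (2 * D))            # rough stability limit
    c = np.zeros(n)
    for _ in range(int(200 / dt)):                     # simulate 200 years
        c[0] = 1.0                                     # constant-concentration inlet
        adv = -v * (c[1:-1] - c[:-2]) / dx             # first-order upwind advection
        disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        c[1:-1] += dt * (adv + disp - lam * c[1:-1])
        c[-1] = c[-2]                                  # free outflow boundary
    print(c[n // 2])                                   # concentration at mid-tube
    ```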

  14. Validation of Numerical Shallow Water Models for Tidal Lagoons

    Energy Technology Data Exchange (ETDEWEB)

    Eliason, D.; Bourgeois, A.

    1999-11-01

    An analytical solution is presented for the case of a stratified, tidally forced lagoon. This solution, especially its energetics, is useful for the validation of numerical shallow water models under stratified, tidally forced conditions. The utility of the analytical solution for validation is demonstrated for a simple finite difference numerical model. A comparison is presented of the energetics of the numerical and analytical solutions in terms of the convergence of model results to the analytical solution with increasing spatial and temporal resolution.
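
    The validation pattern described here, checking that a numerical model converges to an analytical solution as resolution increases, can be sketched on a toy 1-D advection problem standing in for the stratified, tidally forced lagoon.

    ```python
    import numpy as np

    # Grid-refinement check: solve periodic 1-D advection with first-order
    # upwinding, compare against the exact translated solution, and confirm
    # the error shrinks at roughly first order as the grid is refined.
    def advect_error(n, c=1.0, t_end=0.5):
        x = np.linspace(0.0, 1.0, n, endpoint=False)
        dx = 1.0 / n
        dt = 0.5 * dx / c                               # CFL number 0.5
        u = np.sin(2 * np.pi * x)
        steps = int(round(t_end / dt))
        for _ in range(steps):
            u = u - c * dt / dx * (u - np.roll(u, 1))   # upwind update
        exact = np.sin(2 * np.pi * (x - c * steps * dt))
        return np.abs(u - exact).max()

    for n in (50, 100, 200, 400):
        print(n, advect_error(n))   # error should roughly halve with each doubling
    ```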

  15. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and in accordance with the technical work plan (Ref. 17).

  16. Application of A Global-To-Beam Irradiance Model to the Satellite-Based NASA GEWEX SRB Data and Validation of the Results against the Ground-Based BSRN Data

    Science.gov (United States)

    Zhang, T.; Stackhouse, P. W., Jr.; Chandler, W.; Hoell, J. M.; Westberg, D. J.

    2012-12-01

    normal and diffuse irradiances are derived. The input variables include, among others, surface pressure, precipitable water, geopotential height of the surface, 10-meter temperature, and specific humidity from GEOS, and AOD at 700 nm derived from the MATCH (Model for Atmospheric Transport and CHemistry) data. The DirIndex model is modified to accommodate ranges of the input variables wider than specified in the original DirIndex model. The results are then validated against their BSRN counterparts. Compared with an earlier empirical model for monthly means, the results from the modified DirIndex model show appreciable improvement.

  17. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, L.F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  18. Full-Scale Cookoff Model Validation Experiments

    Energy Technology Data Exchange (ETDEWEB)

    McClelland, M A; Rattanapote, M K; Heimdahl, E R; Erikson, W E; Curran, P O; Atwood, A I

    2003-11-25

    This paper presents the experimental results of the third and final phase of a cookoff model validation effort. In this phase of the work, two generic Heavy Wall Penetrators (HWP) were tested in two heating orientations. Temperature and strain gage data were collected over the entire test period. Predictions for time and temperature of reaction were made prior to release of the live data. Predictions were comparable to the measured values and were highly dependent on the established boundary conditions. Both HWP tests failed at a weld located near the aft closure of the device. More than 90 percent of unreacted explosive was recovered in the end heated experiment and less than 30 percent recovered in the side heated test.

  19. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  20. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques for model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
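
    A minimal sketch of the kind of performance assessment discussed in this abstract, using synthetic data in place of the management-study example: fit a logistic regression on a training split and report discrimination (AUC) and calibration (Brier score) on a held-out validation split.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score, brier_score_loss
    from sklearn.model_selection import train_test_split

    # Synthetic data: binary outcome generated from a known logistic relation.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))
    logit = X @ np.array([1.0, -0.5, 0.2])
    y = (rng.random(500) < 1 / (1 + np.exp(-logit))).astype(int)

    # Holdout validation: fit on the training split, assess on the unseen split.
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression().fit(X_tr, y_tr)
    p = model.predict_proba(X_va)[:, 1]
    print("AUC:", roc_auc_score(y_va, p), "Brier:", brier_score_loss(y_va, p))
    ```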

  1. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Whitmore, J.; Kaffine, L.; Blair, N.; Dobos, A. P.

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.

  2. Regimes of validity for balanced models

    Science.gov (United States)

    Gent, Peter R.; McWilliams, James C.

    1983-07-01

    Scaling analyses are presented which delineate the atmospheric and oceanic regimes of validity for the family of balanced models described in Gent and McWilliams (1983a). The analyses follow and extend the classical work of Charney (1948) and others. The analyses use three non-dimensional parameters which represent the flow scale relative to the Earth's radius, the dominance of turbulent or wave-like processes, and the dominant component of the potential vorticity. For each regime, the models that are accurate both at leading order and through at least one higher order of accuracy in the appropriate small parameter are then identified. In particular, it is found that members of the balanced family are the appropriate models of higher-order accuracy over a broad range of parameter regimes. Examples are also given of particular atmospheric and oceanic phenomena which are in the regimes of validity for the different balanced models.

  3. Validation of Hadronic Models in GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Koi, Tatsumi; Wright, Dennis H.; /SLAC; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; /CERN; Heikkinen, Aatos; /Helsinki Inst. of Phys.; Truscott, Pete; Lei, Fan; /QinetiQ; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  4. Validation of hadronic models in GEANT4

    CERN Document Server

    Koi, Tatsumi; Folger, Günter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; Heikkinen, Aatos; Truscott, Pete; Lei, Fan; Wellisch, Hans-Peter

    2007-01-01

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  5. Validation of Model Forecasts of the Ambient Solar Wind

    Science.gov (United States)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  6. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  7. Model validation of channel zapping quality

    Science.gov (United States)

    Kooij, Robert; Nicolai, Floris; Ahmed, Kamal; Brunnström, Kjell

    2009-02-01

    In an earlier paper we showed that perceived quality of channel zapping is related to the perceived quality of download time in web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call a lean-forward experiment, and it gave the rule-of-thumb result that the zapping time must be less than 0.43 s to be good (> 3.5 on the MOS scale). To validate the model we have done new subjective experiments. These experiments included lean-backward zapping, i.e. sitting on a sofa with a remote control. The subjects are more forgiving in this case and the requirement could be relaxed to 0.67 s. We also conducted subjective experiments where the zapping times vary. We found that the MOS rating decreases if zapping delay times vary. In our experiments we assumed uniformly distributed delays, where the variance cannot be larger than the mean delay. We found that in order to obtain a MOS rating of at least 3.5, the maximum allowed variance, and thus also the maximum allowed mean zapping delay, is 0.46 s.
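
    For illustration, a logarithmic delay-to-MOS mapping of the kind suggested by ITU-T Rec. G.1030 can be sketched as below; the coefficients A and B are hypothetical, chosen only so the curve reproduces the paper's lean-forward rule of thumb (MOS 3.5 at about 0.43 s), and are not taken from the paper:

```python
import math

# Hypothetical coefficients for a logarithmic waiting-time-to-MOS mapping.
A, B = 2.5, 1.2

def zapping_mos(delay_s: float) -> float:
    """Map a channel-zapping delay (seconds) to a 1..5 MOS estimate."""
    mos = A - B * math.log(delay_s)
    return max(1.0, min(5.0, mos))

for d in (0.2, 0.43, 0.67, 1.0, 2.0):
    print(f"{d:4.2f} s -> MOS {zapping_mos(d):.2f}")
```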

  8. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    Science.gov (United States)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of the resulting landslide susceptibility maps, with particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated from different inventories, classifiers and predictors differed in appearance, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful to expose spatially varying inconsistencies in the modelling results, while additionally providing evidence for slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for
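
    A minimal sketch of spatial cross-validation for a susceptibility model, assuming synthetic predictors and hypothetical spatial blocks; the paper's actual predictor sets, inventories and block design are not reproduced here:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupKFold

# Illustrative stand-in: X holds predictors (slope, lithology, ...), y is
# landslide presence/absence, and `blocks` assigns each cell to a spatial
# zone so that entire zones are held out together (spatial cross-validation).
rng = np.random.default_rng(42)
n = 2000
X = rng.normal(size=(n, 5))
y = (X[:, 0] + rng.normal(0, 1, n) > 0).astype(int)
blocks = rng.integers(0, 5, n)  # 5 hypothetical spatial blocks

aucs = []
for train, test in GroupKFold(n_splits=5).split(X, y, groups=blocks):
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X[train], y[train])
    p = model.predict_proba(X[test])[:, 1]
    aucs.append(roc_auc_score(y[test], p))
print("spatial CV AUROC: %.3f +/- %.3f" % (np.mean(aucs), np.std(aucs)))
```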

  9. Empirical data validation for model building

    Science.gov (United States)

    Kazarian, Aram

    2008-03-01

    Optical Proximity Correction (OPC) has become an integral and critical part of process development for advanced technologies with challenging k1 requirements. OPC solutions in turn require stable, predictive models to be built that can project the behavior of all structures. These structures must comprehend all geometries that can occur in the layout in order to define the optimal corrections by feature, and thus enable a manufacturing process with acceptable margin. The model is built upon two main component blocks. First is knowledge of the process conditions, which includes the optical parameters (e.g. illumination source, wavelength, lens characteristics, etc.) as well as mask definition, resist parameters and process film stack information. Second is the empirical critical dimension (CD) data collected using this process on specific test features, the results of which are used to fit and validate the model and to project resist contours for all allowable feature layouts. The quality of the model is therefore highly dependent on the integrity of the process data collected for this purpose. Since the test pattern suite generally extends to below the resolution limit that the process can support with adequate latitude, the CD measurements collected can often be quite noisy, with marginal signal-to-noise ratios. In order for the model to be reliable and a best representation of the process behavior, it is necessary to scrutinize the empirical data to ensure that it is not dominated by measurement noise or flyer/outlier points. The primary approach for generating a clean, smooth and dependable empirical data set should be replicated measurement sampling, which can statistically reduce measurement noise by averaging. However, it can often be impractical to collect the amount of data needed to ensure a clean data set by this method. An alternate approach is studied in this paper to further smooth the measured data by means of curve fitting to identify remaining
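
    The two-step recipe described above - average replicated measurements, then smooth the averaged trend by curve fitting and flag residual outliers - can be sketched as follows on synthetic CD-versus-pitch data; the actual test structures and fitting procedure are not public:

```python
import numpy as np

# Toy stand-in for empirical CD-vs-pitch data with replicated measurements.
pitch = np.repeat(np.linspace(100, 400, 16), 5)   # 16 pitches x 5 replicates
cd_true = 50 + 0.05 * (pitch - 100)
cd_meas = cd_true + np.random.default_rng(1).normal(0, 1.5, pitch.size)

# Step 1: average replicates to suppress measurement noise.
pitches = np.unique(pitch)
cd_avg = np.array([cd_meas[pitch == p].mean() for p in pitches])

# Step 2: smooth the averaged trend by low-order curve fitting, then flag
# residual outliers before the data is passed to model calibration.
coef = np.polyfit(pitches, cd_avg, deg=2)
cd_fit = np.polyval(coef, pitches)
resid = cd_avg - cd_fit
flagged = pitches[np.abs(resid) > 2 * resid.std()]
print("flagged pitches:", flagged)
```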

  10. Functional state modelling approach validation for yeast and bacteria cultivations

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-01-01

    In this paper, the functional state modelling approach is validated for modelling of the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely primary product synthesis state, mixed oxidative state and secondary product synthesis state. Parameter identification procedures for the different local models are performed using genetic algorithms. The simulation results show a high degree of adequacy of the models describing these functional states for both S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This constitutes a strong structural verification of the functional state modelling theory, not only for a set of yeast cultivations but also for a bacteria cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach. PMID:26740778

  11. Functional state modelling approach validation for yeast and bacteria cultivations.

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-09-03

    In this paper, the functional state modelling approach is validated for modelling of the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely primary product synthesis state, mixed oxidative state and secondary product synthesis state. Parameter identification procedures for the different local models are performed using genetic algorithms. The simulation results show a high degree of adequacy of the models describing these functional states for both S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This constitutes a strong structural verification of the functional state modelling theory, not only for a set of yeast cultivations but also for a bacteria cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach.
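
    The two records above describe parameter identification for local models by genetic algorithms. A rough sketch of the idea, with a Monod-type growth law standing in for the state-specific kinetics and SciPy's differential evolution standing in for the genetic algorithm (all data and parameters synthetic):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution

# Synthetic biomass "measurements" for one functional state.
t_data = np.linspace(0, 10, 20)
x_data = 0.1 * np.exp(0.35 * t_data)

def simulate(mu_max, ks, t):
    """Integrate a Monod-type biomass/substrate model."""
    def rhs(t, y):
        x, s = y
        mu = mu_max * s / (ks + s)
        return [mu * x, -2.0 * mu * x]  # fixed yield for brevity
    sol = solve_ivp(rhs, (t[0], t[-1]), [0.1, 20.0], t_eval=t)
    return sol.y[0]

def objective(p):
    return np.sum((simulate(p[0], p[1], t_data) - x_data) ** 2)

# Evolutionary search over (mu_max, Ks), mimicking a GA-style identification.
result = differential_evolution(objective, bounds=[(0.05, 1.0), (0.01, 5.0)], seed=0)
print("identified mu_max, Ks:", result.x)
```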

  12. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a network

  13. Measurements for validation of high voltage underground cable modelling

    DEFF Research Database (Denmark)

    Bak, Claus Leth; Gudmundsdottir, Unnur Stella; Wiechowski, Wojciech Tomasz

    2009-01-01

    This paper discusses studies concerning cable modelling for long high voltage AC cable lines. In investigating the possibilities of using long cables instead of overhead lines, the simulation results must be trustworthy. Therefore a model validation is of great importance. This paper describes...

  14. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
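
    The core hydrostatic bookkeeping such a model rests on can be sketched as below; the fluid stack, densities and pressures are hypothetical, and the real HCM additionally tracks interface movement, temperature and nitrogen compressibility:

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(bottom_pressure_pa, layers):
    """Back out wellhead pressure from a cavern-side pressure by subtracting
    the hydrostatic head of the stacked fluid columns in the well.

    `layers` is a list of (density_kg_m3, height_m) from bottom to top.
    """
    p = bottom_pressure_pa
    for rho, h in layers:
        p -= rho * G * h
    return p

# Hypothetical well: brine below, crude oil above, nitrogen cap at top.
layers = [(1200.0, 300.0), (850.0, 250.0), (180.0, 450.0)]
print("wellhead pressure [MPa]:", wellhead_pressure(12.0e6, layers) / 1e6)
```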

  15. Aromatic interactions impact ligand binding and function at serotonin 5-HT2C G protein-coupled receptors: receptor homology modelling, ligand docking, and molecular dynamics results validated by experimental studies

    Science.gov (United States)

    Córdova-Sintjago, Tania; Villa, Nancy; Fang, Lijuan; Booth, Raymond G.

    2014-02-01

    The serotonin (5-hydroxytryptamine, 5-HT) 5-HT2 G protein-coupled receptor (GPCR) family consists of types 2A, 2B, and 2C that share ∼75% transmembrane (TM) sequence identity. Agonists for 5-HT2C receptors are under development for psychoses, whereas at 5-HT2A receptors antipsychotic effects are associated with antagonists; in fact, 5-HT2A agonists can cause hallucinations and 5-HT2B agonists cause cardiotoxicity. It is known that 5-HT2A TM6 residues W6.48, F6.51, and F6.52 impact ligand binding and function; however, ligand interactions with these residues at the 5-HT2C receptor have not been reported. To predict and validate molecular determinants for 5-HT2C-specific activation, results from receptor homology modelling, ligand docking, and molecular dynamics simulation studies were compared with experimental results for ligand binding and function at wild-type and W6.48A, F6.51A, and F6.52A point-mutated 5-HT2C receptors.

  16. Validation of Orthorectified Interferometric Radar Imagery and Digital Elevation Models

    Science.gov (United States)

    Smith, Charles M.

    2004-01-01

    This work was performed under NASA's Verification and Validation (V&V) Program as an independent check of data supplied by EarthWatch, Incorporated, through the Earth Science Enterprise Scientific Data Purchase (SDP) Program. This document serves as the basis for reporting results associated with validation of orthorectified interferometric radar imagery and digital elevation models (DEM). This validation covers all datasets provided under the first campaign (Central America & Virginia Beach) plus three earlier missions (Indonesia, Red River, and Denver), for a total of 13 missions.

  17. Model validation in soft systems practice

    Energy Technology Data Exchange (ETDEWEB)

    Checkland, P. [Univ. of Lancaster (United Kingdom)

    1995-03-01

    The concept of 'a model' usually evokes the connotation 'model of part of the real world'. That is an almost automatic response. It makes sense especially in relation to the way the concept has been developed and used in natural science. Classical operational research (OR), with its scientific aspirations, and systems engineering, use the concept in the same way and in addition use models as surrogates for the real world, on which experimentation is cheap. In these fields the key feature of a model is representativeness. In soft systems methodology (SSM) models are not of part of the world; they are only relevant to debate about the real world and are used in a cyclic learning process. The paper shows how the different concepts of validation in classical OR and SSM lead to a way of sharply defining the nature of 'soft OR'. 21 refs.

  18. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper the computational model used in the simulations is described and the results, which were...

  19. Resampling procedures to validate dendro-auxometric regression models

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Regression analysis is widely used in several sectors of forest research. The validation of a dendro-auxometric model is a basic step in the building of the model itself. The more a model resists attempts to demonstrate its groundlessness, the more its reliability increases. In the last decades many new theories that exploit the computational speed of modern computers have been formulated. Here we show the results obtained by the application of a bootstrap resampling procedure as a validation tool.
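
    A minimal sketch of bootstrap validation for a dendro-auxometric regression, using synthetic diameter-height data; the original paper's exact resampling scheme may differ:

```python
import numpy as np

# Refit the model on bootstrap resamples and evaluate each refit on the
# full sample - a simple optimism-style check of regression stability.
rng = np.random.default_rng(7)
dbh = rng.uniform(10, 60, 80)                     # diameter at breast height
height = 1.3 + 0.8 * dbh + rng.normal(0, 3, 80)   # synthetic tree heights

def fit_and_rmse(x_fit, y_fit, x_eval, y_eval):
    b1, b0 = np.polyfit(x_fit, y_fit, 1)          # simple linear model
    pred = b0 + b1 * x_eval
    return np.sqrt(np.mean((y_eval - pred) ** 2))

apparent = fit_and_rmse(dbh, height, dbh, height)
boot = []
for _ in range(1000):
    idx = rng.integers(0, dbh.size, dbh.size)     # resample with replacement
    boot.append(fit_and_rmse(dbh[idx], height[idx], dbh, height))
print(f"apparent RMSE {apparent:.2f}, bootstrap RMSE {np.mean(boot):.2f}")
```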

  20. Model validation studies of solar systems, Phase III. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lantz, L.J.; Winn, C.B.

    1978-12-01

    Results obtained from a validation study of the TRNSYS, SIMSHAC, and SOLCOST solar system simulation and design programs are presented. Also included are comparisons between the FCHART and SOLCOST solar system design programs and some changes that were made to the SOLCOST program. Finally, results obtained from the analysis of several solar radiation models are presented. Separate abstracts were prepared for ten papers.

  1. Bibliometric Modeling Processes and the Empirical Validity of Lotka's Law.

    Science.gov (United States)

    Nicholls, Paul Travis

    1989-01-01

    Examines the elements involved in fitting a bibliometric model to empirical data, proposes a consistent methodology for applying Lotka's law, and presents the results of an empirical test of the methodology. The results are discussed in terms of the validity of Lotka's law and the suitability of the proposed methodology. (49 references) (CLB)
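
    Lotka's law states that the number of authors y(x) contributing x papers follows y(x) = C / x^n, with n close to 2 in the classical formulation. One common fitting recipe, sketched below on synthetic counts, is least squares on the log-log form; the cited study examines which of several such methodological choices are defensible:

```python
import numpy as np

# Synthetic author-productivity counts: y[i] authors published x[i] papers.
x = np.arange(1, 11)
y = np.array([520, 135, 62, 34, 22, 15, 11, 8, 7, 5])

# Least-squares fit of log y = log C - n log x.
slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
n, C = -slope, np.exp(intercept)
print(f"estimated exponent n = {n:.2f}, C = {C:.0f}")

# Goodness of fit is then typically judged with, e.g., a Kolmogorov-Smirnov
# test against the fitted distribution.
```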

  2. Results from the Deep Space One Technology Validation Mission

    Science.gov (United States)

    Rayman, M.; Varghese, P.; Lehman, D.; Livesay, L.

    1999-01-01

    Launched on October 25, 1998, Deep Space 1 (DS1) is the first mission of NASA's New Millennium Program, chartered to flight validate high-risk, new technologies important for future space and Earth science programs.

  3. Model validation for karst flow using sandbox experiments

    Science.gov (United States)

    Ye, M.; Pacheco Castro, R. B.; Tao, X.; Zhao, J.

    2015-12-01

    The study of flow in karst is complex due to the high heterogeneity of the porous media. Several approaches have been proposed in the literature to overcome the natural complexity of karst. Among these are the single continuum, the double continuum and the discrete network of conduits coupled with the single continuum. Several mathematical and computing models are available in the literature for each approach. In this study one computer model has been selected for each category to validate its usefulness for modelling flow in karst using a sandbox experiment. The models chosen are: Modflow 2005, Modflow CFPV1 and Modflow CFPV2. A sandbox experiment was implemented in such a way that all the parameters required for each model can be measured. The sandbox experiment was repeated several times under different conditions. The model validation is carried out by comparing the results of the model simulation with the real data. This validation allows us to compare the accuracy of each model and its applicability to karst. We will also be able to evaluate whether the results of the complex models improve substantially on those of the simple models, especially because some models require complex parameters that are difficult to measure in the real world.

  4. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    Science.gov (United States)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  5. Bayesian structural equation modeling method for hierarchical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Jiang Xiaomo [Department of Civil and Environmental Engineering, Vanderbilt University, Box 1831-B, Nashville, TN 37235 (United States)], E-mail: xiaomo.jiang@vanderbilt.edu; Mahadevan, Sankaran [Department of Civil and Environmental Engineering, Vanderbilt University, Box 1831-B, Nashville, TN 37235 (United States)], E-mail: sankaran.mahadevan@vanderbilt.edu

    2009-04-15

    A building block approach to model validation may proceed through various levels, such as material to component to subsystem to system, comparing model predictions with experimental observations at each level. Usually, experimental data becomes scarce as one proceeds from lower to higher levels. This paper presents a structural equation modeling approach to make use of the lower-level data for higher-level model validation under uncertainty, integrating several components: lower-level data, higher-level data, computational model, and latent variables. The method proposed in this paper uses latent variables to model two sets of relationships, namely, the computational model to system-level data, and lower-level data to system-level data. A Bayesian network with Markov chain Monte Carlo simulation is applied to represent the two relationships and to estimate the influencing factors between them. Bayesian hypothesis testing is employed to quantify the confidence in the predictive model at the system level, and the role of lower-level data in the model validation assessment at the system level. The proposed methodology is implemented for hierarchical assessment of three validation problems, using discrete observations and time-series data.
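
    One small ingredient of such a Bayesian assessment - updating belief about a model's systematic bias from sparse system-level data and asking whether "no bias" remains credible - can be sketched with a conjugate normal update; all numbers are synthetic, and the paper's latent-variable network and MCMC machinery are far richer than this single building block:

```python
import numpy as np

# Conjugate normal-normal update of the model's systematic bias.
prior_mean, prior_var = 0.0, 4.0         # vague prior on the bias
errors = np.array([0.8, 1.1, 0.4, 0.9])  # observed prediction errors
obs_var = 0.25                           # assumed known observation variance

n = errors.size
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mean = post_var * (prior_mean / prior_var + errors.sum() / obs_var)
lo = post_mean - 1.96 * np.sqrt(post_var)
hi = post_mean + 1.96 * np.sqrt(post_var)
print(f"bias 95% credible interval: [{lo:.2f}, {hi:.2f}]")
print("zero bias is", "credible" if lo <= 0.0 <= hi else "not credible")
```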

  6. The earth radiation budget experiment: Early validation results

    Science.gov (United States)

    Smith, G. Louis; Barkstrom, Bruce R.; Harrison, Edwin F.

    The Earth Radiation Budget Experiment (ERBE) consists of radiometers on a dedicated spacecraft in a 57° inclination orbit, which has a precessional period of 2 months, and on two NOAA operational meteorological spacecraft in near-polar orbits. The radiometers include scanning narrow field-of-view (FOV) and nadir-looking wide and medium FOV radiometers covering the ranges 0.2 to 5 μm and 5 to 50 μm, and a solar monitoring channel. This paper describes the validation procedures and preliminary results. Each of the radiometer channels underwent extensive ground calibration, and the instrument packages include in-flight calibration facilities which, to date, show negligible changes of the instruments in orbit, except for gradual degradation of the suprasil dome of the shortwave wide FOV (about 4% per year). Measurements of the solar constant by the solar monitors, wide FOV, and medium FOV radiometers of two spacecraft agree to a fraction of a percent. Intercomparisons of the wide and medium FOV radiometers with the scanning radiometers show agreement of 1 to 4%. The multiple ERBE satellites are acquiring the first global measurements of regional-scale diurnal variations in the Earth's radiation budget. These diurnal variations are verified by comparison with high temporal resolution geostationary satellite data. Other principal investigators of the ERBE Science Team are: R. Cess, SUNY, Stony Brook; J. Coakley, NCAR; C. Duncan, M. King and A. Mecherikunnel, Goddard Space Flight Center, NASA; A. Gruber and A.J. Miller, NOAA; D. Hartmann, U. Washington; F.B. House, Drexel U.; F.O. Huck, Langley Research Center, NASA; G. Hunt, Imperial College, London U.; R. Kandel and A. Berroir, Laboratory of Dynamic Meteorology, Ecole Polytechnique; V. Ramanathan, U. Chicago; E. Raschke, U. of Cologne; W.L. Smith, U. of Wisconsin; and T.H. Vonder Haar, Colorado State U.

  7. Contact Modelling in Resistance Welding, Part II: Experimental Validation

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Contact algorithms in resistance welding presented in the previous paper are experimentally validated in the present paper. In order to verify the mechanical contact algorithm, two types of experiments, i.e. sandwich upsetting of circular, cylindrical specimens and compression tests of discs with a solid ring projection towards a flat ring, are carried out at room temperature. The complete algorithm, involving not only the mechanical model but also the thermal and electrical models, is validated by projection welding experiments. The experimental results are in satisfactory agreement...

  8. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational fluid dynamics (CFD) spectral gas radiation models for air combustion are therefore out of their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review about fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models against high-resolution spectral measurements in a gas cell, with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example of a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and gives a ranking of the proposed models based on their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 μm of water vapor and carbon dioxide, in the temperature range from 727 °C to 1500 °C and at different concentrations, were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement, with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as a reference model for the

  9. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made by using computer models. A number of models have been developed for this purpose, but as validation of these models is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain.

  10. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    Science.gov (United States)

    Apostolakis, J.; Asai, M.; Bagulya, A.; Brown, J. M. C.; Burkhardt, H.; Chikuma, N.; Cortes-Giraldo, M. A.; Elles, S.; Grichine, V.; Guatelli, S.; Incerti, S.; Ivanchenko, V. N.; Jacquemier, J.; Kadri, O.; Maire, M.; Pandola, L.; Sawkey, D.; Toshito, T.; Urban, L.; Yamashita, T.

    2015-12-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  11. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    CERN Document Server

    Apostolakis, J; Bagulya, A; Brown, J M C; Burkhardt, H; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Grichine, V; Guatelli, S; Incerti, S; Ivanchenko, V N; Jacquemier, J; Kadri, O; Maire, M; Pandola, L; Sawkey, D; Toshito, T; Urban, L; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  13. The Validation of Climate Models: The Development of Essential Practice

    Science.gov (United States)

    Rood, R. B.

    2011-12-01

    It is possible from both a scientific and philosophical perspective to state that climate models cannot be validated. However, with the realization that the scientific investigation of climate change is as much a subject of politics as of science, maintaining this formal notion of "validation" has significant consequences. For example, it relegates the bulk of work of many climate scientists to an exercise of model evaluation that can be construed as ill-posed. Even within the science community this motivates criticism of climate modeling as an exercise of weak scientific practice. Stepping outside of the science community, statements that validation is impossible are used in political arguments to discredit the scientific investigation of climate, to maintain doubt about projections of climate change, and hence, to prohibit the development of public policy to regulate the emissions of greenhouse gases. With the acceptance of the impossibility of validation, scientists often state that the credibility of models can be established through an evaluation process. A robust evaluation process leads to the quantitative description of the modeling system against a standard set of measures. If this process is standardized as institutional practice, then this provides a measure of model performance from one modeling release to the next. It is argued, here, that such a robust and standardized evaluation of climate models can be structured and quantified as "validation." Arguments about the nuanced meaning of validation and evaluation are a subject about which the climate modeling community needs to develop a standard. It does an injustice to a body of science-based knowledge to maintain that validation is "impossible." Rather than following such a premise, which immediately devalues the knowledge base, it is more useful to develop a systematic, standardized approach to robust, appropriate validation. This stands to represent the complexity of the Earth's climate and its

  14. Validation of the filament winding process model

    Science.gov (United States)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diam and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at a ±45° angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  15. Actuarial and actual analysis of surgical results: empirical validation.

    Science.gov (United States)

    Grunkemeier, G L; Anderson, R P; Starr, A

    2001-06-01

    This report validates the use of the Kaplan-Meier (actuarial) method of computing survival curves by comparing 12-year estimates published in 1978 with current assessments. It also contrasts cumulative incidence curves, referred to as "actual" analysis in the cardiac-related literature, with Kaplan-Meier curves for thromboembolism, and demonstrates that the former estimates the percentage of events that will actually occur.
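
    The Kaplan-Meier estimator being validated is easy to state in code. The sketch below implements it directly on synthetic, heavily censored data of the kind described; one minus the KM survival gives the actuarial event curve, which exceeds the cumulative-incidence ("actual") estimate when censoring is heavy:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times: time to event or censoring; events: 1 = event, 0 = censored.
    Returns the distinct event times and the survival probability at each.
    """
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    s, surv, uniq = 1.0, [], np.unique(times[events == 1])
    for t in uniq:
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk          # product-limit update
        surv.append(s)
    return uniq, np.array(surv)

# Synthetic valve-patient data: time to thromboembolism (years), heavily
# censored by death from other causes.
t = [1, 2, 2, 3, 4, 5, 5, 6, 7, 8]
e = [1, 0, 1, 0, 0, 1, 0, 0, 1, 0]
times, surv = kaplan_meier(t, e)
print(dict(zip(times.tolist(), np.round(1 - surv, 3))))  # KM event probability
```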

  16. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on the Signed Directed Graph (SDG) and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to the model. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by carrying out validation for a reactor model.

  17. Model Validation for Shipboard Power Cables Using Scattering Parameters

    Institute of Scientific and Technical Information of China (English)

    Lukas Graber; Diomar Infante; Michael Steurer; William W. Brey

    2011-01-01

    Careful analysis of transients in shipboard power systems is important to achieve long lifetimes of the components in future all-electric ships. In order to accomplish results with high accuracy, it is recommended to validate cable models, as they have a significant influence on the amplitude and frequency spectrum of voltage transients. The authors propose comparison of model and measurement using scattering parameters. These can be easily obtained from measurement and simulation and deliver broadband information about the accuracy of the model. The measurement can be performed using a vector network analyzer. The process to extract scattering parameters from simulation models is explained in detail. Three different simulation models of a 5 kV XLPE power cable have been validated. The chosen approach delivers an efficient tool to quickly estimate the quality of a model.
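
    Assuming the scikit-rf package and hypothetical Touchstone files for the measured and simulated networks, the comparison could be scripted along these lines:

```python
import numpy as np
import skrf as rf  # scikit-rf, assumed available

# Hypothetical file names: a VNA measurement of the cable and S-parameters
# extracted from the simulation model.
measured = rf.Network("cable_measured.s2p")
simulated = rf.Network("cable_simulated.s2p")

# Interpolate the simulation onto the measured frequency axis, then use the
# worst-case deviation of |S21| in dB as a broadband accuracy metric.
simulated = simulated.interpolate(measured.frequency)
dev_db = np.abs(measured.s_db[:, 1, 0] - simulated.s_db[:, 1, 0])
i = int(dev_db.argmax())
print(f"max |S21| deviation: {dev_db[i]:.2f} dB at {measured.f[i] / 1e6:.1f} MHz")
```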

  18. Unit testing, model validation, and biological simulation

    Science.gov (United States)

    Watts, Mark D.; Ghayoomie, S. Vahid; Larson, Stephen D.; Gerkin, Richard C.

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models. PMID:27635225
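
    A model validation test in the sense described differs from a conventional unit test in that it asserts on scientific properties of a simulation's output. A minimal sketch with the standard unittest module, using a stand-in simulation rather than OpenWorm's own APIs:

```python
import unittest
import numpy as np

def simulate_membrane_potential():
    """Stand-in for a model run producing a membrane potential trace (mV)."""
    t = np.linspace(0.0, 0.1, 1000)
    return -65.0 + 100.0 * np.exp(-((t - 0.05) ** 2) / 2e-6)  # one spike

class TestModelValidation(unittest.TestCase):
    """Model-validation tests assert on *scientific* properties of the
    output, not just on code mechanics as conventional unit tests do."""

    def test_resting_potential_close_to_data(self):
        v = simulate_membrane_potential()
        self.assertAlmostEqual(v[0], -65.0, delta=2.0)  # mV, from experiment

    def test_spike_amplitude_physiological(self):
        v = simulate_membrane_potential()
        self.assertTrue(20.0 <= v.max() <= 60.0)  # mV, plausible spike peak

if __name__ == "__main__":
    unittest.main()
```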

  19. The Danish national passenger modelModel specification and results

    DEFF Research Database (Denmark)

    Rich, Jeppe; Hansen, Christian Overgaard

    2016-01-01

    The paper describes the structure of the new Danish National Passenger model and provides on this basis a general discussion of large-scale model design, cost-damping and model validation. The paper aims at providing three main contributions to the existing literature. Firstly, at the general level......, the paper provides a description of a large-scale forecast model with a discussion of the linkage between population synthesis, demand and assignment. Secondly, the paper gives specific attention to model specification and in particular choice of functional form and cost-damping. Specifically we suggest...... a family of logarithmic spline functions and illustrate how it is applied in the model. Thirdly and finally, we evaluate model sensitivity and performance by evaluating the distance distribution and elasticities. In the paper we present results where the spline-function is compared with more traditional...

  20. Validation of Geant4 hadronic physics models at intermediate energies

    Science.gov (United States)

    Banerjee, Sunanda; Geant4 Hadronic Group

    2010-04-01

    GEANT4 provides a number of physics models at intermediate energies (corresponding to incident momenta in the range 1-20 GeV/c). Recently, these models have been validated with existing data from a number of experiments: (1) inclusive proton and neutron production with a variety of beams (π-, π+, p) at different energies between 1 and 9 GeV/c on a number of nuclear targets (from beryllium to uranium); (2) inclusive pion/kaon/proton production from 14.6 GeV/c proton beams on nuclear targets (from beryllium to gold); (3) inclusive pion production from pion beams between 3-13 GeV/c on a number of nuclear targets (from beryllium to lead). The results of simulation/data comparison for different GEANT4 models are discussed in the context of validating the models and determining their usage in physics lists for high energy applications. Due to the increasing number of validations becoming available, and the requirement that they be done at regular intervals corresponding to the GEANT4 release schedule, automated methods of validation are being developed.

  1. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage and selection operator)
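
    A permutation test of the kind investigated can be sketched with scikit-learn, using L1-penalized logistic regression as a stand-in for the paper's LASSO-based NTCP model and synthetic dose/outcome data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import permutation_test_score, StratifiedKFold

# Does the fitted model score better than models fitted to label-shuffled
# data? Predictors and toxicity labels below are synthetic.
rng = np.random.default_rng(3)
X = rng.normal(size=(120, 20))                     # dose/clinical predictors
y = (X[:, 0] - X[:, 1] + rng.normal(0, 1, 120) > 0).astype(int)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
score, perm_scores, pvalue = permutation_test_score(
    model, X, y, cv=StratifiedKFold(5), n_permutations=200,
    scoring="roc_auc", random_state=0)
print(f"CV AUC {score:.3f}, permutation p-value {pvalue:.3f}")
```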

  2. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study were performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data, such as control rod withdrawal tests and loss-of-forced-convection tests.

  3. Development and Validation of a Needs Assessment Model Using Stakeholders Involved in a University Program.

    Science.gov (United States)

    Labrecque, Monique

    1999-01-01

    Developed a needs-assessment model and validated the model with five groups of stakeholders connected with an undergraduate university nursing program in Canada. Used focus groups, questionnaires, a hermeneutic approach, and the magnitude-estimation scaling model to validate the model. Results show that respondents must define need to clarify the…

  4. Validation of models with constant bias: an applied approach

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta

    2014-06-01

    Objective. This paper presents extensions to the statistical validation method based on the procedure of Freese for the case where a model shows constant bias (CB) in its predictions, and illustrates the method with data from a new mechanistic model that predicts weight gain in cattle. Materials and methods. The extensions were the hypothesis tests and maximum anticipated error for the alternative approach, and the confidence interval for a quantile of the distribution of errors. Results. The model evaluated showed CB; once the CB is removed, and with a confidence level of 95%, the magnitude of the error does not exceed 0.575 kg. Therefore, the validated model can be used to predict the daily weight gain of cattle, although it will require an adjustment in its structure, based on the presence of CB, to increase the accuracy of its forecasts. Conclusions. The confidence interval for the 1-α quantile of the distribution of errors after correcting the constant bias allows determining an upper limit for the magnitude of the prediction error and using it to evaluate the evolution of the model in forecasting the system. The confidence interval approach to validating a model is more informative than hypothesis tests for the same purpose.
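
    The constant-bias extension can be illustrated as follows: remove the mean error, then bound the magnitude of the remaining error via a one-sided tolerance bound on an upper quantile (normality assumed). All data below are synthetic, and the exact quantile and confidence level used in the paper may differ:

```python
import numpy as np
from scipy import stats

# Synthetic observed vs. predicted weight gains with a built-in bias.
observed = np.array([412.0, 398.0, 455.0, 430.0, 470.0, 405.0, 442.0, 460.0])
predicted = observed + 12.0 + np.random.default_rng(5).normal(0.0, 4.0, 8)

errors = predicted - observed
cb = errors.mean()                 # constant bias
centered = errors - cb             # bias-corrected errors
n, s = centered.size, centered.std(ddof=1)

# Exact one-sided 95/95 normal tolerance factor for the 95th percentile,
# via the noncentral t distribution.
delta = stats.norm.ppf(0.95) * np.sqrt(n)
k = stats.nct.ppf(0.95, df=n - 1, nc=delta) / np.sqrt(n)
print(f"constant bias {cb:.2f} kg; 95/95 bound on |error| ~ {k * s:.2f} kg")
```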

  5. Propeller aircraft interior noise model utilization study and validation

    Science.gov (United States)

    Pope, L. D.

    1984-01-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  6. Validation of Biomarker-based risk prediction models

    OpenAIRE

    Taylor, Jeremy M.G.; Ankerst, Donna P.; Andridge, Rebecca R.

    2008-01-01

    The increasing availability and use of predictive models to facilitate informed decision making highlights the need for careful assessment of the validity of these models. In particular, models involving biomarkers require careful validation for two reasons: issues with overfitting when complex models involve a large number of biomarkers, and inter-laboratory variation in assays used to measure biomarkers. In this paper we distinguish between internal and external statistical validation. Inte...

  7. Dynamic validation of the Planck/LFI thermal model

    CERN Document Server

    Tomasi, M; Gregorio, A; Colombo, F; Lapolla, M; Terenzi, L; Morgante, G; Bersanelli, M; Butler, R C; Galeotta, S; Mandolesi, N; Maris, M; Mennella, A; Valenziano, L; Zacchei, A; 10.1088/1748-0221/5/01/T01002

    2010-01-01

    The Low Frequency Instrument (LFI) is an array of cryogenically cooled radiometers on board the Planck satellite, designed to measure the temperature and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44 and 70 GHz. The thermal requirements of the LFI, and in particular the stringent limits to acceptable thermal fluctuations in the 20 K focal plane, are a critical element to achieve the instrument's scientific performance. Thermal tests were carried out as part of the on-ground calibration campaign at various stages of instrument integration. In this paper we describe the results and analysis of the tests on the LFI flight model (FM) performed at Thales Laboratories in Milan (Italy) during 2006, with the purpose of experimentally sampling the thermal transfer functions and consequently validating the numerical thermal model describing the dynamic response of the LFI focal plane. This model has been used extensively to assess the ability of LFI to achieve its scientific goals: its valid...

  8. Validation Results for Core-Scale Oil Shale Pyrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Staten, Josh; Tiwari, Pankaj

    2015-03-01

    This report summarizes a study of oil shale pyrolysis at various scales and the subsequent development of a model for in situ production of oil from oil shale. Oil shale from the Mahogany zone of the Green River formation was used in all experiments. Pyrolysis experiments were conducted at four scales: powdered samples (100 mesh) and core samples of 0.75”, 1” and 2.5” diameters. The batch, semi-batch and continuous-flow pyrolysis experiments were designed to study the effect of temperature (300°C to 500°C), heating rate (1°C/min to 10°C/min), pressure (ambient and 500 psig) and sample size on product formation. Comprehensive analyses were performed on reactants and products - liquid, gas and spent shale. These experimental studies were designed to understand the relevant coupled phenomena (reaction kinetics, heat transfer, mass transfer, thermodynamics) at multiple scales. A model for oil shale pyrolysis was developed in the COMSOL multiphysics platform. A general kinetic model was integrated with the important physical and chemical phenomena that occur during pyrolysis. The secondary reactions of coking and cracking in the product phase were addressed. The multiscale experimental data generated and the models developed provide an understanding of the simultaneous effects of chemical kinetics and heat and mass transfer on oil quality and yield. The comprehensive data collected in this study will help advance the move to large-scale in situ oil production from the pyrolysis of oil shale.
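
    The kinetic component of such a model is often anchored by first-order Arrhenius decomposition of kerogen under a heating ramp. A minimal sketch with generic literature-style parameters, not the report's fitted values:

```python
import numpy as np
from scipy.integrate import solve_ivp

# First-order Arrhenius decomposition of kerogen under a constant heating
# ramp (5 C/min from 300 C). A and Ea are generic, illustrative values.
A = 3.0e13         # pre-exponential factor, 1/s
Ea = 220.0e3       # activation energy, J/mol
R = 8.314          # gas constant, J/(mol K)
beta = 5.0 / 60.0  # heating rate, K/s
T0 = 573.15        # initial temperature, K (300 C)

def rhs(t, y):
    T = T0 + beta * t
    k = A * np.exp(-Ea / (R * T))
    return [-k * y[0]]  # y[0]: unreacted kerogen fraction

sol = solve_ivp(rhs, (0.0, 2400.0), [1.0], max_step=60.0)  # ramp to 500 C
print(f"conversion at 500 C: {1.0 - sol.y[0][-1]:.2f}")
```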

  9. Impact mechanics of ship collisions and validations with experimental results

    DEFF Research Database (Denmark)

    Zhang, Shengming; Villavicencio, R.; Zhu, L.;

    2017-01-01

    Closed-form analytical solutions for the energy released for deforming and crushing of structures and the impact impulse during ship collisions were developed and published in Marine Structures in 1998 [1]. The proposed mathematical models have been used by many engineers and researchers although th...

  10. A verification and validation process for model-driven engineering

    Science.gov (United States)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

  11. The Atmospheric Radionuclide Transport Model (ARTM) - Validation of a long-term atmospheric dispersion model

    Science.gov (United States)

    Hettrich, Sebastian; Wildermuth, Hans; Strobl, Christopher; Wenig, Mark

    2016-04-01

    In the last couple of years, the Atmospheric Radionuclide Transport Model (ARTM) has been developed by the German Federal Office for Radiation Protection (BfS) and the Society for Plant and Reactor Security (GRS). ARTM is an atmospheric dispersion model for continuous long-term releases of radionuclides into the atmosphere, based on the Lagrangian particle model. This model, developed in the first place as a more realistic replacement for the outdated Gaussian plume models, is currently being optimised for further scientific purposes to study atmospheric dispersion in short-range scenarios. It includes a diagnostic wind field model, allows for the application of building structures and multiple sources (including linear, 2- and 3-dimensional source geometries), and considers orography and surface roughness. As output it calculates the activity concentration and dry and wet deposition, and can also model the radioactive decay of Rn-222. As such, ARTM must undergo an intensive validation process. While for short-term and short-range models, which were mainly developed for examining nuclear accidents or explosions, a few measurement data sets are available for validation, data sets for validating long-term models are very sparse and the existing ones mostly prove inapplicable. Here we present a strategy for the validation of long-term Lagrangian particle models based on the work with ARTM. The first part of our validation study is a comprehensive analysis of the model's sensitivity to different parameters (e.g. simulation grid resolution, starting random number, number of simulation particles). This study provides a good estimate of the uncertainties of the simulation results and consequently can be used to generate model outputs comparable to the available measurement data at various distances from the emission source. This comparison between measurement data from selected scenarios and simulation results

  12. Experimental validation of a solar-chimney power plant model

    Science.gov (United States)

    Fathi, Nima; Wayne, Patrick; Trueba Monje, Ignacio; Vorobieff, Peter

    2016-11-01

    In a solar chimney power plant system (SCPPS), the energy of buoyant hot air is converted to electrical energy. SCPPS includes a collector at ground level covered with a transparent roof. Solar radiation heats the air inside and the ground underneath. There is a tall chimney at the center of the collector, and a turbine located at the base of the chimney. Lack of detailed experimental data for validation is one of the important issues in modeling this type of power plant. We present a small-scale experimental prototype developed to perform validation analysis for modeling and simulation of SCPPS. Detailed velocity measurements are acquired using particle image velocimetry (PIV) at a prescribed Reynolds number. Convection is driven by a temperature-controlled hot plate at the bottom of the prototype. Velocity field data are used to perform validation analysis and measure any mismatch between the experimental results and the CFD data. CFD code verification is also performed, to assess the uncertainty of the numerical model with respect to the grid and the applied mathematical model. The dimensionless output power of the prototype is calculated and compared with a recent analytical solution and the experimental results.

  13. Predicting the ungauged basin: model validation and realism assessment

    Science.gov (United States)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment

  14. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.

  15. Planck early results. XIV. ERCSC validation and extreme radio sources

    DEFF Research Database (Denmark)

    Lavonen, N.; León-Tavares, J.; Savolainen, P.

    2011-01-01

    Planck's all-sky surveys at 30-857 GHz provide an unprecedented opportunity to follow the radio spectra of a large sample of extragalactic sources to frequencies 2-20 times higher than allowed by past, large-area, ground-based surveys. We combine the results of the Planck Early Release Compact So...

  16. Validation technique using mean and variance of kriging model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ho Sung; Jung, Jae Jun; Lee, Tae Hee [Hanyang Univ., Seoul (Korea, Republic of)

    2007-07-01

    Rigorously validating the accuracy of a metamodel is an important research area in metamodeling techniques. A leave-k-out cross-validation technique not only requires considerable computational cost but also cannot quantitatively measure the fidelity of the metamodel. Recently, the average validation technique has been proposed. However, the average validation criterion may stop a sampling process prematurely even if the kriging model is still inaccurate. In this research, we propose a new validation technique using the average and the variance of the response during a sequential sampling method, such as maximum entropy sampling. The proposed validation technique is more efficient and accurate than cross-validation, because it integrates the kriging model explicitly to achieve an accurate average and variance, rather than relying on numerical integration. The proposed validation technique shows a trend similar to the root mean squared error, so that it can be used as a stop criterion for sequential sampling.
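
    A minimal sketch of the idea with scikit-learn's Gaussian process (kriging) regressor: track the domain-averaged predictive mean and variance as samples accumulate and stop once both stabilize. The test function, kernel, and domain are hypothetical, and a dense grid stands in for the analytical integration the authors describe:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):                       # hypothetical true response
    return np.sin(3 * x) + 0.5 * x

# Samples collected so far by a sequential design
X = np.array([[0.0], [0.4], [1.1], [1.9], [2.6]])
y = f(X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X, y)

# Domain-averaged predictive mean and variance over the design space [0, 3]
xs = np.linspace(0.0, 3.0, 1001).reshape(-1, 1)
mu, sd = gp.predict(xs, return_std=True)
print(mu.mean(), (sd ** 2).mean())  # stop sampling once both stabilize
```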

  17. Filament winding cylinders. II - Validation of the process model

    Science.gov (United States)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  18. Modeling for Optimal Control : A Validated Diesel-Electric Powertrain Model

    OpenAIRE

    Sivertsson, Martin; Eriksson, Lars

    2014-01-01

    An optimal control ready model of a diesel-electric powertrain is developed, validated and provided to the research community. The aim of the model is to facilitate studies of the transient control of diesel-electric powertrains and also to provide a model for developers of optimization tools. The resulting model is a four-state, three-control mean value engine model that captures the significant nonlinearity of the diesel engine, while still being continuously differentiable.

  19. Packed bed heat storage: Continuum mechanics model and validation

    Science.gov (United States)

    Knödler, Philipp; Dreißigacker, Volker; Zunft, Stefan

    2016-05-01

    Thermal energy storage (TES) systems are key elements for various types of new power plant concepts. As a possible cost-effective storage inventory option, packed beds of miscellaneous materials come into consideration. However, high technical risks arise due to thermal expansion and shrinking of the packed bed's particles during cyclic thermal operation, possibly leading to material failure. Therefore, suitable tools for designing the heat storage system are mandatory. While particle discrete models offer detailed simulation results, the computing time for large-scale applications is prohibitive. In contrast, continuous models offer time-efficient simulation results but require effective packed-bed parameters. This work focuses on providing insight into some basic methods and tools for obtaining such parameters and on how they are implemented into a continuum model. In this context, a particle discrete model as well as a test rig for carrying out uniaxial compression tests (UCT) is introduced. Experimental validation tests indicate good agreement with simulated UCT results. In this process, effective parameters required for a continuous packed bed model were identified and used for continuum simulation. This approach is validated by comparing the simulated results with experimental data from another test rig. The presented method significantly simplifies subsequent design studies.

  20. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))

    2009-06-15

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available, then a mixed hardening model should be used.

  1. Validation of an axial flow blood pump: computational fluid dynamics results using particle image velocimetry.

    Science.gov (United States)

    Su, Boyang; Chua, Leok Poh; Wang, Xikun

    2012-04-01

    A magnetically suspended axial flow blood pump is studied experimentally in this article. The pump casing encloses a three-blade straightener, a two-blade impeller shrouded by a permanent-magnet-embedded cylinder, and a three-blade diffuser. The internal flow fields were simulated earlier using computational fluid dynamics (CFD), and the pump characteristic curves were determined. The simulation results showed that the internal flow field was basically streamlined, except in the diffuser region. Particle image velocimetry (PIV) measurement of a 1:1 pump model was conducted to validate the CFD results. To ensure optical access, an acrylic prototype was fabricated, with the impeller driven by a servomotor instead, as the magnet is opaque. In addition to the transparent model, a blood analog fluid with a refractive index close to that of acrylic was used to avoid refraction. According to the CFD results, the axial flow blood pump could generate adequate pressure head at a rotating speed of 9500 rpm and a flow rate of 5 L/min, and the same flow condition was applied during the PIV measurement. The comparisons showed that the experimental results were close to those obtained by CFD, thus validating the CFD model, which can complement the limitations of the measurement in assessing the more detailed flow fields of the axial flow pump.

  2. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  3. Fleet manager's guide to testing vehicles for valid results

    Energy Technology Data Exchange (ETDEWEB)

    1981-02-01

    The managers of automotive fleets are vitally interested in saving money. Fleet procurement and operations costs are increasing rapidly. Fuel cost increases have been especially extreme. Conservation measures have included the purchase of more fuel-efficient vehicles, consolidation or reduction of unnecessary or redundant travel, upgraded and/or more frequent vehicle inspection, maintenance, repair, and the installation of fuel-saving components (or removal of fuel-consuming components). Virtually every significant cost saving measure has a cost associated with it, either a tangible financial cost, or an intangible (convenience) cost. In order to justify to his superiors that such measures should be taken, the fleet manager must be able to demonstrate clearly that the benefits derived from implementation of these measures will exceed the costs of doing so. In order to accomplish this, he must have unambiguous measures of both costs and benefits and methods of comparison which are easily usable and which yield unambiguous results. The analysis methods presented in this document are designed to accomplish this end.

  4. Validation of population-based disease simulation models: a review of concepts and methods

    Directory of Open Access Journals (Sweden)

    Sharif Behnam

    2010-11-01

    Background: Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods: We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results: Evidence of model credibility derives from examining: (1) the process of model development, (2) the performance of a model, and (3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion: As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility.

  5. Segregation analysis of cryptogenic epilepsy and an empirical test of the validity of the results

    Energy Technology Data Exchange (ETDEWEB)

    Ottman, R.; Hauser, W.A.; Barker-Cummings, C.; Lee, J.H. [Columbia Univ., New York, NY (United States)] [and others]

    1997-03-01

    We used POINTER to perform segregation analysis of cryptogenic epilepsy in 1,557 three-generation families (probands and their parents, siblings, and offspring) ascertained from voluntary organizations. Analysis of the full data set indicated that the data were most consistent with an autosomal dominant (AD) model with 61% penetrance of the susceptibility gene. However, subsequent analyses revealed that the patterns of familial aggregation differed markedly between siblings and offspring of the probands. Risks in siblings were consistent with an autosomal recessive (AR) model and inconsistent with an AD model, whereas risks in offspring were inconsistent with an AR model and more consistent with an AD model. As a further test of the validity of the AD model, we used sequential ascertainment to extend the family history information in the subset of families judged likely to carry the putative susceptibility gene because they contained at least three affected individuals. Prevalence of idiopathic/cryptogenic epilepsy was only 3.7% in newly identified relatives expected to have a 50% probability of carrying the susceptibility gene under an AD model. Approximately 30% (i.e., 50% x 61%) were expected to be affected under the AD model resulting from the segregation analysis. These results suggest that the familial distribution of cryptogenic epilepsy is inconsistent with any conventional genetic model. The differences between siblings and offspring in the patterns of familial risk are intriguing and should be investigated further. 28 refs., 6 tabs.
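
    The mismatch the authors report can be made explicit as a one-line calculation: a newly identified relative with a 50% chance of carrying the putative gene, under 61% penetrance, has an expected risk of roughly 30%, an order of magnitude above the 3.7% prevalence actually observed:

```latex
P(\text{affected}) = P(\text{carrier}) \times \text{penetrance}
                   = 0.50 \times 0.61 \approx 0.31 \gg 0.037 .
```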

  6. Calibration of Predictor Models Using Multiple Validation Experiments

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitute a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the model's ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value, but instead it casts the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain.
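
    For the IPM case, the optimization reduces to a linear program: choose center coefficients c and nonnegative spread coefficients d so that the interval [(c-d)'p(x), (c+d)'p(x)] contains every observation while the average spread is minimized. A sketch with SciPy, using a hypothetical quadratic basis on [0, 1] (chosen nonnegative so that d >= 0 keeps the interval ordered); this is an illustration of the idea, not the authors' exact formulation:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 40)
y = 1.0 + 2.0 * x + 0.3 * rng.normal(size=x.size)  # hypothetical data

P = np.column_stack([np.ones_like(x), x, x ** 2])  # nonnegative basis on [0, 1]
n = P.shape[1]

# Decision variables z = [c, d]; interval is [(c-d)'p(x), (c+d)'p(x)], d >= 0
cost = np.concatenate([np.zeros(n), 2 * P.mean(axis=0)])  # average spread
A_ub = np.vstack([np.hstack([-P, -P]),   # (c+d)'p_i >= y_i
                  np.hstack([ P, -P])])  # (c-d)'p_i <= y_i
b_ub = np.concatenate([-y, y])
bounds = [(None, None)] * n + [(0, None)] * n

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
c_hat, d_hat = res.x[:n], res.x[n:]
lower, upper = P @ (c_hat - d_hat), P @ (c_hat + d_hat)
print("average spread:", round(res.fun, 3))
print("all observations enclosed:",
      bool(np.all((y >= lower - 1e-9) & (y <= upper + 1e-9))))
```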

  7. Experimental validation of flexible robot arm modeling and control

    Science.gov (United States)

    Ulsoy, A. Galip

    1989-01-01

    Flexibility is important for high speed, high precision operation of lightweight manipulators. Accurate dynamic modeling of flexible robot arms is needed. Previous work has mostly been based on linear elasticity with prescribed rigid body motions (i.e., no effect of flexible motion on rigid body motion). Little or no experimental validation of dynamic models for flexible arms is available. Experimental results are also limited for flexible arm control. Researchers include the effects of prismatic as well as revolute joints. They investigate the effect of full coupling between the rigid and flexible motions, and of axial shortening, and consider the control of flexible arms using only additional sensors.

  8. Dynamic Modeling of Wind Turbine Gearboxes and Experimental Validation

    DEFF Research Database (Denmark)

    Pedersen, Rune

    is presented. The model takes into account the effects of load and applied grinding corrections. The results are verified by comparing to simulated and experimental results reported in the existing literature. Using gear data loosely based on a 1 MW wind turbine gearbox, the gear mesh stiffness is expanded … analysis in relation to gear dynamics. A multibody model of two complete 2.3 MW wind turbine gearboxes mounted back-to-back in a test rig is built. The mean values of the proposed gear mesh stiffnesses are included. The model is validated by comparing with calculated and measured eigenfrequencies and mode shapes. The measured eigenfrequencies have been identified in accelerometer signals obtained during run-up tests. Since the calculated eigenfrequencies do not match the measured eigenfrequencies with sufficient accuracy, a model updating technique is applied to ensure a better match by adjusting …

  9. Validation of a Hertzian contact model with nonlinear damping

    Science.gov (United States)

    Sierakowski, Adam

    2015-11-01

    Due to limited spatial resolution, most disperse particle simulation methods rely on simplified models for incorporating short-range particle interactions. In this presentation, we introduce a contact model that combines the Hertz elastic restoring force with a nonlinear damping force, requiring only material properties and no tunable parameters. We have implemented the model in a resolved-particle flow solver that implements the Physalis method, which accurately captures hydrodynamic interactions by analytically enforcing the no-slip condition on the particle surface. We summarize the results of a few numerical studies that suggest the validity of the contact model over a range of particle interaction intensities (i.e., collision Stokes numbers) when compared with experimental data. This work was supported by the National Science Foundation under Grant Number CBET1335965 and the Johns Hopkins University Modeling Complex Systems IGERT program.
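
    The abstract does not give the exact force law, so as an illustration here is a Hunt-Crossley-style variant in which a nonlinear damping term multiplies the Hertzian restoring force; the material constants below are hypothetical stand-ins, not values from the paper:

```python
import math

def hertz_hunt_crossley(delta, delta_dot, e_star, r_eff, alpha):
    """Normal contact force with a Hertzian restoring term and a
    Hunt-Crossley-style nonlinear damping term:
        F = k * delta**1.5 * (1 + alpha * delta_dot),  delta > 0,
    where k = (4/3) * E* * sqrt(R_eff) is the Hertz stiffness."""
    if delta <= 0.0:
        return 0.0  # no force when the particles are not in contact
    k = (4.0 / 3.0) * e_star * math.sqrt(r_eff)
    return k * delta ** 1.5 * (1.0 + alpha * delta_dot)

# Hypothetical glass-bead parameters (Pa, m, s/m)
print(hertz_hunt_crossley(delta=1e-6, delta_dot=0.05,
                          e_star=3.6e10, r_eff=1.5e-3, alpha=0.2))
```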

  10. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
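
    A sketch of the two recommended checks using scikit-learn, with L1-penalized (LASSO-type) logistic regression standing in for the NTCP model and synthetic data standing in for dose-volume features and xerostomia outcomes; the penalty grid, fold counts, and permutation count are illustrative:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (GridSearchCV, cross_val_score,
                                     permutation_test_score)

# Synthetic stand-in for dose-volume features and toxicity outcomes
X, y = make_classification(n_samples=120, n_features=25, n_informative=4,
                           random_state=0)

lasso = LogisticRegression(penalty="l1", solver="liblinear")
inner = GridSearchCV(lasso, {"C": np.logspace(-2, 2, 5)}, cv=5,
                     scoring="roc_auc")

# Double (nested) cross-validation: the inner loop picks the penalty,
# the outer loop estimates performance on held-out folds.
auc = cross_val_score(inner, X, y, cv=5, scoring="roc_auc")
print("nested-CV AUC: %.3f +/- %.3f" % (auc.mean(), auc.std()))

# Permutation test: significance of the score against shuffled outcomes
score, _, p_value = permutation_test_score(
    inner, X, y, cv=5, scoring="roc_auc", n_permutations=30, random_state=0)
print("AUC %.3f, permutation p-value %.3f" % (score, p_value))
```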

  11. Case study for model validation : assessing a model for thermal decomposition of polyurethane foam.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Leslie, Ian H. (New Mexico State University, Las Cruces, NM); Hobbs, Michael L.; Rutherford, Brian Milne; Hills, Richard Guy (New Mexico State University, Las Cruces, NM); Pilch, Martin M.

    2004-10-01

    A case study is reported to document the details of a validation process to assess the accuracy of a mathematical model to represent experiments involving thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process. The process addresses the following activities. The intended application of mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, computer code to solve the models and its (code) verification are presented. Experimental data from two activities are used to validate mathematical models. The first experiment assesses the chemistry model alone and the second experiment assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results of both experimental activities are summarized and uncertainty of the model to represent each experimental activity is estimated. The comparison between the experiment data and model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given. Weaknesses in the process are discussed and lessons learned are summarized.

  12. A validated approach for modeling collapse of steel structures

    Science.gov (United States)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during a lifetime of the structure, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state-of-the-art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality are employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are

  13. Assessing uncertainty in pollutant wash-off modelling via model validation.

    Science.gov (United States)

    Haddad, Khaled; Egodawatta, Prasanna; Rahman, Ataur; Goonetilleke, Ashantha

    2014-11-01

    Stormwater pollution is linked to stream ecosystem degradation. In predicting stormwater pollution, various types of modelling techniques are adopted. The accuracy of predictions provided by these models depends on the data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that available water quality datasets in urban areas span only relatively short time scales, unlike water quantity data, which limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that the application of MCCV is likely to result in a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective in model validation when dealing with a small sample size, which hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies.
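
    The two validation schemes differ only in how the splits are drawn, as this sketch illustrates; the linear wash-off surrogate model and the 20-event dataset are hypothetical:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, ShuffleSplit, cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(20, 2))  # small wash-off dataset stand-in
y = 3 * X[:, 0] + X[:, 1] + 0.2 * rng.normal(size=20)

model = LinearRegression()

# Leave-one-out: n fits, each leaving out a single event
loo_mse = -cross_val_score(model, X, y, cv=LeaveOneOut(),
                           scoring="neg_mean_squared_error")

# Monte Carlo cross-validation: repeated random train/test splits
mccv = ShuffleSplit(n_splits=500, test_size=0.3, random_state=0)
mccv_mse = -cross_val_score(model, X, y, cv=mccv,
                            scoring="neg_mean_squared_error")

print("LOO  MSE: %.4f" % loo_mse.mean())
print("MCCV MSE: %.4f +/- %.4f" % (mccv_mse.mean(), mccv_mse.std()))
```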

  14. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    Science.gov (United States)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.

  15. Prospects and problems for standardizing model validation in systems biology.

    Science.gov (United States)

    Gross, Fridolin; MacLeod, Miles

    2017-10-01

    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary collaboration, model exchange, and be especially relevant for applications close to medical practice. However, even though the production of predictively valid models is considered a central goal, in practice modeling in systems biology employs a variety of model structures and model-building practices. These serve a variety of purposes, many of which are heuristic and do not seem to require strict validation criteria and may even be restricted by them. Moreover, given the current situation in systems biology, implementing a validation standard would face serious technical obstacles mostly due to the quality of available empirical data. We advocate a cautious approach to standardization. However even though rigorous standardization seems premature at this point, raising the issue helps us develop better insights into the practices of systems biology and the technical problems modelers face validating models. Further it allows us to identify certain technical validation issues which hold regardless of modeling context and purpose. Informal guidelines could in fact play a role in the field by helping modelers handle these. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Validation of the Anhysteretic Magnetization Model for Soft Magnetic Materials with Perpendicular Anisotropy

    Directory of Open Access Journals (Sweden)

    Roman Szewczyk

    2014-07-01

    The paper presents results of validation of the anhysteretic magnetization model for a soft amorphous alloy with significant perpendicular anisotropy. The validation was carried out for the Jiles-Atherton model with the Ramesh extension considering anisotropy. Because it is difficult to measure anhysteretic magnetization directly, a soft magnetic core with negligible hysteresis was used. The results of the validation indicate that the Jiles-Atherton model with the Ramesh extension should be corrected to allow accurate modeling of the anhysteretic magnetization. The corrected model may be applied for modeling the cores of current transformers operating in a wide range of measured currents.
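
    For reference, the isotropic (no-anisotropy) anhysteretic magnetization in the Jiles-Atherton model is the Langevin function of the effective field, M_an = Ms*L((H + alpha*M_an)/a), solved here by fixed-point iteration; the Ramesh extension adds an anisotropy-energy term that is omitted, and the parameter values are hypothetical:

```python
import numpy as np

def langevin(x):
    """Langevin function L(x) = coth(x) - 1/x, with its small-x limit."""
    out = x / 3.0                        # series limit near zero
    big = np.abs(x) > 1e-6
    out[big] = 1.0 / np.tanh(x[big]) - 1.0 / x[big]
    return out

def anhysteretic(H, Ms, a, alpha, n_iter=200):
    """Isotropic Jiles-Atherton anhysteretic curve, M_an = Ms*L((H+alpha*M)/a),
    solved by fixed-point iteration (Ramesh anisotropy term omitted)."""
    M = np.zeros_like(H, dtype=float)
    for _ in range(n_iter):
        M = Ms * langevin((H + alpha * M) / a)
    return M

H = np.linspace(-200.0, 200.0, 9)  # A/m, hypothetical field sweep
print(anhysteretic(H, Ms=5.0e5, a=20.0, alpha=5.0e-6))
```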

  17. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow...

  18. Prospects and problems for standardizing model validation in systems biology

    NARCIS (Netherlands)

    Gross, Fridolin; MacLeod, Miles Alexander James

    2017-01-01

    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary coll

  19. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight into scalability and performance of future deployed networks. Because validated models of key Cisco equipment we

  20. Foundational Issues in Statistical Modeling: Statistical Model Specification and Validation

    Directory of Open Access Journals (Sweden)

    Aris Spanos

    2011-01-01

    Statistical model specification and validation raise crucial foundational problems whose pertinent resolution holds the key to learning from data by securing the reliability of frequentist inference. The paper questions the judiciousness of several current practices, including the theory-driven approach and the Akaike-type model selection procedures, arguing that they often lead to unreliable inferences. This is primarily due to the fact that goodness-of-fit/prediction measures and other substantive and pragmatic criteria are of questionable value when the estimated model is statistically misspecified. Foisting one's favorite model on the data often yields estimated models which are both statistically and substantively misspecified, but one has no way to delineate between the two sources of error and apportion blame. The paper argues that the error statistical approach can address this Duhemian ambiguity by distinguishing between statistical and substantive premises and viewing empirical modeling in a piecemeal way with a view to delineating the various issues more effectively. It is also argued that Hendry's general-to-specific procedure does a much better job in model selection than the theory-driven and Akaike-type procedures, primarily because of its error statistical underpinnings.

  1. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    are detected, among additional criteria. Self-intersection might lead to different results, e.g. intersection points, lines or areas. Depending on the geometric constellation, they might represent gaps between bounding polygons of the solids, overlaps, or violations of 2-manifoldness. Not least due to the floating-point problem in digital numbers, tolerances must be considered in some algorithms, e.g. planarity and solid self-intersection. Effects of different tolerance values and their handling are discussed; recommendations for suitable values are given. The goal of the paper is to give a clear understanding of geometric validation in the context of 3D city models. This should also enable the data holder to get a better comprehension of the validation results and their consequences for the deployment fields of the validated data set.

  2. Toward Validation of the Diagnostic-Prescriptive Model

    Science.gov (United States)

    Ysseldyke, James E.; Sabatino, David A.

    1973-01-01

    Criticized are recent research efforts to validate the diagnostic prescriptive model of remediating learning disabilities, and proposed is a 6-step psychoeducational model designed to ascertain links between behavioral differences and instructional outcomes. (DB)

  3. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to award credits to students who came with experience from working life....

  4. 42 CFR 476.84 - Changes as a result of DRG validation.

    Science.gov (United States)

    2010-10-01

    42 CFR Part 476 (Public Health), § 476.84 Changes as a result of DRG validation: A provider or practitioner may obtain a review by a QIO under part 473 of this chapter for changes in diagnostic and procedural coding that resulted in a change...

  5. Validating a spatially distributed hydrological model with soil morphology data

    Directory of Open Access Journals (Sweden)

    T. Doppler

    2013-10-01

    Spatially distributed hydrological models are popular tools in hydrology and they are claimed to be useful to support management decisions. Despite the high spatial resolution of the computed variables, calibration and validation are often carried out only on discharge time series at specific locations due to the lack of spatially distributed reference data. Because of this restriction, the predictive power of these models, with regard to predicted spatial patterns, can usually not be judged. An example of spatial predictions in hydrology is the prediction of saturated areas in agricultural catchments. These areas can be important source areas for the transport of agrochemicals to the stream. We set up a spatially distributed model to predict saturated areas in a 1.2 km² catchment in Switzerland with moderate topography. Around 40% of the catchment area is artificially drained. We measured weather data, discharge and groundwater levels in 11 piezometers for 1.5 years. To broaden the spatially distributed data sets that can be used for model calibration and validation, we translated soil morphological data available from soil maps into an estimate of the duration of soil saturation in the soil horizons, using redox-morphology signs for these estimates. This resulted in a data set with high spatial coverage against which the model predictions were validated. In general, these saturation estimates corresponded well to the measured groundwater levels. We worked with a model that would be applicable for management decisions because of its fast calculation speed and rather low data requirements. We simultaneously calibrated the model to the groundwater levels in the piezometers and discharge. The model was able to reproduce the general hydrological behavior of the catchment in terms of discharge and absolute groundwater levels. However, the accuracy of the groundwater level predictions was not high enough to be used for the prediction of saturated areas

  6. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    Science.gov (United States)

    2015-03-01

    Historical Methods: The three historical methods of validation are rationalism, empiricism, and positive economics. Rationalism requires that... Empiricism requires every assumption and outcome to be empirically validated. Positive economics requires only that the model's outcome(s) be correct... historical methods of rationalism, empiricism, and positive economics into a multistage process of validation. This validation method consists of (1

  7. Validation of the WATEQ4 geochemical model for uranium

    Energy Technology Data Exchange (ETDEWEB)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO₂(OH)₂·H₂O), UO₂(OH)₂, and rutherfordine (UO₂CO₃) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.

  8. Five-Factor Screener in the 2005 National Health Interview Survey Cancer Control Supplement: Validation Results

    Science.gov (United States)

    Risk Factor Assessment Branch staff have assessed indirectly the validity of parts of the Five-Factor Screener in two studies: NCI's Observing Protein and Energy (OPEN) Study and the Eating at America's Table Study (EATS). In both studies, multiple 24-hour recalls in conjunction with a measurement error model were used to assess validity.

  9. An Experimentally Validated SOA Model for High-Bit Rate System Applications

    Institute of Scientific and Technical Information of China (English)

    Hasan I. Saleheen

    2003-01-01

    A comprehensive model of the semiconductor optical amplifier (SOA), together with experimental validation results, is presented. The model accounts for the various physical behaviors of the device that must be captured for high bit-rate system applications.

  10. Validation and results of a questionnaire for functional bowel disease in out-patients

    Directory of Open Access Journals (Sweden)

    Skordilis Panagiotis

    2002-05-01

    Background: The aim was to evaluate and validate a bowel disease questionnaire in patients attending an out-patient gastroenterology clinic in Greece. Methods: This was a prospective study. Diagnosis was based on detailed clinical and laboratory evaluation. The questionnaire was tested on a pilot group of patients. An interviewer-administration technique was used. One hundred and forty consecutive patients attending the out-patient clinic for the first time and fifty randomly selected healthy controls participated in the study. Reliability (kappa statistics) and validity of the questionnaire were tested. We used logistic regression models and binary recursive partitioning to assess the ability to distinguish among irritable bowel syndrome (IBS), functional dyspepsia and organic disease patients. Results: Mean time for questionnaire completion was 18 min. In the test-retest procedure a good agreement was obtained (kappa statistic 0.82). There were 55 patients diagnosed as having IBS, 18 with functional dyspepsia (Rome I criteria), and 38 with organic disease. Location of pain was a significant distinguishing factor, patients with functional dyspepsia having no lower abdominal pain (p ... Conclusions: This questionnaire for functional bowel disease is a valid and reliable instrument that can distinguish satisfactorily between organic and functional disease in an out-patient setting.
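
    The test-retest agreement reported here is Cohen's kappa, which corrects raw agreement for chance; a minimal sketch with hypothetical binary answers (1 = symptom reported, 0 = not):

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for test-retest agreement between two ratings."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                          # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c)    # chance agreement
             for c in np.union1d(r1, r2))
    return (po - pe) / (1.0 - pe)

# Hypothetical test and retest answers for one questionnaire item
test   = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
retest = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]
print(round(cohens_kappa(test, retest), 2))   # 0.8 for this toy data
```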

  11. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions of strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10¹ to 10² image descriptors instead of the 10⁵ or 10⁶ pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
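
    A sketch of the general workflow, substituting a low-order 2D Fourier magnitude set for the Zernike-moment descriptors used in the paper; the Gaussian "strain" fields and the descriptor count are hypothetical:

```python
import numpy as np

def field_descriptors(field, k=10):
    """Reduce a full-field map to k*k low-order Fourier magnitude
    descriptors (roughly 10^2 numbers from roughly 10^5 pixels)."""
    F = np.fft.fft2(field)
    return np.abs(F[:k, :k]).ravel()  # lowest spatial frequencies only

rng = np.random.default_rng(0)
xx, yy = np.meshgrid(np.linspace(-1, 1, 256), np.linspace(-1, 1, 256))
strain_exp = np.exp(-4 * (xx**2 + yy**2)) + 0.01 * rng.normal(size=xx.shape)
strain_fem = np.exp(-4 * (xx**2 + yy**2))  # hypothetical model prediction

d_exp = field_descriptors(strain_exp)
d_fem = field_descriptors(strain_fem)
corr = np.corrcoef(d_exp, d_fem)[0, 1]     # statistical comparison
print(f"descriptor correlation: {corr:.4f}")
```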

  12. The Fruit & Vegetable Screener in the 2000 California Health Interview Survey: Validation Results

    Science.gov (United States)

    In this study, multiple 24-hour recalls in conjunction with a measurement error model were used to assess validity. The screeners used in the EATS included additional foods and reported portion sizes.

  13. Determination of validation threshold for coordinate measuring methods using a metrological compatibility model

    Science.gov (United States)

    Gromczak, Kamila; Gąska, Adam; Kowalski, Marek; Ostrowska, Ksenia; Sładek, Jerzy; Gruza, Maciej; Gąska, Piotr

    2017-01-01

    The following paper presents a practical approach to the validation process of coordinate measuring methods at an accredited laboratory, using a statistical model of metrological compatibility. The statistical analysis of measurement results obtained using a highly accurate system was intended to determine the permissible validation threshold values. The threshold value constitutes the primary criterion for the acceptance or rejection of the validated method, and depends on both the differences between measurement results with corresponding uncertainties and the individual correlation coefficient. The article specifies and explains the types of measuring methods that were subject to validation and defines the criterion value governing their acceptance or rejection in the validation process.
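
    The criterion described (differences between results, their uncertainties, and an individual correlation coefficient) matches the usual metrological-compatibility check, sketched below; the measured values, uncertainties, correlation, and the coverage threshold of 2 are hypothetical, not the laboratory's actual numbers:

```python
import math

def compatibility_ratio(y1, u1, y2, u2, r=0.0):
    """Metrological compatibility check: difference of two results divided
    by the standard uncertainty of that difference, accounting for the
    correlation r between the two measurements."""
    u_diff = math.sqrt(u1**2 + u2**2 - 2.0 * r * u1 * u2)
    return abs(y1 - y2) / u_diff

# Hypothetical: candidate coordinate measuring method vs. reference system
ratio = compatibility_ratio(y1=10.0034, u1=0.0020,  # validated method (mm)
                            y2=10.0021, u2=0.0008,  # reference system (mm)
                            r=0.3)
print("compatible" if ratio <= 2.0 else "rejected", round(ratio, 2))
```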

  14. A Review of Models and Procedures for Synthetic Validation for Entry-Level Army Jobs

    Science.gov (United States)

    1988-12-01

    ARI Research Note 88-107: A Review of Models and Procedures for Synthetic Validation for Entry-Level Army Jobs. Crafts, Jennifer L.; Szenas, Philip L.; Chia, Wel... ...well as ability. Project A Validity Results: Campbell (1986) and McHenry, Hough, Toquam, Hanson, and Ashworth (1987) have conducted extensive

  15. Validation and Scenario Analysis of a Soil Organic Carbon Model

    Institute of Scientific and Technical Information of China (English)

    HUANG Yao; LIU Shi-liang; SHEN Qi-rong; ZONG Liang-gang; JIANG Ding-an; HUANG Hong-guang

    2002-01-01

    A model developed by the authors was validated against independent data sets. The data sets were obtained from field experiments on crop residue decomposition and a 7-year soil improvement experiment in Yixing City, Jiangsu Province. Model validation indicated that soil organic carbon dynamics can be simulated from the weather variables of temperature, sunlight and precipitation, soil clay content and bulk density, grain yield of previous crops, and the qualities and quantities of the added organic matter. Model simulations in general agreed with the measurements. The comparison between computed and measured values resulted in correlation coefficient r² values of 0.9291*** (n = 48) and 0.6431** (n = 65) for the two experiments, respectively. Model prediction under three scenarios (no additional organic matter input, or annual incorporation of rice and wheat straw at rates of 6.75 t/ha and 9.0 t/ha) suggested that the soil organic carbon in Wanshi Township of Yixing City would change from an initial value of 7.85 g/kg in 1983 to 6.30 g/kg, 11.42 g/kg and 13 g/kg in 2014, respectively. Consequently, the total nitrogen content of the soil was predicted to be 0.49 g/kg, 0.89 g/kg and 1.01 g/kg, respectively, under the three scenarios.

  16. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    Science.gov (United States)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-08-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of the current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
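
    For orientation, the ideal thermoelectric-generator efficiency behind such analyses is the Carnot factor scaled by a material term in the dimensionless figure of merit ZT; the operating temperatures and ZT below are hypothetical, not values from Kraemer's study or this one:

```python
import math

def teg_efficiency(t_h, t_c, zt):
    """Ideal thermoelectric generator efficiency: the Carnot factor times
    a material factor set by the dimensionless figure of merit ZT."""
    carnot = (t_h - t_c) / t_h
    m = math.sqrt(1.0 + zt)
    return carnot * (m - 1.0) / (m + t_c / t_h)

# Hypothetical operating point for a flat-panel STEG absorber (K)
print(f"{teg_efficiency(t_h=473.0, t_c=293.0, zt=1.0):.3f}")  # ~0.078
```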

  17. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  18. Using virtual reality to validate system models

    Energy Technology Data Exchange (ETDEWEB)

    Winter, V.L.; Caudell, T.P.

    1999-12-09

    To date most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks), or informal in the case of code inspections. The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

  19. Extending Model Checking to Object Process Validation

    NARCIS (Netherlands)

    Rein, van H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent models

  20. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their credi

  2. Cross-validation criteria for SETAR model selection

    NARCIS (Netherlands)

    de Gooijer, J.G.

    2001-01-01

    Three cross-validation criteria, denoted C, C_c, and C_u, are proposed for selecting the orders of a self-exciting threshold autoregressive (SETAR) model when both the delay and the threshold value are unknown. The derivation of C is within a natural cross-validation framework. The criterion C_c is si

  3. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatil- ity in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient condi- tions which...... ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano...... in these models remains to be established....

  4. Model transparency and validation: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force--7.

    Science.gov (United States)

    Eddy, David M; Hollingworth, William; Caro, J Jaime; Tsevat, Joel; McDonald, Kathryn M; Wong, John B

    2012-01-01

    Trust and confidence are critical to the success of health care models. There are two main methods for achieving this: transparency (people can see how the model is built) and validation (how well the model reproduces reality). This report describes recommendations for achieving transparency and validation developed by a task force appointed by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making. Recommendations were developed iteratively by the authors. A nontechnical description (including model type, intended applications, funding sources, structure, intended uses, inputs, outputs, other components that determine function, and their relationships, data sources, validation methods, results, and limitations) should be made available to anyone. Technical documentation, written in sufficient detail to enable a reader with necessary expertise to evaluate the model and potentially reproduce it, should be made available openly or under agreements that protect intellectual property, at the discretion of the modelers. Validation involves face validity (wherein experts evaluate model structure, data sources, assumptions, and results), verification or internal validity (check accuracy of coding), cross validity (comparison of results with other models analyzing the same problem), external validity (comparing model results with real-world results), and predictive validity (comparing model results with prospectively observed events). The last two are the strongest forms of validation. Each section of this article contains a number of recommendations that were iterated among the authors, as well as among the wider modeling task force, jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making.

  5. Amendment to Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of WP2 is to establish flow models relating the wind speed at turbines in a farm. Until now, active control of power reference has not been included in these models as only data with standard operation has been available. In this report the first data series with power reference...... excitations from the Thanet farm are used for trying to update some of the models discussed in D2.5. Because of the very limited amount of data, only simple dynamic transfer function models can be obtained. The three obtained data series are somewhat different. Only the first data set seems to have the front...... turbine in undisturbed flow. For this data set both the multiplicative model and in particular the simple first order transfer function model can predict the downwind wind speed from the upwind wind speed and loading....
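
    As a rough sketch of what such a simple dynamic transfer function model looks like, the snippet below fits a discrete first-order lag, y[t] = a*y[t-1] + b*u[t-delay], relating an upwind speed series u to a downwind series y. The data here are synthetic stand-ins (the Thanet series are not available in this record), and the fixed five-sample delay is an assumption.

```python
import numpy as np

def fit_first_order(u, y, delay):
    """OLS fit of y[t] = a*y[t-1] + b*u[t-delay], a discrete first-order lag."""
    t = np.arange(max(1, delay), len(y))
    X = np.column_stack([y[t - 1], u[t - delay]])
    theta, *_ = np.linalg.lstsq(X, y[t], rcond=None)
    return theta  # (a, b)

def simulate(u, theta, delay, y0):
    """Free-run simulation of the fitted lag model driven by u alone."""
    a, b = theta
    y = np.empty(len(u))
    y[:max(1, delay)] = y0
    for t in range(max(1, delay), len(u)):
        y[t] = a * y[t - 1] + b * u[t - delay]
    return y

rng = np.random.default_rng(1)
n = 2000
# Synthetic upwind speed: slow fluctuations around 8 m/s.
u = np.clip(8 + np.cumsum(rng.normal(scale=0.05, size=n)), 3, 15)
# Synthetic "measured" downwind speed: lagged, damped response plus noise.
y = np.empty(n); y[:5] = u[0]
for t in range(5, n):
    y[t] = 0.9 * y[t - 1] + 0.08 * u[t - 5] + rng.normal(scale=0.05)

theta = fit_first_order(u, y, delay=5)
yhat = simulate(u, theta, delay=5, y0=y[0])
rmse = np.sqrt(np.mean((y[10:] - yhat[10:]) ** 2))
print(f"a={theta[0]:.3f}, b={theta[1]:.3f}, free-run RMSE={rmse:.3f} m/s")
```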

  6. Gear Windage Modeling Progress - Experimental Validation Status

    Science.gov (United States)

    Kunz, Rob; Handschuh, Robert F.

    2008-01-01

    In the Subsonics Rotary Wing (SRW) Project being funded for propulsion work at NASA Glenn Research Center, performance of the propulsion system is of high importance. In current rotorcraft drive systems many gearing components operate at high rotational speed (pitch line velocity > 24000 ft/min). In our testing of high speed helical gear trains at NASA Glenn we have found that the work done on the air-oil mist within the gearbox can become a significant part of the power loss of the system. This loss mechanism is referred to as windage. The effort described in this presentation is to understand the variables that affect windage and to develop a good experimental data base to validate the analytical project being conducted at Penn State University by Dr. Rob Kunz under a NASA SRW NRA. The presentation provides an update on the status of these efforts.

  7. Dynamic validation of the Planck-LFI thermal model

    Energy Technology Data Exchange (ETDEWEB)

    Tomasi, M; Bersanelli, M; Mennella, A [Universita degli Studi di Milano, Via Celoria 16, 20133 Milano (Italy); Cappellini, B [INAF IASF Milano, Via Bassini, 15, 20133, Milano (Italy); Gregorio, A [University of Trieste, Department of Physics, via Valerio 2, 34127 Trieste (Italy); Colombo, F; Lapolla, M [Thales Alenia Space Italia S.p.A., IUEL - Scientific Instruments, S.S. Padana Superiore 290, 20090 Vimodrone (Mi) (Italy); Terenzi, L; Morgante, G; Butler, R C; Mandolesi, N; Valenziano, L [INAF IASF Bologna, via Gobetti 101, 40129 Bologna (Italy); Galeotta, S; Maris, M; Zacchei, A [LFI-DPC INAF-OATs, via Tiepolo 11, 34131 Trieste (Italy)

    2010-01-15

    The Low Frequency Instrument (LFI) is an array of cryogenically cooled radiometers on board the Planck satellite, designed to measure the temperature and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44 and 70 GHz. The thermal requirements of the LFI, and in particular the stringent limits to acceptable thermal fluctuations in the 20 K focal plane, are a critical element to achieve the instrument scientific performance. Thermal tests were carried out as part of the on-ground calibration campaign at various stages of instrument integration. In this paper we describe the results and analysis of the tests on the LFI flight model (FM) performed at Thales Laboratories in Milan (Italy) during 2006, with the purpose of experimentally sampling the thermal transfer functions and consequently validating the numerical thermal model describing the dynamic response of the LFI focal plane. This model has been used extensively to assess the ability of LFI to achieve its scientific goals: its validation is therefore extremely important in the context of the Planck mission. Our analysis shows that the measured thermal properties of the instrument exhibit a thermal damping level better than predicted, therefore further reducing the expected systematic effect induced in the LFI maps. We then propose an explanation of the increased damping in terms of non-ideal thermal contacts.

  8. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  9. Model transparency and validation: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force-7.

    Science.gov (United States)

    Eddy, David M; Hollingworth, William; Caro, J Jaime; Tsevat, Joel; McDonald, Kathryn M; Wong, John B

    2012-01-01

    Trust and confidence are critical to the success of health care models. There are two main methods for achieving this: transparency (people can see how the model is built) and validation (how well it reproduces reality). This report describes recommendations for achieving transparency and validation, developed by a task force appointed by the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) and the Society for Medical Decision Making (SMDM). Recommendations were developed iteratively by the authors. A nontechnical description should be made available to anyone, including model type and intended applications; funding sources; structure; inputs, outputs, other components that determine function, and their relationships; data sources; validation methods and results; and limitations. Technical documentation, written in sufficient detail to enable a reader with necessary expertise to evaluate the model and potentially reproduce it, should be made available openly or under agreements that protect intellectual property, at the discretion of the modelers. Validation involves face validity (wherein experts evaluate model structure, data sources, assumptions, and results), verification or internal validity (check accuracy of coding), cross validity (comparison of results with other models analyzing the same problem), external validity (comparing model results to real-world results), and predictive validity (comparing model results with prospectively observed events). The last two are the strongest forms of validation. Each section of this paper contains a number of recommendations that were iterated among the authors, as well as the wider modeling task force jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making.

  10. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  11. EMMD-Prony approach for dynamic validation of simulation models

    Institute of Scientific and Technical Information of China (English)

    Ruiyang Bai

    2015-01-01

    Model validation and updating is critical to model credibility growth. In order to assess model credibility quantitatively and locate model error precisely, a new dynamic validation method based on extremum field mean mode decomposition (EMMD) and the Prony method is proposed in this paper. Firstly, complex dynamic responses from models and real systems are processed into stationary components by EMMD. These components always have definite physical meanings which can serve as evidence for rough model error location. Secondly, the Prony method is applied to identify the features of each EMMD component. Amplitude similarity, frequency similarity, damping similarity and phase similarity are defined to describe the similarity of dynamic responses. Then quantitative validation metrics are obtained based on the improved entropy weight and energy proportion. Precise model error location is realized based on the physical meanings of these features. The application of this method in aircraft controller design provides evidence of its feasibility and usability.

  12. Uncertainty Quantification and Validation for RANS Turbulence Models

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models (Baldwin-Lomax, Spalart-Allmaras, k-ε, k-ω, and v2-f) and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  13. Validation of a national hydrological model

    Science.gov (United States)

    McMillan, H. K.; Booker, D. J.; Cattoën, C.

    2016-10-01

    Nationwide predictions of flow time-series are valuable for development of policies relating to environmental flows, calculating reliability of supply to water users, or assessing risk of floods or droughts. This breadth of model utility is possible because various hydrological signatures can be derived from simulated flow time-series. However, producing national hydrological simulations can be challenging due to strong environmental diversity across catchments and a lack of data available to aid model parameterisation. A comprehensive and consistent suite of test procedures to quantify spatial and temporal patterns in performance across various parts of the hydrograph is described and applied to quantify the performance of an uncalibrated national rainfall-runoff model of New Zealand. Flow time-series observed at 485 gauging stations were used to calculate Nash-Sutcliffe efficiency and percent bias when simulating between-site differences in daily series, between-year differences in annual series, and between-site differences in hydrological signatures. The procedures were used to assess the benefit of applying a correction to the modelled flow duration curve based on an independent statistical analysis. They were used to aid understanding of climatological, hydrological and model-based causes of differences in predictive performance by assessing multiple hypotheses that describe where and when the model was expected to perform best. As the procedures produce quantitative measures of performance, they provide an objective basis for model assessment that could be applied when comparing observed daily flow series with competing simulated flow series from any region-wide or nationwide hydrological model. Model performance varied in space and time with better scores in larger and medium-wet catchments, and in catchments with smaller seasonal variations. Surprisingly, model performance was not sensitive to aquifer fraction or rain gauge density.
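
    The two headline scores used in this suite, Nash-Sutcliffe efficiency and percent bias, are straightforward to compute once observed and simulated series are paired by station. A minimal sketch follows, with made-up flow values and one common sign convention for percent bias.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 is no better than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    """Percent bias; positive values here indicate model overestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(sim - obs) / np.sum(obs)

# Hypothetical daily flows at one gauging station (m^3/s).
obs = [12.1, 15.3, 9.8, 22.4, 30.1, 18.7, 14.2]
sim = [11.5, 16.0, 10.4, 20.9, 27.8, 19.5, 15.0]
print(f"NSE   = {nash_sutcliffe(obs, sim):.3f}")
print(f"PBIAS = {percent_bias(obs, sim):+.1f} %")
```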

  14. LANL*V2.0: global modeling and validation

    Directory of Open Access Journals (Sweden)

    S. Zaharia

    2011-08-01

    We describe in this paper the new version of LANL*, an artificial neural network (ANN) for calculating the magnetic drift invariant L*. This quantity is used for modeling radiation belt dynamics and for space weather applications. We have implemented the following enhancements in the new version: (1) we have removed the limitation to geosynchronous orbit, and the model can now be used for a much larger region; (2) the new version is based on the improved magnetic field model by Tsyganenko and Sitnov (2005) (TS05) instead of the older model by Tsyganenko et al. (2003). We have validated the model and compared our results to L* calculations with the TS05 model based on ephemerides for CRRES, Polar, GPS, a LANL geosynchronous satellite, and a virtual RBSP type orbit. We find that the neural network performs very well for all these orbits, with a typically small error ΔL*. The LANL* V2.0 artificial neural network is orders of magnitude faster than traditional numerical field line integration techniques with the TS05 model. It has applications to real-time radiation belt forecasting, analysis of data sets involving decades of satellite observations, and other problems in space weather.
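
    The general recipe, training a small feed-forward network as a fast surrogate for an expensive field-line integration, can be illustrated with scikit-learn. The target function below is a toy stand-in for L* (a real training set would come from integrations with a field model such as TS05), and the input layout is an assumption.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
# Toy training set: inputs are a 3-D position plus one activity index; the
# target is a smooth stand-in for the drift invariant L*.
X = rng.uniform([-10.0, -10.0, -5.0, 0.0], [10.0, 10.0, 5.0, 10.0], size=(20000, 4))
r = np.linalg.norm(X[:, :3], axis=1)
y = r * (1.0 + 0.02 * X[:, 3])                       # toy "L*" target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=0)
ann.fit(X_tr, y_tr)

err = np.abs(ann.predict(X_te) - y_te)               # held-out surrogate error
print(f"median |error|: {np.median(err):.3f}")
```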

  15. Radiative transfer model for contaminated slabs: experimental validations

    Science.gov (United States)

    Andrieu, F.; Schmidt, F.; Schmitt, B.; Douté, S.; Brissaud, O.

    2015-09-01

    This article presents a set of spectro-goniometric measurements of different water ice samples and the comparison with an approximated radiative transfer model. The experiments were done using the spectro-radiogoniometer described in Brissaud et al. (2004). The radiative transfer model assumes an isotropization of the flux after the second interface and is fully described in Andrieu et al. (2015). Two kinds of experiments were conducted. First, the specular spot was closely investigated, at high angular resolution, at the wavelength of 1.5 μm, where ice behaves as a very absorbing medium. Second, the bidirectional reflectance was sampled at various geometries, including low phase angles, on 61 wavelengths ranging from 0.8 to 2.0 μm. In order to validate the model, we made qualitative tests to demonstrate the relative isotropization of the flux. We also conducted quantitative assessments by using a Bayesian inversion method in order to estimate the parameters (e.g., sample thickness, surface roughness) from the radiative measurements only. A simple comparison between the retrieved parameters and the direct independent measurements allowed us to validate the model. We developed an innovative Bayesian inversion approach to quantitatively estimate the uncertainties in the parameters, avoiding the usual slow Monte Carlo approach. First we built lookup tables, and then we searched the best fits and calculated a posteriori probability density functions. The results show that the model is able to reproduce the geometrical energy distribution in the specular spot, as well as the spectral behavior of water ice slabs. In addition, the different parameters of the model are compatible with independent measurements.

  16. A validation study of a stochastic model of human interaction

    Science.gov (United States)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree

  17. Toward a validation process for model based safety analysis

    OpenAIRE

    Adeline, Romain; Cardoso, Janette; Darfeuil, Pierre; Humbert, Sophie; Seguin, Christel

    2010-01-01

    Today, Model Based processes become more and more widespread to achieve the analysis of a system. However, there is no formal testing approach to ensure that the formal model is compliant with the real system. In this paper, we choose to study AltaRica models. We present a general process to properly construct and validate an AltaRica formal model. The focus is on this validation phase, i.e. verifying the compliance between the model and the real system. For this, the proposed process recommends...

  18. A Report on the Validation of Beryllium Strength Models

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Derek Elswick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-05

    This report discusses work on validating beryllium strength models with flyer plate and Taylor rod experimental data. Strength models are calibrated with Hopkinson bar and quasi-static data. The Hopkinson bar data for beryllium provides strain rates up to about 4000 per second. A limitation of the Hopkinson bar data for beryllium is that it only provides information on strain up to about 0.15. The lack of high strain data at high strain rates makes it difficult to distinguish between various strength model settings. The PTW model has been calibrated many different times over the last 12 years. The lack of high strain data for high strain rates has resulted in these calibrated PTW models for beryllium exhibiting significantly different behavior when extrapolated to high strain. For beryllium, the α parameter of PTW has recently been calibrated to high precision shear modulus data. In the past the α value for beryllium was set based on expert judgment. The new α value for beryllium was used in a calibration of the beryllium PTW model by Sky Sjue. The calibration by Sjue used EOS table information to model the temperature dependence of the heat capacity. Also, the calibration by Sjue used EOS table information to model the density changes of the beryllium sample during the Hopkinson bar and quasi-static experiments. In this paper, the calibrated PTW model by Sjue is compared against experimental data and other strength models. The other strength models being considered are a PTW model calibrated by Shuh- Rong Chen and a Steinberg-Guinan type model by John Pedicini. The three strength models are used in a comparison against flyer plate and Taylor rod data. The results show that the Chen PTW model provides better agreement to this data. The Chen PTW model settings have been previously adjusted to provide a better fit to flyer plate data, whereas the Sjue PTW model has not been changed based on flyer plate data. However, the Sjue model provides a reasonable fit to

  19. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    [Only figure-caption fragments survive from this record's abstract: Figure 9, Tools and instrumentation, bracket attached to rail; Figure 10, Tools and instrumentation, direction vernier; Figure 11, Plan A lock approach, upstream approach; numerical model.]

  20. Parental modelling of eating behaviours: observational validation of the Parental Modelling of Eating Behaviours scale (PARM).

    Science.gov (United States)

    Palfreyman, Zoe; Haycraft, Emma; Meyer, Caroline

    2015-03-01

    Parents are important role models for their children's eating behaviours. This study aimed to further validate the recently developed Parental Modelling of Eating Behaviours Scale (PARM) by examining the relationships between maternal self-reports on the PARM with the modelling practices exhibited by these mothers during three family mealtime observations. Relationships between observed maternal modelling and maternal reports of children's eating behaviours were also explored. Seventeen mothers with children aged between 2 and 6 years were video recorded at home on three separate occasions whilst eating a meal with their child. Mothers also completed the PARM, the Children's Eating Behaviour Questionnaire and provided demographic information about themselves and their child. Findings provided validation for all three PARM subscales, which were positively associated with their observed counterparts on the observational coding scheme (PARM-O). The results also indicate that habituation to observations did not change the feeding behaviours displayed by mothers. In addition, observed maternal modelling was significantly related to children's food responsiveness (i.e., their interest in and desire for foods), enjoyment of food, and food fussiness. This study makes three important contributions to the literature. It provides construct validation for the PARM measure and provides further observational support for maternal modelling being related to lower levels of food fussiness and higher levels of food enjoyment in their children. These findings also suggest that maternal feeding behaviours remain consistent across repeated observations of family mealtimes, providing validation for previous research which has used single observations.
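
    Validation of the kind reported here comes down to correlating each self-report subscale with its observed counterpart across participants. A minimal sketch with illustrative, made-up scores for one subscale:

```python
from scipy.stats import spearmanr

# Hypothetical paired scores for one PARM subscale: maternal self-report vs
# the observational coding (PARM-O) across ten mothers (values illustrative).
self_report = [3.2, 4.1, 2.8, 3.9, 4.5, 2.5, 3.0, 4.8, 3.6, 2.9]
observed    = [3.0, 4.3, 2.6, 3.5, 4.6, 2.8, 3.2, 4.5, 3.8, 2.7]

rho, p = spearmanr(self_report, observed)   # rank correlation is robust to scale
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```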

  1. Bayesian Calibration, Validation and Uncertainty Quantification for Predictive Modelling of Tumour Growth: A Tutorial.

    Science.gov (United States)

    Collis, Joe; Connor, Anthony J; Paczkowski, Marcin; Kannan, Pavitra; Pitt-Francis, Joe; Byrne, Helen M; Hubbard, Matthew E

    2017-03-13

    In this work, we present a pedagogical tumour growth example, in which we apply calibration and validation techniques to an uncertain, Gompertzian model of tumour spheroid growth. The key contribution of this article is the discussion and application of these methods (that are not commonly employed in the field of cancer modelling) in the context of a simple model, whose deterministic analogue is widely known within the community. In the course of the example, we calibrate the model against experimental data that are subject to measurement errors, and then validate the resulting uncertain model predictions. We then analyse the sensitivity of the model predictions to the underlying measurement model. Finally, we propose an elementary learning approach for tuning a threshold parameter in the validation procedure in order to maximize predictive accuracy of our validated model.
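
    The calibration step of such a tutorial can be reproduced in miniature: fit the two Gompertz rate parameters to noisy growth data with a random-walk Metropolis sampler under flat positivity priors. The sketch below uses synthetic data and an assumed known noise scale; it illustrates the approach, not the authors' code.

```python
import numpy as np

def gompertz(t, v0, a, b):
    """Gompertz growth: V(t) = v0 * exp((a/b) * (1 - exp(-b*t)))."""
    return v0 * np.exp((a / b) * (1.0 - np.exp(-b * t)))

def log_post(theta, t, y, sigma):
    """Gaussian log-likelihood with flat priors restricted to a, b > 0."""
    a, b = theta
    if a <= 0 or b <= 0:
        return -np.inf
    resid = y - gompertz(t, 1.0, a, b)
    return -0.5 * np.sum((resid / sigma) ** 2)

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 15)
y = gompertz(t, 1.0, 0.8, 0.3) * (1 + rng.normal(scale=0.05, size=t.size))
sigma = 0.05 * y                      # measurement-error scale (assumed known)

# Random-walk Metropolis over (a, b).
theta, lp = np.array([1.0, 0.5]), -np.inf
chain = []
for _ in range(20000):
    prop = theta + rng.normal(scale=[0.05, 0.02])
    lp_prop = log_post(prop, t, y, sigma)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop     # accept the proposal
    chain.append(theta)
chain = np.array(chain[5000:])        # discard burn-in
print("posterior mean (a, b):", chain.mean(axis=0).round(3))
```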

  2. Checklist for the qualitative evaluation of clinical studies with particular focus on external validity and model validity

    Directory of Open Access Journals (Sweden)

    Vollmar Horst C

    2006-12-01

    Background: It is often stated that external validity is not sufficiently considered in the assessment of clinical studies. Although tools for its evaluation have been established, there is a lack of awareness of their significance and application. In this article, a comprehensive checklist is presented addressing these relevant criteria. Methods: The checklist was developed by listing the most commonly used assessment criteria for clinical studies. Additionally, specific lists for individual applications were included. The categories of biases of internal validity (selection, performance, attrition and detection bias) correspond to structural, treatment-related and observational differences between the test and control groups. Analogously, we have extended these categories to address external validity and model validity, regarding similarity between the study population/conditions and the general population/conditions related to structure, treatment and observation. Results: A checklist is presented, in which the evaluation criteria concerning external validity and model validity are systemised and transformed into a questionnaire format. Conclusion: The checklist presented in this article can be applied to both the planning and evaluation of clinical studies. We encourage the prospective user to modify the checklists according to the respective application and research question. The higher expenditure needed for the evaluation of clinical studies in systematic reviews is justified, particularly in the light of the influential nature of their conclusions on therapeutic decisions and the creation of clinical guidelines.

  3. A computational fluid dynamics model for wind simulation: model implementation and experimental validation

    Institute of Scientific and Technical Information of China (English)

    Zhuo-dong ZHANG; Ralf WIELAND; Matthias REICHE; Roger FUNK; Carsten HOFFMANN; Yong LI; Michael SOMMER

    2012-01-01

    To provide physically based wind modelling for wind erosion research at regional scale, a 3D computational fluid dynamics (CFD) wind model was developed. The model was programmed in C language based on the Navier-Stokes equations, and it is freely available as open source. Integrated with the spatial analysis and modelling tool (SAMT), the wind model has convenient input preparation and powerful output visualization. To validate the wind model, a series of experiments was conducted in a wind tunnel. A blocking inflow experiment was designed to test the performance of the model on simulation of basic fluid processes. A round obstacle experiment was designed to check if the model could simulate the influences of the obstacle on the wind field. Results show that measured and simulated wind fields have high correlations, and the wind model can simulate both the basic processes of the wind and the influences of the obstacle on the wind field. These results show the high reliability of the wind model. A digital elevation model (DEM) of an area (3800 m long and 1700 m wide) in the Xilingele grassland in Inner Mongolia (autonomous region, China) was applied to the model, and a 3D wind field has been successfully generated. The clear implementation of the model and the adequate validation by wind tunnel experiments laid a solid foundation for the prediction and assessment of wind erosion at regional scale.

  4. Validation of full cavitation model in cryogenic fluids

    Institute of Scientific and Technical Information of China (English)

    CAO XiaoLi; ZHANG XiaoBin; QIU LiMin; GAN ZhiHua

    2009-01-01

    Numerical simulation of cavitation in cryogenic fluids is important in improving the stable operation of the propulsion system in liquid-fuel rockets. It also represents a broader class of problems where the fluid is operating close to its critical point and the thermal effects of cavitation are pronounced. The present article focuses on simulating cryogenic cavitation by implementing the "full cavitation model", coupled with the energy equation, in conjunction with iterative updates of the real fluid properties at local temperatures. Steady state computations are then conducted on a hydrofoil and an ogive in liquid nitrogen and hydrogen respectively, based on which we explore the mechanism of cavitation with thermal effects. Comprehensive comparisons between the simulation results and experimental data as well as previous computations by other researchers validate the full cavitation model in cryogenic fluids. The sensitivity of cavity length to cavitation number is also examined.

  5. Modelling and validation of multiple reflections for enhanced laser welding

    Science.gov (United States)

    Milewski, J.; Sklar, E.

    1996-05-01

    The effects of multiple internal reflections within a laser weld joint as functions of joint geometry and processing conditions have been characterized. A computer-based ray tracing model is used to predict the reflective propagation of laser beam energy focused into the narrow gap of a metal joint for the purpose of predicting the location of melting and coalescence to form a weld. Quantitative comparisons are made between simulation cases. Experimental results are provided for qualitative model validation. This method is proposed as a way to enhance process efficiency and design laser welds which display deep penetration and high depth-to-width aspect ratios without high powered systems or keyhole mode melting.
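
    A bare-bones version of such a ray-tracing calculation can be written directly: propagate a ray inside a V-groove, reflect it specularly at each wall hit, and deposit a fixed fraction of the remaining power per bounce. The groove geometry, the constant absorptivity, and the single-ray treatment below are all simplifying assumptions, not the authors' model.

```python
import numpy as np

def intersect(p, d, a, b, eps=1e-9):
    """Ray p + t*d against wall segment a->b; returns (t, point) or None."""
    m = np.array([[d[0], a[0] - b[0]], [d[1], a[1] - b[1]]])
    if abs(np.linalg.det(m)) < eps:
        return None
    t, s = np.linalg.solve(m, a - p)
    if t > eps and 0.0 <= s <= 1.0:
        return t, p + t * d
    return None

def trace(p, d, walls, absorptivity=0.3, max_bounces=50):
    """Follow one ray; deposit `absorptivity` of the remaining power per bounce."""
    energy, deposited = 1.0, []
    for _ in range(max_bounces):
        hits = [(hit, w) for w in walls if (hit := intersect(p, d, *w))]
        if not hits:
            break                               # ray escaped through the opening
        (t, q), (a, b) = min(hits, key=lambda hw: hw[0][0])
        n = np.array([-(b - a)[1], (b - a)[0]])
        n /= np.linalg.norm(n)
        d = d - 2.0 * np.dot(d, n) * n          # specular reflection
        deposited.append((q, absorptivity * energy))
        energy *= 1.0 - absorptivity
        p = q
    return deposited, energy

# V-groove joint: depth 2, half-opening angle 10 degrees (assumed geometry).
depth, alpha = 2.0, np.radians(10.0)
half_w = depth * np.tan(alpha)
walls = [(np.array([0.0, 0.0]), np.array([-half_w, depth])),
         (np.array([0.0, 0.0]), np.array([half_w, depth]))]

# One vertical ray entering off-axis at the top of the groove.
deposited, escaped = trace(np.array([0.3 * half_w, depth]),
                           np.array([0.0, -1.0]), walls)
print(f"bounces: {len(deposited)}, escaped fraction: {escaped:.3f}")
for q, e in deposited:
    print(f"  absorbed {e:.3f} at depth {depth - q[1]:.3f}")
```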

  6. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  7. Testing the validity of the International Atomic Energy Agency (IAEA) safety culture model.

    Science.gov (United States)

    López de Castro, Borja; Gracia, Francisco J; Peiró, José M; Pietrantoni, Luca; Hernández, Ana

    2013-11-01

    This paper takes the first steps to empirically validate the widely used model of safety culture of the International Atomic Energy Agency (IAEA), composed of five dimensions, further specified by 37 attributes. To do so, three independent and complementary studies are presented. First, 290 students serve to collect evidence about the face validity of the model. Second, 48 experts in organizational behavior judge its content validity. And third, 468 workers in a Spanish nuclear power plant help to reveal how closely the theoretical five-dimensional model can be replicated. Our findings suggest that several attributes of the model may not be related to their corresponding dimensions. According to our results, a one-dimensional structure fits the data better than the five dimensions proposed by the IAEA. Moreover, the IAEA model, as it stands, seems to have rather moderate content validity and low face validity. Practical implications for researchers and practitioners are included.
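
    The dimensionality question at the heart of this study (one factor versus five) is usually settled with confirmatory factor analysis; as a lighter-weight stand-in, the sketch below applies Horn's parallel analysis to synthetic ratings generated from a single latent factor, mirroring the article's one-dimensional finding. The data-generation step is entirely invented.

```python
import numpy as np

def parallel_analysis(data, n_iter=200, seed=0):
    """Horn's parallel analysis: keep components whose correlation-matrix
    eigenvalues exceed the 95th percentile of eigenvalues from random data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data.T)))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.normal(size=(n, p))
        rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(r.T)))[::-1]
    thresh = np.percentile(rand, 95, axis=0)
    return int(np.sum(obs > thresh))

# Synthetic survey: 468 respondents, 37 attributes driven by ONE latent factor.
rng = np.random.default_rng(4)
latent = rng.normal(size=(468, 1))
loadings = rng.uniform(0.5, 0.9, size=(1, 37))
data = latent @ loadings + rng.normal(scale=0.7, size=(468, 37))

print("retained dimensions:", parallel_analysis(data))  # expected: 1
```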

  8. Drilling forces model for lunar regolith exploration and experimental validation

    Science.gov (United States)

    Zhang, Tao; Ding, Xilun

    2017-02-01

    China's Chang'e lunar exploration project aims to sample and return lunar regolith samples at a minimum penetration depth of 2 m in 2017. Unlike such tasks on the Earth, automated drilling and sampling missions on the Moon are more complicated. Therefore, a delicately designed drill tool is required to minimize operational cost and enhance reliability. Penetration force and rotational torque are two critical parameters in designing the drill tool. In this paper, a novel numerical model for predicting penetration force and rotational torque in the drilling of lunar regolith is proposed. The model is based on quasi-static Mohr-Coulomb soil mechanics and explicitly describes the interaction between drill tool and lunar regolith. Geometric features of drill tool, mechanical properties of lunar regolith, and drilling parameters are taken into consideration in the model. Consequently, a drilling test bed was developed, and experimental penetration force and rotational torque were obtained in penetrating a lunar regolith simulant with different drilling parameters. Finally, theoretical and experimental results were compared to validate the proposed model. Experimental results indicated that the numerical model had good accuracy and was effective in predicting the penetration force and rotational torque in drilling the lunar regolith simulant.
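
    The paper's numerical model itself is not reproduced in this record, so the sketch below uses a generic quasi-static stand-in: a Terzaghi-style bearing term for the penetration force and Coulomb friction at an effective radius for the torque. Beyond the classical bearing-capacity factors, every formula and parameter value here is an illustrative assumption.

```python
import numpy as np

def drilling_loads(depth, r_bit, c, phi, unit_weight, mu):
    """Toy quasi-static estimate of penetration force and rotational torque.

    Force:  bearing-capacity pressure q = c*Nc + unit_weight*depth*Nq over the
            bit face, with the classical Prandtl/Reissner factors Nc, Nq.
    Torque: Coulomb friction mu * force acting at 2/3 of the bit radius.
    """
    nq = np.exp(np.pi * np.tan(phi)) * np.tan(np.pi / 4 + phi / 2) ** 2
    nc = (nq - 1.0) / np.tan(phi)
    area = np.pi * r_bit ** 2
    force = area * (c * nc + unit_weight * depth * nq)
    torque = mu * force * (2.0 / 3.0) * r_bit
    return force, torque

# Placeholder properties for a regolith simulant under Earth gravity.
for depth in (0.5, 1.0, 2.0):
    f, t = drilling_loads(depth, r_bit=0.015, c=500.0,
                          phi=np.radians(40), unit_weight=1800 * 9.81, mu=0.6)
    print(f"depth {depth:3.1f} m: force ~ {f:8.1f} N, torque ~ {t:5.2f} N*m")
```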

  9. PIV validation of blood-heart valve leaflet interaction modelling.

    Science.gov (United States)

    Kaminsky, R; Dumont, K; Weber, H; Schroll, M; Verdonck, P

    2007-07-01

    The aim of this study was to validate the 2D computational fluid dynamics (CFD) results of a moving heart valve based on a fluid-structure interaction (FSI) algorithm with experimental measurements. Firstly, a pulsatile laminar flow through a monoleaflet valve model with a stiff leaflet was visualized by means of Particle Image Velocimetry (PIV). The inflow data sets were applied to a CFD simulation including blood-leaflet interaction. The measurement section with a fixed leaflet was enclosed into a standard mock loop in series with a Harvard Apparatus Pulsatile Blood Pump, a compliance chamber and a reservoir. Standard 2D PIV measurements were made at a frequency of 60 bpm. Average velocity magnitude results of 36 phase-locked measurements were evaluated at every 10 degrees of the pump cycle. For the CFD flow simulation, a commercially available package from Fluent Inc. was used in combination with in-house developed FSI code based on the Arbitrary Lagrangian-Eulerian (ALE) method. Then the CFD code was applied to the leaflet to quantify the shear stress on it. Generally, the CFD results are in agreement with the PIV evaluated data in major flow regions, thereby validating the FSI simulation of a monoleaflet valve with a flexible leaflet. The applicability of the new CFD code for quantifying the shear stress on a flexible leaflet is thus demonstrated.

  10. Modeling simulation and experimental validation for mold filling process

    Institute of Scientific and Technical Information of China (English)

    HOU Hua; JU Dong-ying; MAO Hong-kui; D. SAITO

    2006-01-01

    Based on the continuum equation and the momentum and energy conservation equations, a numerical model of turbulent flow filling was introduced, and the 3-D free-surface VOF method was improved. To establish whether numerical simulation results are reasonable, corresponding experimental validation is needed. General experimental techniques for casting fluid flow processes include thermocouple tracking location, hydraulic simulation, heat-resistant glass window observation, and X-ray observation. A hydraulic analogue experiment with the DPIV technique was arranged to validate the fluid flow program for low-pressure casting at 0.1×10^5 Pa and 0.6×10^5 Pa, with visual observation. By comparing the flow head, liquid surface and flow velocity, it is found that the filling pressure strongly influences the flow state. With increasing filling pressure, the fluid flow becomes unstable, the flow head becomes higher, and the filling time is reduced. The simulated results agree approximately with the observed results, which further supports the validity of the numerical program for the filling process.

  11. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois;

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  12. Validation of limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring - is a separate validation group required?

    NARCIS (Netherlands)

    Proost, J. H.

    2007-01-01

    Objective: Limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring are usually validated in a separate group of patients, according to published guidelines. The aim of this study is to evaluate the validation of LSM by comparing independent validation with cross-validation us
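
    The comparison posed in the title can be set up in a few lines: fit a limited-sampling regression that predicts AUC from two concentration measurements, then score it once with a held-out validation group and once with leave-one-out cross-validation. The data below are synthetic and the two-sample design is an assumption.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(5)
# Hypothetical data: concentrations at 1 h and 3 h predict the full AUC.
n = 40
c1 = rng.lognormal(mean=1.0, sigma=0.3, size=n)
c3 = rng.lognormal(mean=0.5, sigma=0.3, size=n)
auc = 2.0 * c1 + 5.0 * c3 + rng.normal(scale=0.5, size=n)
X = np.column_stack([c1, c3])

# (a) Independent validation: split into a model group and a validation group.
model = LinearRegression().fit(X[:25], auc[:25])
rmse_indep = np.sqrt(np.mean((model.predict(X[25:]) - auc[25:]) ** 2))

# (b) Cross-validation: every patient serves once as the "validation group".
pred = cross_val_predict(LinearRegression(), X, auc, cv=LeaveOneOut())
rmse_cv = np.sqrt(np.mean((pred - auc) ** 2))

print(f"independent-group RMSE: {rmse_indep:.3f}")
print(f"leave-one-out    RMSE: {rmse_cv:.3f}")
```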

  13. Development and Validation of Methodology to Model Flow in Ventilation Systems Commonly Found in Nuclear Facilities. Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Strons, Philip [Argonne National Lab. (ANL), Argonne, IL (United States); Bailey, James L. [Argonne National Lab. (ANL), Argonne, IL (United States); Davis, John [Argonne National Lab. (ANL), Argonne, IL (United States); Grudzinski, James [Argonne National Lab. (ANL), Argonne, IL (United States); Hlotke, John [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-03-01

    In this work, we apply CFD to model airflow and particulate transport. This modeling is then compared to field validation studies to both inform and validate the modeling assumptions. Based on the results of field tests, modeling assumptions and boundary conditions are refined and the process is repeated until the results are found to be reliable with a high level of confidence.

  14. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatil- ity in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient condi- tions which...... ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano...... & Primiceri (American Economic Review, forth- coming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation...

  15. Validation of Air Traffic Controller Workload Models

    Science.gov (United States)

    1979-09-01

    (SAR) tapes during the data reduction phase of the project. Kentron International Limited provided the software support for the project. This included... ETABS) or to revised traffic control procedures. The models also can be used to verify productivity benefits after new configurations have been... collected and processed manually. A preliminary comparison has been made between standard NAS Stage A and ETABS operations at Miami.

  16. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implemented their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code
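
    The report's suggestion, data files whose headers carry key experimental parameters as meta-data so that simulation setup can be automated, is easy to prototype. Both the record layout and the emitted input-deck format below are invented for illustration; they are not the actual HED or hydro-code formats.

```python
import io

# Hypothetical layout: '#'-prefixed "key: value" meta-data lines followed by
# two-column (time, pressure) gauge records.
sample = """\
# experiment: shock_initiation_42
# explosive: PBX-9502
# density_g_cc: 1.890
# impact_velocity_km_s: 0.92
0.00  0.00
0.10  1.35
0.20  4.80
"""

def parse_record(stream):
    """Split a record into a meta-data dict and numeric gauge rows."""
    meta, rows = {}, []
    for line in stream:
        line = line.strip()
        if line.startswith("#"):
            key, _, value = line[1:].partition(":")
            meta[key.strip()] = value.strip()
        elif line:
            rows.append(tuple(float(x) for x in line.split()))
    return meta, rows

def make_input_deck(meta):
    """Emit a made-up hydro-code input deck from the header meta-data."""
    return (f"TITLE {meta['experiment']}\n"
            f"MATERIAL {meta['explosive']} RHO={meta['density_g_cc']}\n"
            f"IMPACTOR VEL={meta['impact_velocity_km_s']}\n"
            f"RUN\n")

meta, rows = parse_record(io.StringIO(sample))
print(make_input_deck(meta))
print(f"{len(rows)} gauge records parsed")
```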

  17. Validación de un modelo de medida de los resultados percibidos por los mandos relacionados con el uso de las prácticas de alta implicación – HIWP (Validation of a measurement model for the results perceived by senior managers related to the use of HIWP)

    Directory of Open Access Journals (Sweden)

    Graziela Conci

    2011-12-01

    Among human resources practices, high involvement work practices (HIWP) stand out. Lawler's model (1991; 1998) groups these practices into four constructs: communication, training, empowerment and rewards. Our work focuses on validating the measurement model for the results scale proposed by Lawler and colleagues (2001), together with alternative measurement models, in order to measure results as perceived subjectively by senior managers. The measurement model is composed of performance and service scales, human resources, and results achieved by the organization. After carrying out confirmatory factor analyses of data collected from 98 Spanish companies, we validate a model that presents very good convergent validity statistics.

  18. Development, validation and application of numerical space environment models

    Science.gov (United States)

    Honkonen, Ilja

    2013-10-01

    Currently the majority of space-based assets are located inside the Earth's magnetosphere where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of the near-Earth space, similarly to ordinary weather which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed memory hardware is also becoming crucial. This thesis consists of an introduction, four peer reviewed articles and describes the process of developing numerical space environment/weather models and the use of such models to study the near-Earth space. A complete model development chain is presented starting from initial planning and design to distributed memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the

  19. Experiments for foam model development and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

    2008-09-01

    A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

  20. Nonlinear dispersion effects in elastic plates: numerical modelling and validation

    Science.gov (United States)

    Kijanka, Piotr; Radecki, Rafal; Packo, Pawel; Staszewski, Wieslaw J.; Uhl, Tadeusz; Leamy, Michael J.

    2017-04-01

    Nonlinear features of elastic wave propagation have attracted significant attention recently. The particular interest herein relates to complex wave-structure interactions, which provide potential new opportunities for feature discovery and identification in a variety of applications. Due to the significant complexity associated with wave propagation in nonlinear media, numerical modeling and simulations are employed to facilitate the design and development of new measurement, monitoring and characterization systems. However, since very high spatio-temporal accuracy of numerical models is required, it is critical to evaluate their spectral properties and tune discretization parameters for a compromise between accuracy and calculation time. Moreover, nonlinearities in structures give rise to various effects that are not present in linear systems, e.g. wave-wave interactions, higher harmonics generation, synchronism and, as recently reported, shifts to dispersion characteristics. This paper discusses a local computational model based on a new HYBRID approach for wave propagation in nonlinear media. The proposed approach combines advantages of the Local Interaction Simulation Approach (LISA) and Cellular Automata for Elastodynamics (CAFE). The methods are investigated in the context of their accuracy for predicting nonlinear wavefields, in particular shifts to dispersion characteristics for finite amplitude waves and secondary wavefields. The results are validated against Finite Element (FE) calculations for guided waves in a copper plate. Critical modes, i.e. modes determining the accuracy of a model at a given excitation frequency, are identified and guidelines for numerical model parameters are proposed.
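
    The headline nonlinear effect, higher-harmonic generation for a finite-amplitude wave, can be demonstrated with a much simpler scheme than LISA or CAFE. The sketch below integrates a quadratically nonlinear 1D wave equation with explicit finite differences and checks the probe spectrum for energy at twice the excitation frequency; the equation form, coefficients, and discretization are all illustrative choices.

```python
import numpy as np

# Quadratically nonlinear 1D wave equation, u_tt = c^2 (1 + beta*u_x) u_xx,
# integrated with explicit central differences.
c, beta = 1.0, 4.0
nx, length = 2000, 200.0
dx = length / nx
dt = 0.4 * dx / c                      # CFL-stable step with margin
f0, amp = 0.05, 0.02                   # excitation frequency and amplitude
u_prev = np.zeros(nx)
u = np.zeros(nx)

probe = []
for n in range(8000):
    ux = np.gradient(u, dx)
    uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    u_next = 2.0 * u - u_prev + (c * dt) ** 2 * (1.0 + beta * ux) * uxx
    u_next[0] = amp * np.sin(2.0 * np.pi * f0 * n * dt)  # driven left boundary
    u_next[-1] = 0.0                                     # fixed right boundary
    u_prev, u = u, u_next
    probe.append(u[nx // 2])

# Probe spectrum: a peak at 2*f0 signals the nonlinearity.
spec = np.abs(np.fft.rfft(np.array(probe) * np.hanning(len(probe))))
freqs = np.fft.rfftfreq(len(probe), dt)
for target in (f0, 2.0 * f0):
    k = int(np.argmin(np.abs(freqs - target)))
    print(f"|U({target:.2f})| = {spec[k]:.4f}")
```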

  1. Lightweight ZERODUR: Validation of Mirror Performance and Mirror Modeling Predictions

    Science.gov (United States)

    Hull, Tony; Stahl, H. Philip; Westerhoff, Thomas; Valente, Martin; Brooks, Thomas; Eng, Ron

    2017-01-01

    Upcoming spaceborne missions, both moderate and large in scale, require extreme dimensional stability while relying both upon established lightweight mirror materials, and also upon accurate modeling methods to predict performance under varying boundary conditions. We describe tests, recently performed at NASA's XRCF chambers and laboratories in Huntsville, Alabama, during which a 1.2 m diameter, f/1.29, 88% lightweighted SCHOTT ZERODUR(TradeMark) mirror was tested for thermal stability under static loads in steps down to 230K. Test results are compared to model predictions, based upon recently published data on ZERODUR(TradeMark). In addition to monitoring the mirror surface for thermal perturbations in XRCF thermal vacuum tests, static load gravity deformations have been measured and compared to model predictions. The modal response (dynamic disturbance) was also measured and compared to the model. We discuss the fabrication approach and optomechanical design of the ZERODUR(TradeMark) mirror substrate by SCHOTT and its optical preparation for test by Arizona Optical Systems (AOS), and summarize the outcome of NASA's XRCF tests and model validations.

  2. Validation of DWPF Melter Off-Gas Combustion Model

    Energy Technology Data Exchange (ETDEWEB)

    Choi, A.S.

    2000-08-23

    The empirical melter off-gas combustion model currently used in the DWPF safety basis calculations is valid at melter vapor space temperatures above 570 degrees C, as measured in the thermowell. This lower temperature bound coincides with that of the off-gas data used as the basis of the model. In this study, the applicability of the empirical model in a wider temperature range was assessed using the off-gas data collected during two small-scale research melter runs. The first data set came from the Small Cylindrical Melter-2 run in 1985 with the sludge feed coupled with the precipitate hydrolysis product. The second data set came from the 774-A melter run in 1996 with the sludge-only feed prepared with the modified acid addition strategy during the feed pretreatment step. The results of the assessment showed that the data from these two melter runs agreed well with the existing model, and further provided the basis for extending the lower temperature bound of the model to the measured melter vapor space temperature of 445 degrees C.

  3. Model validation of channel zapping quality

    NARCIS (Netherlands)

    Kooij, R.E; Nicolai, F.; Ahmed, K.; Brunnström, K.

    2009-01-01

    In an earlier paper we showed that perceived quality of channel zapping is related to the perceived quality of download time of web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests, resulting in an excellent fit with a 0.99 correlation. This was what we call

  4. Validation of a Model of the Domino Effect?

    CERN Document Server

    Larham, Ron

    2008-01-01

    A recent paper proposing a model of the limiting speed of the domino effect is discussed with reference to its need, and the need of models in general, for validation against experimental data. It is shown that the proposed model diverges significantly from experimentally derived speed estimates over a significant range of domino spacings, using data from the existing literature and this author's own measurements. Hence, had its use had economic importance, applying it outside its range of validity could have led to losses of one sort or another for its users.

  5. Validation of an Efficient Outdoor Sound Propagation Model Using BEM

    DEFF Research Database (Denmark)

    Quirós-Alpera, S.; Henriquez, Vicente Cutanda; Jacobsen, Finn

    2001-01-01

    An approximate, simple and practical model for prediction of outdoor sound propagation exists based on ray theory, diffraction theory and Fresnel-zone considerations [1]. This model, which can predict sound propagation over non-flat terrain, has been validated for combinations of flat ground, hills...... and barriers, but it still needs to be validated for configurations that involve combinations of valleys and barriers. In order to do this a boundary element model has been implemented in MATLAB to serve as a reliable reference....

  6. Validation of a Model for Ice Formation around Finned Tubes

    Directory of Open Access Journals (Sweden)

    Kamal A. R. Ismail

    2016-09-01

    Full Text Available Although phase change materials are an attractive option for thermal storage applications, their main drawback is a slow thermal response during charging and discharging processes due to their low thermal conductivity. The present study validates a model developed by the authors some years ago that uses radial fins to improve the thermal performance of PCM in a horizontal storage system. The model for the radially finned tube is based on pure conduction and the enthalpy approach, and was discretized by the finite difference method. Experiments were carried out specifically to validate the model and its numerical predictions.

  7. Development and validation of a realistic head model for EEG

    Science.gov (United States)

    Bangera, Nitin Bhalchandra

    The utility of extracranial electrical or magnetic field recordings (EEG or MEG) is greatly enhanced if the generators of the bioelectromagnetic fields can be determined accurately from the measured fields. This procedure, known as the 'inverse method,' depends critically on calculations of the projection from generators in the brain to the EEG and MEG sensors. Improving and validating this calculation, known as the 'forward solution,' is the focus of this dissertation. The improvements involve more accurate modeling of the structures of the brain, and thus understanding how current flows within the brain as a result of the addition of structures to the forward model. Validation compares calculations using different forward models to experimental results obtained by stimulating with implanted dipole electrodes. Human brain tissue displays inhomogeneity in electrical conductivity and also anisotropy, notably in the skull and brain white matter. In this dissertation, a realistic head model has been implemented using the finite element method to calculate the effects of inhomogeneity and anisotropy in the human brain. Accurate segmentation of brain tissue type is implemented using a semi-automatic method to segment multimodal imaging data from multi-spectral MRI scans (different flip angles) in conjunction with regular T1-weighted scans and computed X-ray tomography images. The electrical conductivity in the anisotropic white matter tissue is quantified from diffusion tensor MRI. The finite element model is constructed using AMIRA, a commercial segmentation and visualization tool, and solved using ABAQUS, a commercial finite element solver. The model is validated using experimental data collected from intracranial stimulation in medically intractable epileptic patients. Depth electrodes are implanted in such patients in order to direct surgical therapy when the foci cannot be localized with scalp EEG. These patients

  8. External Validity and Model Validity: A Conceptual Approach for Systematic Review Methodology

    Directory of Open Access Journals (Sweden)

    Raheleh Khorsan

    2014-01-01

    Full Text Available Background. Evidence rankings do not equally consider internal (IV), external (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative health care (CAM/IHC) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to Jan 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the quality of EV/MV research that is more sensitive to CAM/IHC research. Conclusion. Improved reporting on EV can help produce and provide information that will guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on an equal footing, will better guide clinical decision making.

  9. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  10. Reverse electrodialysis: A validated process model for design and optimization

    NARCIS (Netherlands)

    Veerman, J.; Saakes, M.; Metz, S. J.; Harmsen, G. J.

    2011-01-01

    Reverse electrodialysis (RED) is a technology to generate electricity using the entropy of the mixing of sea and river water. A model is made of the RED process and validated experimentally. The model is used to design and optimize the RED process. It predicts very small differences between counter-

  11. Composing, Analyzing and Validating Software Models

    Science.gov (United States)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda carried forward from last summer. Formal specifications provide good support for designing a functionally correct system, but they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  12. Validation of a terrestrial food chain model.

    Science.gov (United States)

    Travis, C C; Blaylock, B P

    1992-01-01

    An increasingly important topic in risk assessment is the estimation of human exposure to environmental pollutants through pathways other than inhalation. The Environmental Protection Agency (EPA) has recently developed a computerized methodology (EPA, 1990) to estimate indirect exposure to toxic pollutants from Municipal Waste Combustor emissions. This methodology estimates health risks from exposure to toxic pollutants via the terrestrial food chain (TFC), soil ingestion, drinking water ingestion, fish ingestion, and dermal absorption via soil and water. Of these, one of the most difficult to estimate is exposure through the food chain. This paper estimates the accuracy of the EPA methodology for estimating food chain contamination. To our knowledge, no data exist on measured concentrations of pollutants in food grown around municipal waste incinerators, and few field-scale studies have been performed on the uptake of pollutants in the food chain. Therefore, to evaluate the EPA methodology, we compare actual measurements of background contaminant levels in food with estimates made using EPA's computerized methodology. Background levels of contaminants in air, water, and soil were used as input to the EPA food chain model to predict background levels of contaminants in food. These predicted values were then compared with the measured background contaminant levels. Comparisons were performed for dioxin, pentachlorophenol, polychlorinated biphenyls, benzene, benzo(a)pyrene, mercury, and lead.
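
    The comparison strategy just described, predicted background concentrations set against measured ones, amounts to a simple ratio check. A sketch with invented placeholder numbers (not the paper's values):

    ```python
    # Predicted background food concentrations (model driven by background
    # air/water/soil levels) divided by measured background levels.
    predicted = {"dioxin": 1.2e-3, "PCBs": 4.0e-2, "lead": 0.15}   # mg/kg, placeholder
    measured  = {"dioxin": 0.8e-3, "PCBs": 2.5e-2, "lead": 0.10}   # mg/kg, placeholder

    for chem in predicted:
        ratio = predicted[chem] / measured[chem]
        print(f"{chem}: predicted/measured = {ratio:.1f}")
    ```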

  13. On the development and validation of QSAR models.

    Science.gov (United States)

    Gramatica, Paola

    2013-01-01

    The fundamental and more critical steps that are necessary for the development and validation of QSAR models are presented in this chapter as best practices in the field. These procedures are discussed in the context of predictive QSAR modelling that is focused on achieving models of the highest statistical quality and with external predictive power. The most important and most used statistical parameters needed to verify the real performances of QSAR models (of both linear regression and classification) are presented. Special emphasis is placed on the validation of models, both internally and externally, as well as on the need to define model applicability domains, which should be done when models are employed for the prediction of new external compounds.
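
    As a rough illustration of the external-validation statistics such chapters discuss, the sketch below computes an external Q² and a prediction RMSE for a fitted QSAR model. The function name and the exact formula variant are our own choices, not taken from the chapter.

    ```python
    import numpy as np

    def external_validation_stats(y_train, y_ext, y_ext_pred):
        """External validation metrics for a QSAR model (illustrative).

        Q2_ext compares prediction errors on the external set against the
        variance around the training-set mean, one common QSAR convention.
        """
        ss_res = np.sum((y_ext - y_ext_pred) ** 2)
        ss_tot = np.sum((y_ext - np.mean(y_train)) ** 2)
        q2_ext = 1.0 - ss_res / ss_tot
        rmsep = np.sqrt(np.mean((y_ext - y_ext_pred) ** 2))
        return q2_ext, rmsep
    ```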

  14. 42 CFR 493.571 - Disclosure of accreditation, State and CMS validation inspection results.

    Science.gov (United States)

    2010-10-01

    42 Public Health, Vol. 5 (2010-10-01). § 493.571 Disclosure of accreditation, State and CMS validation inspection results. (a) Accreditation organization inspection results. CMS may disclose accreditation organization inspection results...

  15. GEOCHEMICAL RECOGNITION OF SPILLED SEDIMENTS USED IN NUMERICAL MODEL VALIDATION

    Institute of Scientific and Technical Information of China (English)

    Jens R. VALEUR; Steen LOMHOLT; Christian KNUDSEN

    2004-01-01

    A fixed link (tunnel and bridge, in total 16 km) was constructed between Sweden and Denmark during 1995-2000. As part of the work, approximately 16 million tonnes of seabed materials (limestone and clay till) were dredged, and about 0.6 million tonnes of these were spilled in the water. Modelling of the spreading and sedimentation of the spilled sediments took place as part of the environmental monitoring of the construction activities. In order to verify the results of the numerical modelling of sediment spreading and sedimentation, a new method was developed to distinguish between the spilled sediments and the naturally occurring sediments. Because the spilled sediments tend to accumulate on the seabed in areas with natural sediments of the same size, it is difficult to separate them based purely on physical properties. The new method is based on the geochemical differences between the natural sediment in the area and the spill. The basic properties used are the higher content of calcium carbonate material in the spill compared to the natural sediments, and the higher Ca/Sr ratio in the spill compared to the shell fragments dominating the natural calcium carbonate deposition in the area. The reason for these differences is that carbonate derived from recent shell debris can be discriminated from Danian limestone, the material in which the majority of the dredging took place, on the basis of the Ca/Sr ratio: 488 in Danian limestone versus 237 in shell debris. The geochemical recognition of the origin of the sediments proved useful in separating the spilled from the naturally occurring sediments. Without this separation, validation of the modelling of the accumulation of spilled sediments would not have been possible. The method has general validity and can be used in many situations where the origin of a given sediment is sought.
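
    The two Ca/Sr endmember values quoted above lend themselves to a simple two-endmember unmixing estimate. The sketch below assumes the ratio mixes linearly between the spill and natural carbonate (a first-order simplification, valid when fractions are weighted by Sr content; the study's actual computation is not specified in the abstract).

    ```python
    def spill_fraction(ca_sr_sample, ca_sr_spill=488.0, ca_sr_natural=237.0):
        """First-order two-endmember estimate of the spill-derived carbonate
        fraction in a sediment sample, from its Ca/Sr ratio. Endmember values
        488 (Danian limestone) and 237 (shell debris) are from the abstract.
        """
        f = (ca_sr_sample - ca_sr_natural) / (ca_sr_spill - ca_sr_natural)
        return min(max(f, 0.0), 1.0)  # clamp to the physical range [0, 1]

    print(spill_fraction(360.0))  # e.g. a sample ratio of 360 -> ~0.49 spill
    ```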

  16. Validation of ice loads predicted from meteorological models

    Energy Technology Data Exchange (ETDEWEB)

    Veal, A.; Skea, A. [UK Met Office, Exeter, England (United Kingdom); Wareing, B. [Brian Wareing Tech Ltd., England (United Kingdom)

    2005-07-01

    Results of a field trial conducted on 2 Gerber PVM-100 instruments at Deadwater Fell test site in the United Kingdom were presented. The trials were conducted to assess whether the instruments were capable of measuring the liquid water content of the air, as well as to validate an ice model in terms of accretion rates on different sized conductors. Ambient air temperature, wind speed and direction were monitored at the Deadwater Fell weather station along with load cell values. Time lapse video recorders and a web camera system were used to view the performance of the conductors in varying weather conditions. All data was collected and stored at the site. It was anticipated that output from the instruments could be related to the conditions under which overhead line conductors suffer from ice loads, and help to revise weather maps which have proved to be incompatible with utility experience and the lifetimes achieved by overhead line designs. The data provided from the Deadwater work included logged data from the Gerbers, weather data and load data from a 10 mm diameter aluminium alloy conductor. When the combination of temperature, wind direction and Gerber output indicated icing conditions, they were confirmed by the conductor's load cell data. The tests confirmed the validity of the Gerber instruments to predict the occurrence of icing conditions, when combined with other meteorological data. It was concluded that the instruments may aid in optimized prediction methods for ice loads and icing events. 2 refs., 4 figs.

  17. ADOPT: A Historically Validated Light Duty Vehicle Consumer Choice Model

    Energy Technology Data Exchange (ETDEWEB)

    Brooker, A.; Gonder, J.; Lopp, S.; Ward, J.

    2015-05-04

    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy’s Vehicle Technologies Office. It estimates technology improvement impacts on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit method and the mixed logit method to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range and usable volume. The average importance of several attributes changes nonlinearly across its range and changes with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market. The Corporate Average Fuel Economy regulations are enforced. The sales feed into the ADOPT stock model, which captures key aspects of summing petroleum use and greenhouse gas emissions. This includes capturing the change in vehicle miles traveled by vehicle age, the creation of new model options based on the success of existing vehicles, new vehicle option introduction rate limits, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data. It matches in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use, managing the inputs, simulation, and results.
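
    To make the choice-model machinery concrete, here is a minimal multinomial-logit share calculation of the general kind the abstract describes. The attribute scaling and weights are invented for illustration; they are not ADOPT's calibrated values.

    ```python
    import numpy as np

    def sales_shares(attributes, weights):
        """Multinomial-logit market shares from a weighted attribute utility.

        attributes: (n_vehicles, n_attributes) array of scaled attribute values
        weights:    (n_attributes,) importance weights
        """
        utility = attributes @ weights
        expu = np.exp(utility - utility.max())  # shift by max for numerical stability
        return expu / expu.sum()

    # Toy example: three vehicles scored on [price, fuel cost, acceleration time],
    # with negative weights on the cost-like attributes.
    shares = sales_shares(np.array([[0.2, 0.5, 0.8],
                                    [0.6, 0.3, 0.5],
                                    [0.9, 0.9, 0.2]]),
                          np.array([-1.5, -1.0, -0.8]))
    print(shares)  # shares sum to 1 across the vehicle options
    ```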

  18. A prediction model for ocular damage - Experimental validation.

    Science.gov (United States)

    Heussner, Nico; Vagos, Márcia; Spitzer, Martin S; Stork, Wilhelm

    2015-08-01

    With the increasing number of laser applications in medicine and technology, accidental as well as intentional exposure of the human eye to laser sources has become a major concern. Therefore, a prediction model for ocular damage (PMOD) is presented within this work and validated for long-term exposure. This model is a combination of a raytracing model with a thermodynamical model of the human eye, and an application which determines the thermal damage by implementing the Arrhenius integral. The model is based on our earlier work and is here validated against temperature measurements taken with porcine eye samples. For this validation, three different powers were used: 50 mW, 100 mW and 200 mW, with a spot size of 1.9 mm. The measurements were taken with two different sensing systems: an infrared camera and a fibre-optic probe placed within the tissue. Temperatures were measured for up to 60 s and then compared against simulations. The measured temperatures were found to be in good agreement with the values predicted by the PMOD model. To our best knowledge, this is the first model validated for both short-term and long-term irradiation in terms of temperature, demonstrating that temperatures can be accurately predicted within the thermal damage regime. Copyright © 2015 Elsevier Ltd. All rights reserved.
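
    The Arrhenius damage criterion mentioned above is conventionally written as Omega = integral of A*exp(-Ea/(R*T(t))) dt, with Omega >= 1 taken as the threshold for thermal damage. A minimal sketch, in which the frequency factor A and activation energy Ea are user-supplied assumptions (the abstract does not give the tissue constants used):

    ```python
    import numpy as np

    def arrhenius_damage(temps_kelvin, dt, a_factor, e_a, r_gas=8.314):
        """Arrhenius damage integral over a sampled temperature history.

        temps_kelvin: temperature samples T(t) in K
        dt:           sample spacing in s
        a_factor:     frequency factor A in 1/s (tissue-specific assumption)
        e_a:          activation energy Ea in J/mol (tissue-specific assumption)
        Returns Omega; Omega >= 1 is the usual damage criterion.
        """
        rates = a_factor * np.exp(-e_a / (r_gas * np.asarray(temps_kelvin)))
        return np.sum(rates) * dt
    ```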

  1. Empirical Validation of Building Simulation Software : Modeling of Double Facades

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    The work described in this report is the result of a collaborative effort of members of the International Energy Agency (IEA) Task 34/43: Testing and validation of building energy simulation tools experts group.

  2. First approximations in avalanche model validations using seismic information

    Science.gov (United States)

    Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty

    2017-04-01

    Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of input (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated against data from real phenomena to improve performance and reliability. The avalanche group from the University of Barcelona (RISKNAT - UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, on additional avalanche tracks, to gain insight into and validate numerical avalanche dynamics models under different terrain conditions. In this study, we aim to add the seismic data as an external record of the phenomena, able to validate RAMMS models. The seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and can record data in all atmospheric conditions (e.g. bad weather, low light and freezing, which make photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position

  3. The hypothetical world of CoMFA and model validation

    Energy Technology Data Exchange (ETDEWEB)

    Oprea, T.I. [Los Alamos National Lab., NM (United States)

    1996-12-31

    CoMFA is a technique used to establish the three-dimensional similarity of molecular structures in relation to a target property. Because the risk of chance correlation is high, validation is required for all CoMFA models. The following validation steps should be performed: the choice of alignment rules (superimposition and conformer criteria) has to use experimental data when available, or different (alternate) hypotheses; statistical methods (e.g., cross-validation with randomized groups) have to emphasize simplicity, robustness, predictivity and explanatory power. When several CoMFA-QSAR models on similar targets and/or structures are available, qualitative lateral validation can be applied. This meta-analysis for CoMFA models offers a broader perspective on the similarities and differences between the compared biological targets, with potential applications in rational drug design [e.g., selectivity, efficacy] and environmental toxicology. Examples that focus on validation of CoMFA models include the following steroid-binding proteins: aromatase, the estrogen and androgen receptors, a monoclonal antibody against progesterone, and two steroid-binding globulins.

  4. Validation Analysis of the Shoal Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    A. Hassan; J. Chapman

    2008-11-01

    Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven-step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of groundwater flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of

  5. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and their experimental validation was organized by the 'convection' section of the French society of thermal engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machineries or other energy-related applications, and have been selected for ETDE. (J.S.)

  6. A framework for computer model validation with a stochastic traffic microsimulator as test-bed

    Energy Technology Data Exchange (ETDEWEB)

    Sacks, J.; Rouphail, N. M.; Park, B. B.

    2001-07-01

    The validation of computer (simulation) models is a crucial element in assessing their utility for science and for policy-making. Often discussed and sometimes practiced informally, the process is straightforward conceptually: data are collected that represent both the inputs and the outputs of the model, the model is run at those inputs, and the output is compared to field data. In reality, complications abound: field data may be expensive, scarce or noisy; the model may be so complex that only a few runs are possible; and uncertainty enters the process at every turn. Even though it is inherently a statistical issue, model validation lacks a unifying statistical framework. The need to develop such a framework is compelling, even urgent. The use of computer models by scientists and planners is growing, costs of poor decisions are escalating, and increasing computing power, for both computation and data collection, is magnifying the scale of the issues. Building a framework for validation of computer models requires an assortment of procedures and considerations and recognition of the multiple stages in the development and use of the models. Verification, which encompasses procedures to assure that the computer code is bug-free, is often seen as a predecessor of validation whereas, in fact, it may be enmeshed with validation. Feedback from outcomes of steps in the validation process can impact model development through detection of flaws or gaps; the result is an intertwining of validation with development. We focus on five essential characteristics of a validation: context, data, uncertainty, feedback, and prediction, and use a traffic microsimulator model applied to the planning of traffic signal timing as a test-bed. Our goal is to draw attention to the many complexities that need to be considered in order to achieve a successful validation. (Author) 3 refs.

  7. Nonequilibrium stage modelling of dividing wall columns and experimental validation

    Science.gov (United States)

    Hiller, Christoph; Buck, Christina; Ehlers, Christoph; Fieg, Georg

    2010-11-01

    Dealing with complex process units like dividing wall columns shifts the focus to the determination of suitable modelling approaches. For this purpose a nonequilibrium stage model is developed. Successful validation is achieved by an experimental investigation of fatty alcohol mixtures under vacuum conditions at pilot scale. The aim is the recovery of high-purity products. The proposed model predicts the product qualities and temperature profiles very well.

  8. Human surrogate models of neuropathic pain: validity and limitations.

    Science.gov (United States)

    Binder, Andreas

    2016-02-01

    Human surrogate models of neuropathic pain in healthy subjects are used to study symptoms, signs, and the hypothesized underlying mechanisms. Although different models are available and different spontaneous and evoked symptoms and signs are inducible, two key questions need to be answered: are human surrogate models conceptually valid, i.e., do they share the sensory phenotype of neuropathic pain states, and are they sufficiently reliable to allow consistent translational research?

  9. Blast Load Simulator Experiments for Computational Model Validation: Report 2

    Science.gov (United States)

    2017-02-01

    O’Daniel, 2016. Blast load simulator experiments for computational model validation - Report 1. ERDC/GSL TR-16-27. Vicksburg, MS: U.S. Army Engineer Research and Development Center. Approved for public release; distribution is unlimited.

  10. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. But a key challenge of ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  11. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. But a key challenge of ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  12. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research, as presented in this paper, is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation enable better understanding of the system, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of the model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of a given feature for a specific model.

  13. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    Science.gov (United States)

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting.
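
    Two of the standard statistical checks used in such external validation studies, discrimination and calibration, can be sketched as follows. Variable names are illustrative; a full study would add calibration-in-the-large, calibration plots, and reporting per the usual guidelines.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score
    from sklearn.linear_model import LogisticRegression

    def external_validation(y, p_pred):
        """Discrimination (c-statistic / AUC) and calibration slope for a
        binary risk model on an external cohort.

        y:      observed 0/1 outcomes
        p_pred: the model's predicted probabilities, strictly in (0, 1)
        """
        auc = roc_auc_score(y, p_pred)
        # Calibration slope: refit the outcome on the model's logit; a slope
        # near 1 suggests the predictions transport well to the new cohort.
        logit = np.log(p_pred / (1 - p_pred)).reshape(-1, 1)
        slope = LogisticRegression(C=1e6).fit(logit, y).coef_[0, 0]
        return auc, slope
    ```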

  14. Model Identification and Validation for a Heating System using MATLAB System Identification Toolbox

    Science.gov (United States)

    Junaid Rabbani, Muhammad; Hussain, Kashan; Khan, Asim-ur-Rehman; Ali, Abdullah

    2013-12-01

    This paper proposes a systematic approach to selecting a mathematical model for an industrial heating system by adopting system identification techniques, with the aim of fulfilling the design requirements for the controller. The model identification process begins by collecting real measurement data samples with the aid of the MATLAB System Identification Toolbox. The criteria for selecting the model that best validates model output against actual data are based upon: parametric identification techniques, picking the best low-order model structure among ARX, ARMAX and BJ, and then applying model estimation and validation tests. Simulated results show that the BJ model is best at providing good estimation and validation based upon performance criteria such as final prediction error, loss function, best percentage of model fit, and correlation analysis of output residuals.
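
    A minimal sketch of the core of ARX identification (conceptually what the toolbox's arx() routine does, not the toolbox code), together with a MATLAB-style fit percentage of the kind used to compare model structures:

    ```python
    import numpy as np

    def fit_arx(u, y, na=2, nb=2):
        """Least-squares fit of an ARX(na, nb) model:
        y[k] = -a1*y[k-1] - ... - a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb]
        Returns theta = [a1..a_na, b1..b_nb].
        """
        n = max(na, nb)
        rows = []
        for k in range(n, len(y)):
            # Regressor: past outputs (negated in the model) and past inputs
            rows.append(np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]]))
        phi = np.asarray(rows)
        theta, *_ = np.linalg.lstsq(phi, y[n:], rcond=None)
        return theta

    def fit_percent(y, y_hat):
        """MATLAB-style fit: 100*(1 - ||y - y_hat|| / ||y - mean(y)||)."""
        return 100 * (1 - np.linalg.norm(y - y_hat) / np.linalg.norm(y - np.mean(y)))
    ```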

  15. Neuroinflammatory targets and treatments for epilepsy validated in experimental models.

    Science.gov (United States)

    Aronica, Eleonora; Bauer, Sebastian; Bozzi, Yuri; Caleo, Matteo; Dingledine, Raymond; Gorter, Jan A; Henshall, David C; Kaufer, Daniela; Koh, Sookyong; Löscher, Wolfgang; Louboutin, Jean-Pierre; Mishto, Michele; Norwood, Braxton A; Palma, Eleonora; Poulter, Michael O; Terrone, Gaetano; Vezzani, Annamaria; Kaminski, Rafal M

    2017-07-01

    A large body of evidence that has accumulated over the past decade strongly supports the role of inflammation in the pathophysiology of human epilepsy. Specific inflammatory molecules and pathways have been identified that influence various pathologic outcomes in different experimental models of epilepsy. Most importantly, the same inflammatory pathways have also been found in surgically resected brain tissue from patients with treatment-resistant epilepsy. New antiseizure therapies may be derived from these novel potential targets. An essential and crucial question is whether targeting these molecules and pathways may result in anti-ictogenesis, antiepileptogenesis, and/or disease-modification effects. Therefore, preclinical testing in models mimicking relevant aspects of epileptogenesis is needed to guide integrated experimental and clinical trial designs. We discuss the most recent preclinical proof-of-concept studies validating a number of therapeutic approaches against inflammatory mechanisms in animal models that could represent novel avenues for drug development in epilepsy. Finally, we suggest future directions to accelerate preclinical to clinical translation of these recent discoveries. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  16. A New Symptom Model for Autism Cross-Validated in an Independent Sample

    Science.gov (United States)

    Boomsma, A.; Van Lang, N. D. J.; De Jonge, M. V.; De Bildt, A. A.; Van Engeland, H.; Minderaa, R. B.

    2008-01-01

    Background: Results from several studies indicated that a symptom model other than the DSM triad might better describe symptom domains of autism. The present study focused on a) investigating the stability of a new symptom model for autism by cross-validating it in an independent sample and b) examining the invariance of the model regarding three…

  17. Social Validity of the Critical Incident Stress Management Model for School-Based Crisis Intervention

    Science.gov (United States)

    Morrison, Julie Q.

    2007-01-01

    The Critical Incident Stress Management (CISM) model for crisis intervention was developed for use with emergency service personnel. Research regarding the use of the CISM model has been conducted among civilians and high-risk occupation groups with mixed results. The purpose of this study is to examine the social validity of the CISM model for…

  18. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust. Th

  19. Validation of a Wave-Body Interaction Model by Experimental Tests

    DEFF Research Database (Denmark)

    Ferri, Francesco; Kramer, Morten; Pecher, Arthur

    2013-01-01

    Within the wave energy field, numerical simulation has recently acquired a worldwide consent as being a useful tool, besides physical model testing. The main goal of this work is the validation of a numerical model by experimental results. The numerical model is based on a linear wave-body intera...

  20. CUORE crystal validation runs: results on radioactive contamination and extrapolation to CUORE background

    CERN Document Server

    Alessandria, F; Ardito, R; Arnaboldi, C; Avignone, F T; Balata, M; Bandac, I; Banks, T I; Bari, G; Beeman, J W; Bellini, F; Bersani, A; Biassoni, M; Bloxham, T; Brofferio, C; Bryant, A; Bucci, C; Cai, X Z; Canonica, L; Capelli, S; Carbone, L; Cardani, L; Carrettoni, M; Chott, N; Clemenza, M; Cosmelli, C; Cremonesi, O; Creswick, R J; Dafinei, I; Dally, A; De Biasi, A; Decowski, M P; Deninno, M M; de Waard, A; Di Domizio, S; Ejzak, L; Faccini, R; Fang, D Q; Farach, H; Ferri, E; Ferroni, F; Fiorini, E; Foggetta, L; Freedman, S; Frossati, G; Giachero, A; Gironi, L; Giuliani, A; Gorla, P; Gotti, C; Guardincerri, E; Gutierrez, T D; Haller, E E; Han, K; Heeger, K M; Huang, H Z; Ichimura, K; Kadel, R; Kazkaz, K; Keppel, G; Kogler, L; Kolomensky, Y G; Kraft, S; Lenz, D; Li, Y L; Liu, X; Longo, E; Ma, Y G; Maiano, C; Maier, G; Martinez, C; Martinez, M; Maruyama, R H; Moggi, N; Morganti, S; Newman, S; Nisi, S; Nones, C; Norman, E B; Nucciotti, A; Orio, F; Orlandi, D; Ouellet, J; Pallavicini, M; Palmieri, V; Pattavina, L; Pavan, M; Pedretti, M; Pessina, G; Pirro, S; Previtali, E; Rampazzo, V; Rimondi, F; Rosenfeld, C; Rusconi, C; Salvioni, C; Sangiorgio, S; Schaeffer, D; Scielzo, N D; Sisti, M; Smith, A R; Stivanello, F; Taffarello, L; Terenziani, G; Tian, W D; Tomei, C; Trentalange, S; Ventura, G; Vignati, M; Wang, B; Wang, H W; Whitten, C A; Wise, T; Woodcraft, A; Xu, N; Zanotti, L; Zarra, C; Zhu, B X; Zucchelli, S

    2011-01-01

    The CUORE Crystal Validation Runs (CCVRs) have been carried out since the end of 2008 at the Gran Sasso National Laboratories, in order to test the performance and the radiopurity of the TeO$_2$ crystals produced at SICCAS (Shanghai Institute of Ceramics, Chinese Academy of Sciences) for the CUORE experiment. In this work the results of the first 5 validation runs are presented. Results have been obtained for bulk contaminations and surface contaminations from several nuclides. An extrapolation to the CUORE background has been performed.

  1. System Modeling, Validation, and Design of Shape Controllers for NSTX

    Science.gov (United States)

    Walker, M. L.; Humphreys, D. A.; Eidietis, N. W.; Leuer, J. A.; Welander, A. S.; Kolemen, E.

    2011-10-01

    Modeling of the linearized control response of plasma shape and position has become fairly routine in the last several years. However, such response models rely on the input of accurate values of model parameters such as conductor and diagnostic sensor geometry and conductor resistivity or resistance. Confidence in use of such a model therefore requires that some effort be spent in validating that the model has been correctly constructed. We describe the process of constructing and validating a response model for NSTX plasma shape and position control, and subsequent use of that model for the development of shape and position controllers. The model development, validation, and control design processes are all integrated within a Matlab-based toolset known as TokSys. The control design method described emphasizes use of so-called decoupling control, in which combinations of coil current modifications are designed to modify only one control parameter at a time, without perturbing any other control parameter values. Work supported by US DOE under DE-FG02-99ER54522 and DE-AC02-09CH11466.

  2. QUANTIFYING FOREST ABOVEGROUND CARBON POOLS AND FLUXES USING MULTI-TEMPORAL LIDAR: A report on field monitoring, remote sensing MMV, GIS integration, and modeling results for a forestry field validation test to quantify aboveground tree biomass and carbon

    Energy Technology Data Exchange (ETDEWEB)

    Lee Spangler; Lee A. Vierling; Eva K. Stand; Andrew T. Hudak; Jan U.H. Eitel; Sebastian Martinuzzi

    2012-04-01

    Sound policy recommendations relating to the role of forest management in mitigating atmospheric carbon dioxide (CO2) depend upon establishing accurate methodologies for quantifying forest carbon pools for large tracts of land that can be dynamically updated over time. Light Detection and Ranging (LiDAR) remote sensing is a promising technology for achieving accurate estimates of aboveground biomass and thereby carbon pools; however, not much is known about the accuracy of estimating biomass change and carbon flux from repeat LiDAR acquisitions containing different data sampling characteristics. In this study, discrete return airborne LiDAR data was collected in 2003 and 2009 across approximately 20,000 hectares (ha) of an actively managed, mixed conifer forest landscape in northern Idaho, USA. Forest inventory plots, established via a random stratified sampling design, were established and sampled in 2003 and 2009. The Random Forest machine learning algorithm was used to establish statistical relationships between inventory data and forest structural metrics derived from the LiDAR acquisitions. Aboveground biomass maps were created for the study area based on statistical relationships developed at the plot level. Over this 6-year period, we found that the mean increase in biomass due to forest growth across the non-harvested portions of the study area was 4.8 metric tons per hectare (Mg/ha). In these non-harvested areas, we found a significant difference in biomass increase among forest successional stages, with a higher biomass increase in mature and old forest compared to stand initiation and young forest. Approximately 20% of the landscape had been disturbed by harvest activities during the six-year time period, representing a biomass loss of >70 Mg/ha in these areas. During the study period, these harvest activities outweighed growth at the landscape scale, resulting in an overall loss in aboveground carbon at this site. The 30-fold increase in sampling density
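
    A minimal sketch of the plot-level modelling step described above, using scikit-learn's Random Forest with synthetic stand-in data; in the real workflow the predictors would be plot-level LiDAR structure metrics (height percentiles, canopy cover) and the response field-measured biomass in Mg/ha.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # Synthetic stand-ins for plot-level LiDAR metrics and field biomass.
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 40, size=(100, 5))        # 100 plots, 5 LiDAR metrics
    y = 2.0 * X[:, 0] + rng.normal(0, 5, 100)    # toy biomass response (Mg/ha)

    rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X, y)
    print(f"Out-of-bag R^2: {rf.oob_score_:.2f}")  # plot-level validation
    # Biomass change and carbon flux would follow by differencing maps
    # predicted from the 2003 and 2009 acquisitions.
    ```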

  3. Validation of spatial variability in downscaling results from the VALUE perfect predictor experiment

    Science.gov (United States)

    Widmann, Martin; Bedia, Joaquin; Gutiérrez, Jose Manuel; Maraun, Douglas; Huth, Radan; Fischer, Andreas; Keller, Denise; Hertig, Elke; Vrac, Mathieu; Wibig, Joanna; Pagé, Christian; Cardoso, Rita M.; Soares, Pedro MM; Bosshard, Thomas; Casado, Maria Jesus; Ramos, Petra

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. Within VALUE a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods has been developed. In the first validation experiment the downscaling methods are validated in a setup with perfect predictors taken from the ERA-interim reanalysis for the period 1997 - 2008. This allows to investigate the isolated skill of downscaling methods without further error contributions from the large-scale predictors. One aspect of the validation is the representation of spatial variability. As part of the VALUE validation we have compared various properties of the spatial variability of downscaled daily temperature and precipitation with the corresponding properties in observations. We have used two test validation datasets, one European-wide set of 86 stations, and one higher-density network of 50 stations in Germany. Here we present results based on three approaches, namely the analysis of i.) correlation matrices, ii.) pairwise joint threshold exceedances, and iii.) regions of similar variability. We summarise the information contained in correlation matrices by calculating the dependence of the correlations on distance and deriving decorrelation lengths, as well as by determining the independent degrees of freedom. Probabilities for joint threshold exceedances and (where appropriate) non-exceedances are calculated for various user-relevant thresholds related for instance to extreme precipitation or frost and heat days. The dependence of these probabilities on distance is again characterised by calculating typical length scales that separate dependent from independent exceedances. Regionalisation is based on rotated Principal Component Analysis. The results indicate which downscaling methods are preferable if the dependency of variability at different locations is relevant for the user.
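
    One of the correlation summaries described above, the decorrelation length, can be estimated roughly as below. This is a generic sketch of fitting corr(d) ≈ exp(-d/L) to station pairs; it is not the VALUE implementation, and the fitting choices (log-space least squares, a small positivity cutoff) are our own.

    ```python
    import numpy as np

    def decorrelation_length(series, coords):
        """Estimate a decorrelation length L from station data.

        series: (n_time, n_stations) array of e.g. daily temperature
        coords: (n_stations, 2) station coordinates in km
        """
        corr = np.corrcoef(series.T)                       # station-pair correlations
        iu = np.triu_indices_from(corr, k=1)
        dists = np.linalg.norm(coords[iu[0]] - coords[iu[1]], axis=1)
        c = corr[iu]
        mask = c > 0.05                                    # keep clearly positive pairs
        slope = np.polyfit(dists[mask], np.log(c[mask]), 1)[0]
        return -1.0 / slope                                # L in km
    ```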

  4. Cross-validation model assessment for modular networks

    CERN Document Server

    Kawamoto, Tatsuro

    2016-01-01

    Model assessment of the stochastic block model is a crucial step in identification of modular structures in networks. Although this has typically been done according to the principle that a parsimonious model with a large marginal likelihood or a short description length should be selected, another principle is that a model with a small prediction error should be selected. We show that the leave-one-out cross-validation estimate of the prediction error can be efficiently obtained using belief propagation for sparse networks. Furthermore, the relations among the objectives for model assessment enable us to determine the exact cause of overfitting.

  5. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. Highlights: The paper discusses the validation of creep rupture models derived from statistical analysis. It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).
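
    The proposed visual-graphic check can be sketched as follows: overlay the model on rupture data as stress versus log-time, and compare the corresponding gradients. The model form and data points below are invented placeholders, not the 12% Cr steel data.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    log_t = np.linspace(1, 5, 50)                  # log10 rupture time, h (placeholder)
    stress_model = 400 - 60 * log_t                # toy model isotherm, MPa
    log_t_data = np.array([1.5, 2.5, 3.5, 4.5])
    stress_data = np.array([315, 245, 190, 130])   # toy rupture data, MPa

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
    ax1.plot(log_t, stress_model, label="model")
    ax1.plot(log_t_data, stress_data, "o", label="data")
    ax1.set(xlabel="log10 t_r [h]", ylabel="stress [MPa]")
    # Gradient comparison: d(stress)/d(log t) for model and data
    ax2.plot(log_t[:-1], np.diff(stress_model) / np.diff(log_t), label="model")
    ax2.plot(log_t_data[:-1], np.diff(stress_data) / np.diff(log_t_data), "s", label="data")
    ax2.set(xlabel="log10 t_r [h]", ylabel="d(stress)/d(log t)")
    for ax in (ax1, ax2):
        ax.legend()
    plt.tight_layout()
    plt.show()
    ```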

  6. TIME-IGGCAS model validation: Comparisons with empirical models and observations

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The TIME-IGGCAS (Theoretical Ionospheric Model of the Earth in the Institute of Geology and Geophysics, Chinese Academy of Sciences) has been developed recently on the basis of previous works. To test its validity, we have made comparisons of model results with other typical empirical ionospheric models (IRI, NeQuick-ITUR, and Titheridge temperature models) and multiple observations (GPS, ionosondes, Topex, DMSP, FORMOSAT, and CHAMP) in this paper. Several conclusions are obtained from our comparisons. The modeled electron density and electron and ion temperatures are quantitatively in good agreement with those of empirical models and observations. TIME-IGGCAS can model the electron density variations versus several factors such as local time, latitude, and season very well and can reproduce most anomalous features of the ionosphere including the equatorial anomaly, winter anomaly, and semiannual anomaly. These results imply a good basis for the development of an ionospheric data assimilation model in the future. TIME-IGGCAS underestimates electron temperature and overestimates ion temperature in comparison with either empirical models or observations. The model results have relatively large deviations near sunrise time and sunset time and at low altitudes. These results give us a reference to improve the model and enhance its performance in the future.

  7. Development and validation of mathematical modelling for pressurised combustion

    Energy Technology Data Exchange (ETDEWEB)

    Richter, S.; Knaus, H.; Risio, B.; Schnell, U.; Hein, K.R.G. [University of Stuttgart, Stuttgart (Germany). Inst. fuer Verfahrenstechnik und Dampfkesselwesen

    1998-12-31

    The advanced 3D-coal combustion code AIOLOS for quasi-stationary turbulent reacting flows is based on a conservative finite-volume procedure. Equations for the conservation of mass, momentum and scalar quantities are solved. In order to deal with pressurized combustion chambers which are usually of cylindrical shape, a first task in the frame of the project consisted in the extension of the code towards cylindrical co-ordinates, since the basic version of AIOLOS was only suitable for cartesian grids. Furthermore, the domain decomposition method was extended to the new co-ordinate system. Its advantage consists in the possibility to introduce refined sub-grids, providing a better resolution of regions where high gradients occur (e.g. high velocity and temperature gradients near the burners). The accuracy of the code was proven by means of a small-scale test case. The results obtained with AIOLOS were compared with the predictions of the commercial CFD-code FLOW3D and validated against the velocity and temperature distributions measured at the test facility. The work during the second period focused mainly on the extension of the reaction model, as well as on the modelling of the optical properties of the flue gas. A modified submodel for char burnout was developed, considering the influence of pressure on diffusion mechanisms and on the chemical reaction at the char particle. The goal during the third project period was to improve the numerical description of turbulence effects and of the radiative heat transfer, in order to obtain an adequate modelling of the complex processes in pressurized coal combustion furnaces. Therefore, a differential Reynolds stress turbulence model (RSM) and a Discrete-Ordinates radiation model were implemented, respectively. 13 refs., 13 figs., 1 tab.

  8. Cross-validation analysis of bias models in Bayesian multi-model projections of climate

    Science.gov (United States)

    Huttunen, J. M. J.; Räisänen, J.; Nissinen, A.; Lipponen, A.; Kolehmainen, V.

    2017-03-01

    Climate change projections are commonly based on multi-model ensembles of climate simulations. In this paper we consider the choice of bias models in Bayesian multimodel predictions. Buser et al. (Clim Res 44(2-3):227-241, 2010a) introduced a hybrid bias model which combines commonly used constant bias and constant relation bias assumptions. The hybrid model includes a weighting parameter which balances these bias models. In this study, we use a cross-validation approach to study which bias model or bias parameter leads to, in a specific sense, optimal climate change projections. The analysis is carried out for summer and winter season means of 2 m-temperatures spatially averaged over the IPCC SREX regions, using 19 model runs from the CMIP5 data set. The cross-validation approach is applied to calculate optimal bias parameters (in the specific sense) for projecting the temperature change from the control period (1961-2005) to the scenario period (2046-2090). The results are compared to the results of the Buser et al. (Clim Res 44(2-3):227-241, 2010a) method which includes the bias parameter as one of the unknown parameters to be estimated from the data.
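
    As a hedged point-estimate sketch of what a weighting parameter between the "constant bias" and "constant relation" assumptions might look like (the actual method of Buser et al. estimates these quantities within a Bayesian hierarchical model, not by this closed formula):

    ```python
    def corrected_projection(obs_ctl, mod_ctl, mod_scen, kappa):
        """Blend additive and multiplicative bias corrections of a scenario-
        period model value, given control-period observed and modelled means.

        kappa = 0 reproduces a 'constant bias' (additive) correction,
        kappa = 1 a 'constant relation' (multiplicative) correction,
        and intermediate values weight the two assumptions.
        """
        additive = mod_scen - (mod_ctl - obs_ctl)        # constant bias
        multiplicative = mod_scen * (obs_ctl / mod_ctl)  # constant relation
        return (1 - kappa) * additive + kappa * multiplicative
    ```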

  9. Modeling and validation of a 3D premolar for finite element analysis

    Directory of Open Access Journals (Sweden)

    Letícia Brandão DURAND

    Full Text Available Abstract. Introduction: The development and validation of mathematical models is an important step in the methodology of finite element studies. Objective: This study aims to describe the development and validation of a three-dimensional numerical model of a maxillary premolar for finite element analysis. Material and method: The 3D model was based on standardized photographs of sequential slices of an intact premolar and generated with the use of SolidWorks software (Dassault, France). In order to validate the model, compression and numerical tests were performed. The load versus displacement graphs of both tests were visually compared, the percentage error calculated, and the homogeneity of regression coefficients tested. Result: An accurate 3D model was developed and validated, since the graphs were visually similar, the percentage error was within acceptable limits, and the straight lines were considered parallel. Conclusion: The modeling procedures and validation described allow the development of accurate 3D dental models with biomechanical behavior similar to natural teeth. The methods may be applied in the development and validation of new models and computer-aided simulations using FEM.
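
    A minimal sketch of the two quantitative checks named in the abstract, percentage error and homogeneity of regression coefficients, applied to hypothetical load-displacement curves (all numbers are invented for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical load-displacement data: experimental compression test
# vs. FE simulation of the premolar model (mm vs. N).
disp = np.linspace(0.0, 0.5, 11)
load_exp = 180.0 * disp + np.random.default_rng(1).normal(0, 2, disp.size)
load_fem = 175.0 * disp

# Percentage error between the two curves (excluding the zero point).
pct_err = 100 * np.abs(load_fem[1:] - load_exp[1:]) / load_exp[1:]
print(f"mean % error = {pct_err.mean():.1f}")

# Homogeneity of regression coefficients: t-test on the two slopes.
r_exp = stats.linregress(disp, load_exp)
r_fem = stats.linregress(disp, load_fem)
t = (r_exp.slope - r_fem.slope) / np.hypot(r_exp.stderr, r_fem.stderr)
df = 2 * disp.size - 4
p = 2 * stats.t.sf(abs(t), df)
print(f"slope_exp={r_exp.slope:.1f}, slope_fem={r_fem.slope:.1f}, p={p:.3f}")
```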

  10. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  11. Technical Note: Calibration and validation of geophysical observation models

    NARCIS (Netherlands)

    Salama, M.S.; van der Velde, R.; van der Woerd, H.J.; Kromkamp, J.C.; Philippart, C.J.M.; Joseph, A.T.; O'Neill, P.E.; Lang, R.H.; Gish, T.; Werdell, P.J.; Su, Z.

    2012-01-01

    We present a method to calibrate and validate observational models that interrelate remotely sensed energy fluxes to geophysical variables of land and water surfaces. Coincident sets of remote sensing observation of visible and microwave radiations and geophysical data are assembled and subdivided i

  12. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of the DSF performance and this in turn requires access to high quality experimental data. Three sets of accurate emp...

  13. Validation of Models : Statistical Techniques and Data Availability

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1999-01-01

    This paper shows which statistical techniques can be used to validate simulation models, depending on which real-life data are available. Concerning this availability, three situations are distinguished: (i) no data, (ii) only output data, and (iii) both input and output data. In case (i) - no real

  14. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of conta

  15. ID Model Construction and Validation: A Multiple Intelligences Case

    Science.gov (United States)

    Tracey, Monica W.; Richey, Rita C.

    2007-01-01

    This is a report of a developmental research study that aimed to construct and validate an instructional design (ID) model that incorporates the theory and practice of multiple intelligences (MI). The study consisted of three phases. In phase one, the theoretical foundations of multiple Intelligences and ID were examined to guide the development…

  16. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence in the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  17. Prospective validation of two models predicting pregnancy leading to live birth among untreated subfertile couples.

    Science.gov (United States)

    Hunault, Claudine C; Laven, Joop S E; van Rooij, Ilse A J; Eijkemans, Marinus J C; te Velde, Egbert R; Habbema, J Dik F

    2005-06-01

    Models predicting clinical outcome need external validation before they can be applied safely in daily practice. This study aimed to validate two models for the prediction of the chance of treatment-independent pregnancy leading to live birth among subfertile couples. The first model uses the woman's age, duration and type of subfertility, percentage of progressive sperm motility and referral status. The second model additionally uses the result of the post-coital test (PCT). For validation, these characteristics were collected prospectively in two university hospitals for 302 couples consulting for subfertility. The models' ability to distinguish between women who became pregnant and women who did not (discrimination) and the agreement between predicted and observed probabilities of treatment-independent pregnancy (calibration) were assessed. The discrimination of both models was slightly lower in the validation sample than in the original sample from which the models were derived. Calibration was good: the observed and predicted probabilities of treatment-independent pregnancy leading to live birth did not differ for either model. The chance of pregnancy leading to live birth was reliably estimated in the validation sample by both models. The use of the PCT improved the discrimination of the models. These models can be useful in counselling subfertile couples.
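
    The two validation criteria used here, discrimination and calibration, can be computed directly from paired predictions and outcomes. Below is a minimal numpy sketch with synthetic data standing in for the 302 couples; the c-statistic is one common discrimination measure, and the study's exact statistics may differ.

```python
import numpy as np

def c_statistic(p, y):
    """Probability that a randomly chosen pregnant couple received a
    higher predicted chance than a randomly chosen non-pregnant one."""
    pos, neg = p[y == 1], p[y == 0]
    diff = pos[:, None] - neg[None, :]
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

def calibration_in_the_large(p, y):
    """Observed minus mean predicted probability (0 is perfect)."""
    return y.mean() - p.mean()

# p: model-predicted probabilities of treatment-independent pregnancy
# leading to live birth; y: observed outcome (1 = live birth).
rng = np.random.default_rng(2)
p = rng.uniform(0.05, 0.6, 302)
y = (rng.uniform(size=302) < p).astype(int)
print(f"c-statistic = {c_statistic(p, y):.2f}")
print(f"calibration-in-the-large = {calibration_in_the_large(p, y):+.3f}")
```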

  18. Validation of the dermal exposure model in ECETOC TRA.

    Science.gov (United States)

    Marquart, Hans; Franken, Remy; Goede, Henk; Fransman, Wouter; Schinkel, Jody

    2017-08-01

    The ECETOC TRA model (presently version 3.1) is often used to estimate worker inhalation and dermal exposure in regulatory risk assessment. The dermal model in ECETOC TRA has not yet been validated by comparison with independent measured exposure levels. This was the goal of the present study. Measured exposure levels and relevant contextual information were gathered via literature search, websites of relevant occupational health institutes and direct requests for data to industry. Exposure data were clustered in so-called exposure cases, which are sets of data from one data source that are expected to have the same values for input parameters in the ECETOC TRA dermal exposure model. For each exposure case, the 75th percentile of measured values was calculated, because the model intends to estimate these values. The input values for the parameters in ECETOC TRA were assigned by an expert elicitation and consensus building process, based on descriptions of relevant contextual information. From more than 35 data sources, 106 useful exposure cases were derived that were used for direct comparison with the model estimates. The exposure cases covered a large part of the ECETOC TRA dermal exposure model. The model explained 37% of the variance in the 75th percentiles of measured values. In around 80% of the exposure cases, the model estimate was higher than the 75th percentile of measured values. In the remaining exposure cases, the model estimate may not be sufficiently conservative. The model was shown to have a clear bias towards (severe) overestimation of dermal exposure at low measured exposure values, while all cases of apparent underestimation by the ECETOC TRA dermal exposure model occurred at high measured exposure values. This can be partly explained by a built-in bias in the effect of concentration of substance in product used, duration of exposure and the use of protective gloves in the model. The effect of protective gloves was calculated to be on average a
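
    A sketch of the per-case comparison described above: for each exposure case, the 75th percentile of the measured values is computed and checked against the model estimate, and the fraction of conservative cases is tallied. Data values and the helper name are hypothetical.

```python
import numpy as np

def case_summary(measured, model_estimate):
    """Compare one exposure case's measurements with the model output.

    The TRA dermal model aims to estimate the 75th percentile of
    exposure, so conservatism is judged against that percentile."""
    p75 = np.percentile(measured, 75)
    return p75, model_estimate >= p75

# Hypothetical exposure cases: measured dermal loadings plus the
# corresponding ECETOC TRA estimate for the same scenario.
cases = [
    (np.array([0.02, 0.05, 0.11, 0.04]), 0.30),
    (np.array([1.2, 3.4, 0.8, 2.6]), 1.5),
]
n_conservative = 0
for measured, estimate in cases:
    p75, conservative = case_summary(measured, estimate)
    n_conservative += conservative
    print(f"P75={p75:.2f}  model={estimate:.2f}  conservative={conservative}")
print(f"{n_conservative}/{len(cases)} cases with model >= P75")
```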

  19. Validation of Hydrodynamic Models of Three Topological Models of Secondary Facultative Ponds

    Directory of Open Access Journals (Sweden)

    Aponte-Reyes Alxander

    2014-10-01

    Full Text Available A methodology was developed to analyze the boundary conditions, mesh size, and turbulence treatment of a CFD model intended to explain the hydrodynamic behavior of facultative stabilization ponds, FSP, built at pilot scale: a conventional pond, CP, a baffled pond, BP, and a baffled-mesh pond, BMP. Dispersion studies were performed in the field for model validation, taking samples at the inlet and outlet of the FSP, and the information was used to carry out CFD simulations of the three topologies. Evaluated mesh sizes ranged from 500,000 to 2,000,000 elements. The free-slip wall boundary condition showed good qualitative behavior, and the low-Reynolds κ–ε turbulence model yielded good results. The biomass contained in the FSP interferes with dispersion studies and should be taken into account in assessing the CFD modeling; the tracer injection times, its concentration at the entrance, the effect of wind, and the flow models adopted as a basis for modeling are parameters to be taken into account for CFD model validation and calibration.

  1. Community-Based Participatory Research Conceptual Model: Community Partner Consultation and Face Validity.

    Science.gov (United States)

    Belone, Lorenda; Lucero, Julie E; Duran, Bonnie; Tafoya, Greg; Baker, Elizabeth A; Chan, Domin; Chang, Charlotte; Greene-Moton, Ella; Kelley, Michele A; Wallerstein, Nina

    2016-01-01

    A national community-based participatory research (CBPR) team developed a conceptual model of CBPR partnerships to understand the contribution of partnership processes to improved community capacity and health outcomes. With the model primarily developed through academic literature and expert consensus building, we sought community input to assess face validity and acceptability. Our research team conducted semi-structured focus groups with six partnerships nationwide. Participants validated and expanded on existing model constructs and identified new constructs based on "real-world" praxis, resulting in a revised model. Four cross-cutting constructs were identified: trust development, capacity, mutual learning, and power dynamics. By empirically testing the model, we found community face validity and capacity to adapt the model to diverse contexts. We recommend partnerships use and adapt the CBPR model and its constructs, for collective reflection and evaluation, to enhance their partnering practices and achieve their health and research goals. © The Author(s) 2014.

  2. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland shows a clear layering. The observed layers from the radar data can be used as an in-situ validation...... correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from...

  3. Validation of a Model for Ice Formation around Finned Tubes

    OpenAIRE

    Kamal A. R. Ismail; Fatima A. M. Lino

    2016-01-01

    Although phase change materials are an attractive option for thermal storage applications, their main drawback is a slow thermal response during charging and discharging processes due to their low thermal conductivity. The present study validates a model developed by the authors some years ago on radial fins as a method to improve the thermal performance of PCM in a horizontal storage system. The developed model for the radial finned tube is based on pure conduction, the enthalpy approach, and was di...

  4. Toward metrics and model validation in web-site QEM

    OpenAIRE

    Olsina Santos, Luis Antonio; Pons, Claudia; Rossi, Gustavo Héctor

    2000-01-01

    In this work, a conceptual framework and the associated strategies for metric and model validation are analyzed with regard to website measurement and evaluation. In particular, we have conducted three case studies in different Web domains in order to evaluate and compare the quality of sites. To this end, the quantitative, model-based methodology called Web-site QEM (Quality Evaluation Methodology) was utilized. In the assessment process of sites, definition of attributes and measurements...

  5. Validating firn compaction model with remote sensing data

    OpenAIRE

    2011-01-01

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland ...

  6. Experimental validation of the twins prediction program for rolling noise. Pt.2: results

    NARCIS (Netherlands)

    Thompson, D.J.; Fodiman, P.; Mahé, H.

    1996-01-01

    Two extensive measurement campaigns have been carried out to validate the TWINS prediction program for rolling noise, as described in part 1 of this paper. This second part presents the experimental results of vibration and noise during train pass-bys and compares them with predictions from the TWINS model.

  7. Validation and intercomparison of Persistent Scatterers Interferometry: PSIC4 project results

    NARCIS (Netherlands)

    Raucoules, D.; Bourgine, B.; Michele, M. de; Le Cozannet, G.; Closset, L.; Bremmer, C.; Veldkamp, H.; Tragheim, D.; Bateson, L.; Crosetto, M.; Agudo, M.; Engdahl, M.

    2009-01-01

    This article presents the main results of the Persistent Scatterer Interferometry Codes Cross Comparison and Certification for long term differential interferometry (PSIC4) project. The project was based on the validation of the PSI (Persistent Scatterer Interferometry) data with respect to levelling.

  8. Intervention Validity of Social Behavior Rating Scales: Features of Assessments that Link Results to Treatment Plans

    Science.gov (United States)

    Elliott, Stephen N.; Gresham, Frank M.; Frank, Jennifer L.; Beddow, Peter A., III

    2008-01-01

    The term "intervention validity" refers to the extent to which assessment results can be used to guide the selection of interventions and evaluation of outcomes. In this article, the authors review the defining attributes of rating scales that distinguish them from other assessment tools, assumptions regarding the use of rating scales to measure…

  9. Validation of simulation strategies for the flow in a model propeller turbine during a runaway event

    Science.gov (United States)

    Fortin, M.; Houde, S.; Deschênes, C.

    2014-03-01

    Recent research indicates that the useful life of a turbine can be affected by transient events. This study aims to define and validate strategies for the simulation of the flow within a propeller turbine model in runaway condition. Using unsteady pressure measurements on two runner blades for validation, different strategies are compared and their results analysed in order to quantify their precision. This paper will focus on justifying the choice of the simulation strategies and on the analysis of preliminary results.

  10. ALTWAVE: Toolbox for use of satellite L2P altimeter data for wave model validation

    Science.gov (United States)

    Appendini, Christian M.; Camacho-Magaña, Víctor; Breña-Naranjo, José Agustín

    2016-03-01

    To characterize some of the world's ocean physical processes, such as wave height, wind speed and sea surface elevation, is a major need for coastal and marine infrastructure planning and design, tourism activities, wave power and storm surge risk assessment, among others. Over the last decades, satellite remote sensing has provided quasi-global measurements of ocean altimetry by merging data from different satellite missions. While altimeter data are widely used for model validation, practical tools for such validation remain scarce. Our purpose is to fill this gap by introducing ALTWAVE, a user-oriented MATLAB toolbox for oceanographers and coastal engineers, developed to validate wave model results against satellite-derived altimetry using visual features and statistical estimates. Our toolbox uses altimetry information from the GlobWave initiative, and provides a sample application validating a one-year wave hindcast for the Gulf of Mexico. ALTWAVE offers an effective way to validate wave model results using altimeter data, as well as guidance for non-experienced satellite data users. This article is intended for wave modelers with no experience using altimeter data to validate their results.
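
    ALTWAVE itself is a MATLAB toolbox; purely as an illustration of the underlying collocation-and-statistics step, here is a Python sketch that matches model grid points to altimeter track samples and computes the usual skill metrics (bias, RMSE, scatter index). Function names and data values are hypothetical.

```python
import numpy as np

def collocate(track_lon, track_lat, grid_lon, grid_lat, hs_model):
    """Nearest-neighbour match of model grid points to an altimeter track."""
    i = np.abs(grid_lat[:, None] - track_lat[None, :]).argmin(axis=0)
    j = np.abs(grid_lon[:, None] - track_lon[None, :]).argmin(axis=0)
    return hs_model[i, j]

def validation_stats(hs_alt, hs_mod):
    """Bias, RMSE and scatter index: common wave-model skill metrics."""
    bias = np.mean(hs_mod - hs_alt)
    rmse = np.sqrt(np.mean((hs_mod - hs_alt) ** 2))
    si = np.sqrt(np.mean((hs_mod - hs_alt - bias) ** 2)) / np.mean(hs_alt)
    return bias, rmse, si

# Toy Gulf-of-Mexico-sized grid and a short altimeter track (placeholders).
grid_lon = np.linspace(-98, -80, 90)
grid_lat = np.linspace(18, 31, 65)
hs_model = np.full((65, 90), 1.2)                   # model Hs field (m)
track_lon = np.array([-95.0, -94.5, -94.0])
track_lat = np.array([25.0, 25.4, 25.8])
hs_alt = np.array([1.1, 1.3, 1.25])                 # altimeter Hs (m)
hs_mod = collocate(track_lon, track_lat, grid_lon, grid_lat, hs_model)
print(validation_stats(hs_alt, hs_mod))
```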

  11. Army Synthetic Validity Project: Report of Phase 2 Results. Volume 1

    Science.gov (United States)

    1990-06-01

  12. Validity of the Bersohn–Zewail model beyond justification

    DEFF Research Database (Denmark)

    Petersen, Jakob; Henriksen, Niels Engholm; Møller, Klaus Braagaard

    2012-01-01

    The absorption of probe pulses in ultrafast pump–probe experiments can be determined from the Bersohn–Zewail (BZ) model. The model relies on classical mechanics to describe the dynamics of the nuclei in the excited electronic state prepared by the ultrashort pump pulse. The BZ model provides excellent agreement between the classical trajectory and the average position of the excited state wave packet. By investigating the approximations connecting the nuclear dynamics described by quantum mechanics and the BZ model, we conclude that this agreement goes far beyond the validity of the individual...

  13. Experimentally validated finite element model of electrocaloric multilayer ceramic structures

    Energy Technology Data Exchange (ETDEWEB)

    Smith, N. A. S., E-mail: nadia.smith@npl.co.uk; Correia, T. M., E-mail: tatiana.correia@npl.co.uk [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom); Rokosz, M. K., E-mail: maciej.rokosz@npl.co.uk [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom); Department of Materials, Imperial College London, London SW7 2AZ (United Kingdom)

    2014-07-28

    A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from the direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data, suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.

  14. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  15. Structure Modeling and Validation applied to Source Physics Experiments (SPEs)

    Science.gov (United States)

    Larmat, C. S.; Rowe, C. A.; Patton, H. J.

    2012-12-01

    The U. S. Department of Energy's Source Physics Experiments (SPEs) comprise a series of small chemical explosions used to develop a better understanding of seismic energy generation and wave propagation for low-yield explosions. In particular, we anticipate improved understanding of the processes through which shear waves are generated by the explosion source. Three tests, 100, 1000 and 1000 kg yields respectively, were detonated in the same emplacement hole and recorded on the same networks of ground motion sensors in the granites of Climax Stock at the Nevada National Security Site. We present results for the analysis and modeling of seismic waveforms recorded close-in on five linear geophone lines extending radially from ground zero, having offsets from 100 to 2000 m and station spacing of 100 m. These records exhibit azimuthal variations of P-wave arrival times, and phase velocity, spreading and attenuation properties of high-frequency Rg waves. We construct a 1D seismic body-wave model starting from a refraction analysis of P-waves and adjusting to address time-domain and frequency-domain dispersion measurements of Rg waves between 2 and 9 Hz. The shallowest part of the structure we address using the arrival times recorded by near-field accelerometers residing within 200 m of the shot hole. We additionally perform a 2D modeling study with the Spectral Element Method (SEM) to investigate which structural features are most responsible for the observed variations, in particular anomalously weak amplitude decay in some directions of this topographically complicated locality. We find that a near-surface, thin, weathered layer of varying thickness and low wave speeds plays a major role on the observed waveforms. We anticipate performing full 3D modeling of the seismic near-field through analysis and validation of waveforms on the 5 radial receiver arrays.

  16. Validation of a Model for Teaching Canine Fundoscopy.

    Science.gov (United States)

    Nibblett, Belle Marie D; Pereira, Mary Mauldin; Williamson, Julie A; Sithole, Fortune

    2015-01-01

    A validated teaching model for canine fundoscopic examination was developed to improve Day One fundoscopy skills while at the same time reducing use of teaching dogs. This novel eye model was created from a hollow plastic ball with a cutout for the pupil, a suspended 20-diopter lens, and paint and paper simulation of relevant eye structures. This eye model was mounted on a wooden stand with canine head landmarks useful in performing fundoscopy. Veterinary educators performed fundoscopy using this model and completed a survey to establish face and content validity. Subsequently, veterinary students were randomly assigned to pre-laboratory training with or without the use of this teaching model. After completion of an ophthalmology laboratory on teaching dogs, student outcome was assessed by measuring students' ability to see a symbol inserted on the simulated retina in the model. Students also completed a survey regarding their experience with the model and the laboratory. Overall, veterinary educators agreed that this eye model was well constructed and useful in teaching good fundoscopic technique. Student performance of fundoscopy was not negatively impacted by the use of the model. This novel canine model shows promise as a teaching and assessment tool for fundoscopy.

  17. FDA 2011 process validation guidance: lifecycle compliance model.

    Science.gov (United States)

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  18. Infrared ship signature prediction, model validation and sky radiance

    NARCIS (Netherlands)

    Neele, F.P.

    2005-01-01

    The increased interest during the last decade in the infrared signature of (new) ships results in a clear need for validated infrared signature prediction codes. This paper presents the results of comparing an in-house developed signature prediction code with measurements made in the 3-5 μm band in b

  19. Interpreting Results from the Multinomial Logit Model

    DEFF Research Database (Denmark)

    Wulff, Jesper

    2015-01-01

    This article provides guidelines and illustrates practical steps necessary for an analysis of results from the multinomial logit model (MLM). The MLM is a popular model in the strategy literature because it allows researchers to examine strategic choices with multiple outcomes. However, there see...

  1. Validation model for Raman based skin carotenoid detection.

    Science.gov (United States)

    Ermakov, Igor V; Gellermann, Werner

    2010-12-01

    measurement, and which can be easily removed for subsequent biochemical measurements. Excellent correlation (coefficient R=0.95) is obtained for this tissue site which could serve as a model site for scaled up future validation studies of large populations. The obtained results provide proof that resonance Raman spectroscopy is a valid non-invasive objective methodology for the quantitative assessment of carotenoid antioxidants in human skin in vivo.

  2. Integral Reactor Containment Condensation Model and Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Qiao [Oregon State Univ., Corvallis, OR (United States); Corradini, Michael [Univ. of Wisconsin, Madison, WI (United States)

    2016-05-02

    This NEUP-funded project, NEUP 12-3630, covers experimental, numerical and analytical studies on high-pressure steam condensation phenomena in a steel containment vessel connected to a water cooling tank, carried out at Oregon State University (OrSU) and the University of Wisconsin at Madison (UW-Madison). Over the three-year investigation, following the original proposal, the planned tasks have been completed: (1) performed a scaling study for the full pressure test facility applicable to the reference design for the condensation heat transfer process during design basis accidents (DBAs), modified the existing test facility to route the steady-state secondary steam flow into the high pressure containment for controllable condensation tests, and extended the operations at negative gage pressure conditions (OrSU); (2) conducted a series of DBA and quasi-steady experiments using the full pressure test facility to provide a reliable high pressure condensation database (OrSU); (3) analyzed experimental data and evaluated the condensation model for the experimental conditions, and predicted the prototypic containment performance under accident conditions (UW-Madison). A film flow model was developed for the scaling analysis, and the results suggest that the 1/3-scaled test facility covers a large portion of laminar film flow, leading to a lower average heat transfer coefficient compared to the prototypic value. Although this is conservative in reactor safety analysis, the significant reduction of the heat transfer coefficient (50%) could underestimate the prototypic condensation heat transfer rate, resulting in inaccurate prediction of the decay heat removal capability. Further investigation is thus needed to quantify the scaling distortion for safety analysis code validation. Experimental investigations were performed in the existing MASLWR test facility at OrSU with minor modifications. A total of 13 containment condensation tests were conducted for pressure
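
    The project's film flow model is not reproduced in the abstract; as background for how the average heat transfer coefficient of a laminar condensate film depends on wall height, the classical Nusselt relation for laminar film condensation on a vertical wall can be sketched as follows. This is a textbook relation with illustrative property values, not the project's actual model.

```python
import numpy as np

def nusselt_film_htc(L, dT, fluid):
    """Average heat transfer coefficient (W/m2-K) for laminar film
    condensation on a vertical wall of height L (m) with wall
    subcooling dT (K) -- the classical Nusselt (1916) relation."""
    g = 9.81
    rho_l, rho_v, k, mu, h_fg = (fluid[n] for n in
                                 ("rho_l", "rho_v", "k", "mu", "h_fg"))
    return 0.943 * (g * rho_l * (rho_l - rho_v) * h_fg * k**3
                    / (mu * dT * L)) ** 0.25

# Approximate saturated-water properties near 100 C.
water = {"rho_l": 958.0, "rho_v": 0.60, "k": 0.68,
         "mu": 2.8e-4, "h_fg": 2.257e6}
# Illustrative wall heights only; within the laminar regime the average
# HTC falls as L grows, while a full-height prototype wall would
# transition to a turbulent film that this relation does not capture.
for L in (1.0, 3.0):
    h = nusselt_film_htc(L, 10.0, water)
    print(f"L = {L:.0f} m: h_avg = {h:.0f} W/m2-K")
```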

  3. Alaska North Slope Tundra Travel Model and Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Harry R. Bader; Jacynthe Guimond

    2006-03-01

    lack of variability in snow depth cover throughout the period of field experimentation. The amount of change in disturbance indicators was greater in the tundra communities of the Foothills than in those of the Coastal Plain. However, the overall level of change in both community types was less than expected. In Coastal Plain communities, ground hardness and snow slab thickness were found to play an important role in change in active layer depth and soil moisture as a result of treatment. In the Foothills communities, snow cover had the most influence on active layer depth and soil moisture as a result of treatment. Once certain minimum thresholds for ground hardness, snow slab thickness, and snow depth were attained, it appeared that little or no additive effect was realized regarding increased resistance to disturbance in the tundra communities studied. DNR used the results of this modeling project to set a standard for maximum permissible disturbance of cross-country tundra travel, with the threshold set below the widely accepted standard of Low Disturbance levels (as determined by the U.S. Fish and Wildlife Service). DNR followed the modeling project with a validation study, which seemed to support the field trial conclusions and indicated that the standard set for maximum permissible disturbance exhibits a conservative bias in favor of environmental protection. Finally, DNR established a quick and efficient tool for visual estimations of disturbance to determine when investment in field measurements is warranted. This Visual Assessment System (VAS) seemed to support the plot disturbance measurements taken during the modeling and validation phases of this project.

  4. Hierarchical multi-scale approach to validation and uncertainty quantification of hyper-spectral image modeling

    Science.gov (United States)

    Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.

    2016-05-01

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  5. Hierarchical Multi-Scale Approach To Validation and Uncertainty Quantification of Hyper-Spectral Image Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David; Thompson, Sandra E.

    2016-09-17

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  6. Validation of a finite element model of the human metacarpal.

    Science.gov (United States)

    Barker, D S; Netherway, D J; Krishnan, J; Hearn, T C

    2005-03-01

    Implant loosening and mechanical failure of components are frequently reported following metacarpophalangeal (MCP) joint replacement. Studies of the mechanical environment of the MCP implant-bone construct are rare. The objective of this study was to evaluate the predictive ability of a finite element model of the intact second human metacarpal to provide a validated baseline for further mechanical studies. A right index human metacarpal was subjected to torsion and combined axial/bending loading using strain gauge (SG) and 3D finite element (FE) analysis. Four different representations of bone material properties were considered. Regression analyses were performed comparing maximum and minimum principal surface strains taken from the SG and FE models. Regression slopes close to unity and high correlation coefficients were found when the diaphyseal cortical shell was modelled as anisotropic and cancellous bone properties were derived from quantitative computed tomography. The inclusion of anisotropy for cortical bone was strongly influential in producing high model validity whereas variation in methods of assigning stiffness to cancellous bone had only a minor influence. The validated FE model provides a tool for future investigations of current and novel MCP joint prostheses.
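
    A sketch of the regression check described above, comparing strain-gauge (SG) principal surface strains against finite element (FE) predictions; a slope near unity with a high correlation coefficient indicates good predictive ability. All strain values are invented for illustration.

```python
import numpy as np

# Hypothetical paired principal surface strains (microstrain) at the
# gauge sites: strain-gauge measurements vs. FE predictions for the
# anisotropic-cortex / QCT-based cancellous material model.
sg = np.array([-812, -455, -220, 150, 340, 605, 880])
fe = np.array([-790, -470, -200, 165, 330, 590, 905])

slope, intercept = np.polyfit(sg, fe, 1)
r = np.corrcoef(sg, fe)[0, 1]
print(f"slope = {slope:.3f} (unity is ideal), intercept = {intercept:.1f}, "
      f"R^2 = {r**2:.3f}")
```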

  7. Development and validation of a cisplatin dose-ototoxicity model.

    Science.gov (United States)

    Dille, Marilyn F; Wilmington, Debra; McMillan, Garnett P; Helt, Wendy; Fausti, Stephen A; Konrad-Martin, Dawn

    2012-01-01

    Cisplatin is effective in the treatment of several cancers but is a known ototoxin resulting in shifts to hearing sensitivity in up to 50-60% of patients. Cisplatin-induced hearing shifts tend to occur first within an octave of a patient's high frequency hearing limit, termed the sensitive range for ototoxicity (SRO), and progress to lower frequencies. While it is currently not possible to know which patients will experience ototoxicity without testing their hearing directly, monitoring the SRO provides an early indication of damage. A tool to help forecast susceptibility to ototoxic-induced changes in the SRO in advance of each chemotherapy treatment visit may prove useful for ototoxicity monitoring efforts, patient counseling, and therapeutic planning. This project was designed to (1) establish pretreatment risk curves that quantify the probability that a new patient will suffer hearing loss within the SRO during treatment with cisplatin and (2) evaluate the accuracy of these predictions in an independent sample of Veterans receiving cisplatin for the treatment of cancer. Two study samples were used. The Developmental sample contained 23 subjects while the Validation sample consisted of 12 subjects. Risk curve predictions for SRO threshold shifts following cisplatin exposure were developed using a Developmental sample comprised of data from a total of 155 treatment visits obtained in 45 ears of 23 Veterans. Pure-tone thresholds were obtained within each subject's SRO at each treatment visit and compared with baseline measures. The risk of incurring an SRO shift was statistically modeled as a function of factors related to chemotherapy treatment (cisplatin dose, radiation treatment, doublet medication) and patient status (age, pre-exposure hearing, cancer location and stage). The model was reduced so that only statistically significant variables were included. Receiver-operating characteristic (ROC) curve analyses were then used to determine the accuracy of the
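
    A minimal sketch of the ROC analysis step, using a hypothetical logistic risk model; the study's actual predictors and coefficients are not given in the abstract, so the linear predictor below is invented for illustration.

```python
import numpy as np

def roc_points(risk, outcome):
    """ROC curve obtained by sweeping a threshold over predicted risks."""
    order = np.argsort(-risk)
    y = outcome[order]
    tpr = np.cumsum(y) / y.sum()
    fpr = np.cumsum(1 - y) / (len(y) - y.sum())
    return np.concatenate([[0.0], fpr]), np.concatenate([[0.0], tpr])

def auc(fpr, tpr):
    # trapezoidal area under the ROC curve
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))

# Hypothetical pre-treatment risk model: linear predictor from
# cumulative cisplatin dose only (coefficients invented).
rng = np.random.default_rng(3)
dose = rng.uniform(100, 500, 155)            # mg/m2, one row per visit
risk = 1 / (1 + np.exp(-(-4.0 + 0.012 * dose)))
sro_shift = (rng.uniform(size=155) < risk).astype(int)
fpr, tpr = roc_points(risk, sro_shift)
print(f"AUC = {auc(fpr, tpr):.2f}")
```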

  8. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns.

    Directory of Open Access Journals (Sweden)

    Guillaume Chérel

    Full Text Available Models of emergent phenomena are designed to provide an explanation to global-scale phenomena from local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must not only be based on corroboration, but also on attempting to falsify the model, i.e. making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city system dynamics to explore the model's predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights on potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic.
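
    In the spirit of the Pattern Space Exploration method, the sketch below shows a bare-bones novelty-search loop: candidate parameter vectors are scored by how far their simulated behaviour descriptors lie from everything already archived, so qualitatively new patterns keep being discovered. The simulator here is a toy stand-in, not the collective-motion or city-system models of the paper.

```python
import numpy as np

def novelty(x, archive, k=5):
    """Novelty = mean distance to the k nearest behaviours seen so far."""
    d = np.linalg.norm(archive - x, axis=1)
    return np.sort(d)[:k].mean()

def pattern_space_exploration(simulate, sample_params, n_iter=200):
    """Minimal novelty-search loop: repeatedly archive the candidate
    whose behaviour descriptor is furthest from the archive."""
    params = [sample_params() for _ in range(10)]
    archive = np.array([simulate(p) for p in params])
    for _ in range(n_iter):
        candidates = [sample_params() for _ in range(20)]
        behaviours = np.array([simulate(p) for p in candidates])
        scores = [novelty(b, archive) for b in behaviours]
        archive = np.vstack([archive, behaviours[int(np.argmax(scores))]])
    return archive

# Toy stand-in for a collective-motion model: parameters map to a
# 2-D behaviour descriptor (e.g. polarisation, group dispersion).
rng = np.random.default_rng(4)
simulate = lambda p: np.array([np.tanh(p[0] * p[1]), np.sin(p[0]) * p[1]])
sample = lambda: rng.uniform(-2, 2, 2)
archive = pattern_space_exploration(simulate, sample)
print(f"{len(archive)} behaviours archived")
```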

  9. Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.

    Science.gov (United States)

    Woodward, S J

    2001-09-01

    The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.

  10. Validation of Occupants’ Behaviour Models for Indoor Quality Parameter and Energy Consumption Prediction

    DEFF Research Database (Denmark)

    Fabi, Valentina; Sugliano, Martina; Andersen, Rune Korsholm

    2015-01-01

    For this reason, the validation of occupants' behavioral models is an issue that is gaining importance. In this paper validation was carried out through dynamic Building Energy Performance simulation (BEPS); behavioral models of window opening and thermostat set-points published in the literature were implemented in a dynamic BEPS software, and the obtained results in terms of temperature, relative humidity and CO2 concentration were compared to real measurements. Through this comparison it will be possible to verify the accuracy of the implemented behavioral models. The models were able to reproduce the general...

  11. Validation of a Wave-Body Interaction Model by Experimental Tests

    DEFF Research Database (Denmark)

    Ferri, Francesco; Kramer, Morten; Pecher, Arthur

    2013-01-01

    Within the wave energy field, numerical simulation has recently acquired worldwide consent as a useful tool alongside physical model testing. The main goal of this work is the validation of a numerical model against experimental results. The numerical model is based on linear wave-body interaction theory, applied to a point absorber wave energy converter. The results show that the ratio of floater size to wave amplitude is a key parameter for the validity of the applied theory.
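
    For a point absorber, linear wave-body interaction theory reduces heave motion to a frequency-domain equation of motion. A sketch follows, with placeholder hydrodynamic coefficients; in practice the added mass, radiation damping and excitation force come from a boundary element code such as WAMIT or NEMOH, and the floater dimensions here are invented.

```python
import numpy as np

def heave_rao(omega, m, A, B, C, F):
    """Complex heave response per unit wave amplitude from linear
    theory: (-w^2 (m + A) + i w B + C) x = F."""
    return F / (-omega**2 * (m + A) + 1j * omega * B + C)

rho, g = 1025.0, 9.81
radius = 2.5                        # hypothetical floater radius (m)
Aw = np.pi * radius**2              # waterplane area
C = rho * g * Aw                    # hydrostatic restoring stiffness
m = 33000.0                         # floater mass (kg), illustrative
omega = np.linspace(0.3, 2.5, 100)  # wave frequencies (rad/s)
A = 0.5 * m * np.ones_like(omega)   # placeholder added mass
B = 5e3 * omega                     # placeholder radiation damping
F = rho * g * Aw * np.exp(-omega**2 * radius / g)  # crude excitation decay
rao = np.abs(heave_rao(omega, m, A, B, C, F))
print(f"peak RAO = {rao.max():.2f} at w = {omega[rao.argmax()]:.2f} rad/s")
```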

  12. Prominent medical journals often provide insufficient information to assess the validity of studies with negative results

    Directory of Open Access Journals (Sweden)

    Dittus Robert S

    2002-09-01

    Full Text Available Abstract. Background: Physicians reading the medical literature attempt to determine whether research studies are valid. However, articles with negative results may not provide sufficient information to allow physicians to properly assess validity. Methods: We analyzed all original research articles with negative results published in 1997 in the weekly journals BMJ, JAMA, Lancet, and New England Journal of Medicine as well as those published in the 1997 and 1998 issues of the bimonthly Annals of Internal Medicine (N = 234). Our primary objective was to quantify the proportion of studies with negative results that comment on power and present confidence intervals. Secondary outcomes were to quantify the proportion of these studies with a specified effect size and a defined primary outcome. Stratified analyses by study design were also performed. Results: Only 30% of the articles with negative results comment on power. The reporting of power (range: 15%-52%) and confidence intervals (range: 55-81%) varied significantly among journals. Observational studies of etiology/risk factors addressed power less frequently (15%, 95% CI, 8-21%) than did clinical trials (56%, 95% CI, 46-67%, p < ...). Conclusion: Prominent medical journals often provide insufficient information to assess the validity of studies with negative results.

  13. Validating global hydrological models by ground and space gravimetry

    Institute of Scientific and Technical Information of China (English)

    ZHOU JiangCun; SUN HePing; XU JianQiao

    2009-01-01

    The long-term continuous gravity observations obtained by the superconducting gravimeters (SG) at seven globally-distributed stations are comprehensively analyzed. After removing the signals related to the Earth's tides and variations in the Earth's rotation, the gravity residuals are used to describe the seasonal fluctuations in the gravity field. Meanwhile, the gravity changes due to air pressure loading are theoretically modeled from measurements of the local air pressure, and those due to land water and nontidal ocean loading are calculated from the corresponding numerical models. The numerical results show that the gravity changes due to both the air pressure and land water loading are as large as 100×10⁻⁹ m s⁻² in magnitude, and about 10×10⁻⁹ m s⁻² for those due to nontidal ocean loading in coastal areas. On the other hand, the monthly-averaged gravity variations over the area surrounding the stations are derived from the spherical harmonic coefficients of the GRACE-recovered gravity fields, using a Gaussian smoothing technique in which the radius is set to 600 km. Comparing the SG observations (after removal of tides, polar motion effects, and air pressure and nontidal ocean loading effects), the land-water-induced gravity variations, and the GRACE-derived gravity variations with each other, it is inferred that both the ground- and space-based gravity observations can effectively detect the seasonal gravity variations, with a magnitude of 100×10⁻⁹ m s⁻², induced by the land water loading. This implies that high-precision gravimetry is an effective technique to validate the reliability of hydrological models.
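
    The 600 km Gaussian smoothing applied to the GRACE spherical harmonic coefficients is commonly implemented with Jekeli's degree-wise recursion; a sketch follows, where the normalisation and the truncation degree are illustrative choices rather than details taken from the paper.

```python
import numpy as np

def gaussian_weights(lmax, radius_km, a_km=6371.0):
    """Degree-dependent Gaussian smoothing weights (Jekeli 1981) used
    to damp short-wavelength noise in GRACE spherical harmonics.
    Normalised so w[0] = 1; the recursion grows unstable at degrees
    well above those needed for a 600 km radius, so keep lmax modest."""
    b = np.log(2.0) / (1.0 - np.cos(radius_km / a_km))
    w = np.zeros(lmax + 1)
    w[0] = 1.0
    w[1] = (1 + np.exp(-2 * b)) / (1 - np.exp(-2 * b)) - 1 / b
    for l in range(1, lmax):
        w[l + 1] = -(2 * l + 1) / b * w[l] + w[l - 1]
    return w

# Smooth GRACE Stokes coefficients with a 600 km averaging radius:
w = gaussian_weights(60, 600.0)
# clm_smoothed[l, m] = w[l] * clm[l, m]  for every order m
print(np.round(w[:8], 3))
```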

  14. Tsunami-HySEA model validation for tsunami current predictions

    Science.gov (United States)

    Macías, Jorge; Castro, Manuel J.; González-Vida, José Manuel; Ortega, Sergio

    2016-04-01

    Model ability to compute and predict tsunami flow velocities is of importance in risk assessment and hazard mitigation. Substantial damage can be produced by high velocity flows, particularly in harbors and bays, even when the wave height is small. Besides, an accurate simulation of tsunami flow velocities and accelerations is fundamental for advancing the study of tsunami sediment transport. These considerations led the National Tsunami Hazard Mitigation Program (NTHMP) to propose a benchmark exercise focussed on modeling and simulating tsunami currents. Until recently, few direct measurements of tsunami velocities were available to compare with and validate model results. After Tohoku 2011, many current meter measurements were made, mainly in harbors and channels. In this work we present part of the contribution made by the EDANYA group from the University of Malaga to the NTHMP workshop organized at Portland (USA), 9-10 February 2015. We have selected three out of the five proposed benchmark problems. Two of them consist of real observed data from the Tohoku 2011 event, one at Hilo Harbor (Hawaii) and the other at Tauranga Bay (New Zealand). The third one consists of laboratory experimental data for the inundation of Seaside City, Oregon. Acknowledgements: This research has been partially supported by the Junta de Andalucía research project TESELA (P11-RNM7069), the Spanish Government Research project DAIFLUID (MTM2012-38383-C02-01) and Universidad de Málaga, Campus de Excelencia Andalucía TECH. The GPU and multi-GPU computations were performed at the Unit of Numerical Methods (UNM) of the Research Support Central Services (SCAI) of the University of Malaga.

  15. Validation of Advanced EM Models for UXO Discrimination

    CERN Document Server

    Weichman, Peter B

    2012-01-01

    The work reported here details basic validation of our advanced physics-based EMI forward and inverse models against data collected by the NRL TEMTADS system. The data was collected under laboratory-type conditions using both artificial spheroidal targets and real UXO. The artificial target models are essentially exact, and enable detailed comparison of theory and data in support of measurement platform characterization and target identification. Real UXO targets cannot be treated exactly, but it is demonstrated that quantitative comparisons of the data with the spheroid models nevertheless aids in extracting key target discrimination information, such as target geometry and hollow target shell thickness.

  16. Results from an Independent View on The Validation of Safety-Critical Space Systems

    Science.gov (United States)

    Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.

    2013-08-01

    The independent verification and validation (IV&V) process has been key for decades and is considered in several international standards. One of the activities described in the “ESA ISVV Guide” is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked, since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the tester's work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained, and the advantages/disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.

  17. Development and validation of a building design waste reduction model.

    Science.gov (United States)

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.

  18. Finite Element Model and Validation of Nasal Tip Deformation.

    Science.gov (United States)

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian J F

    2017-03-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach to understanding the mechanics and nuances of nasal tip support and eventual nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded from an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded state. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow.
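
    A sketch of the point-cloud comparison described above, computing symmetric nearest-neighbour (Hausdorff-style) distances between the photogrammetry scan and the FE surface with scipy; both clouds here are synthetic placeholders.

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_distances(pts_measured, pts_fem):
    """Nearest-neighbour distances between the photogrammetry point
    cloud and the FE surface nodes, evaluated both ways (a discrete,
    symmetric Hausdorff-style comparison)."""
    d_m2f = cKDTree(pts_fem).query(pts_measured)[0]
    d_f2m = cKDTree(pts_measured).query(pts_fem)[0]
    d = np.concatenate([d_m2f, d_f2m])
    return d.mean(), d.std(), d.max()

# Hypothetical clouds (mm): deformed phantom scan vs. FE prediction.
rng = np.random.default_rng(5)
scan = rng.uniform(0, 40, (5000, 3))
fem = scan + rng.normal(0, 0.4, scan.shape)   # small simulated mismatch
mean_d, std_d, max_d = surface_distances(scan, fem)
print(f"{mean_d:.2f} ± {std_d:.2f} mm (max {max_d:.2f} mm)")
```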

  19. Development and Validation of a Path Analytic Model of Students' Performance in Chemistry.

    Science.gov (United States)

    Anamuah-Mensah, Jophus; And Others

    1987-01-01

    Reported the development and validation of an integrated model of performance on chemical concept-volumetric analysis. Model was tested on 265 chemistry students in eight schools. Results indicated that for subjects using algorithms without understanding, performance on volumetric analysis problems was not influenced by proportional reasoning…

  20. Integral Reactor Containment Condensation Model and Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Qiao [Oregon State Univ., Corvallis, OR (United States); Corradini, Michael [Univ. of Wisconsin, Madison, WI (United States)

    2016-05-02

    This NEUP-funded project, NEUP 12-3630, comprises experimental, numerical and analytical studies on high-pressure steam condensation phenomena in a steel containment vessel connected to a water cooling tank, carried out at Oregon State University (OrSU) and the University of Wisconsin at Madison (UW-Madison). In the three-year investigation period, the tasks planned in the original proposal were completed: (1) Performed a scaling study for the full pressure test facility applicable to the reference design for the condensation heat transfer process during design basis accidents (DBAs), modified the existing test facility to route the steady-state secondary steam flow into the high pressure containment for controllable condensation tests, and extended the operations to negative gage pressure conditions (OrSU). (2) Conducted a series of DBA and quasi-steady experiments using the full pressure test facility to provide a reliable high-pressure condensation database (OrSU). (3) Analyzed the experimental data, evaluated the condensation model for the experimental conditions, and predicted the prototypic containment performance under accident conditions (UW-Madison). A film flow model was developed for the scaling analysis, and the results suggest that the 1/3-scale test facility covers a large portion of the laminar film flow regime, leading to a lower average heat transfer coefficient compared to the prototypic value. Although this is conservative in reactor safety analysis, the significant reduction of the heat transfer coefficient (50%) could underestimate the prototypic condensation heat transfer rate, resulting in inaccurate prediction of the decay heat removal capability. Further investigation is thus needed to quantify the scaling distortion for safety analysis code validation. Experimental investigations were performed in the existing MASLWR test facility at OrSU with minor modifications. A total of 13 containment condensation tests were conducted for pressure

  1. Validation of the Osteopenia Sheep Model for Orthopaedic Biomaterial Research

    DEFF Research Database (Denmark)

    Ding, Ming; Danielsen, C.C.; Cheng, L.

    2009-01-01

    Introduction: Currently, the majority of orthopaedic prosthesis and biomaterial research has been based on investigations in normal animals. In most clinical situations, most ... resemble osteoporosis in humans. This study aimed to validate a glucocorticoid-induced osteopenia sheep model for orthopaedic implant and biomaterial research. We hypothesized that a 7-month GC treatment together with a restricted diet, but without OVX, would induce osteopenia. Materials and Methods: Eighteen...

  2. Seine estuary modelling and AirSWOT measurements validation

    Science.gov (United States)

    Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

    2013-04-01

    In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future satellite SWOT (Surface Water and Ocean Topography) mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: KaSPAR, the Ka-band SWOT Phenomenology Airborne Radar. Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream river main channel (to quantify river-aquifer exchange) and some wetlands. The objective of the present work is to validate the ability of AirSWOT and SWOT using a hydrodynamic model of the Seine estuary. In this context, field measurements will be collected by different teams such as the GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), ADES (Data Access Groundwater), ... . These datasets will be used first to validate AirSWOT measurements locally, and then to improve hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows, ...) for 2D validation of AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. For this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by first modelling and then combining two different regions: the Seine River and its estuarine area, and the English Channel. These two simulations are currently being

  3. Empirical Validation of a Thermal Model of a Complex Roof Including Phase Change Materials

    CERN Document Server

    Guichard, Stéphane; Bigot, Dimitri; Malet-Damour, Bruno; Libelle, Teddy; Boyer, Harry

    2015-01-01

    This paper deals with the empirical validation of a building thermal model using a phase change material (PCM) in a complex roof. A mathematical model dedicated to phase change materials, based on the apparent heat capacity method, was implemented in a multi-zone building simulation code, the aim being to increase understanding of the thermal behavior of the whole building with PCM technologies. To empirically validate the model, the methodology is based on both numerical and experimental studies. A parametric sensitivity analysis was performed and a set of parameters of the thermal model was identified for optimization. The use of a generic optimization program called GenOpt, coupled to the building simulation code, made it possible to determine the set of adequate parameters. We first present the empirical validation methodology and the main results of previous work. We then give an overview of GenOpt and its coupling with the building simulation code. Finally, once the optimization results are obtained, comparisons o...

  4. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden); Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)

    2001-12-01

    The present document addresses validation of the stochastic continuum model HYDRASTAR, designed for Monte Carlo simulations of groundwater flow in fractured rocks. Here, validation is defined as a process to demonstrate that a model concept is fit for its purpose. Preferably, validation is carried out by comparison of model predictions with independent field observations and experimental measurements. In addition, other sources can be used to confirm that the model concept gives acceptable results. One method is to compare results with those achieved using other model concepts for the same set of input data. Another method is to compare model results with analytical solutions. The model concept HYDRASTAR has been used in several studies, including performance assessments of hypothetical repositories for spent nuclear fuel. In the performance assessments, the main tasks for HYDRASTAR have been to calculate groundwater travel time distributions, repository flux distributions, path lines and their exit locations. The results have then been used by other model concepts to calculate the near-field release and far-field transport. The aim and framework of the validation process include describing the applicability of the model concept for its purpose in order to build confidence in the concept. Preferably, this is done by comparison of simulation results with corresponding field experiments or field measurements. Here, two comparisons with experimental results are reported. In both cases the agreement was reasonably fair. In the broader and more general context of the validation process, HYDRASTAR results have been compared with other models and analytical solutions. In general, the approximate calculations agree well with the medians of the model ensemble results. Additional indications that HYDRASTAR is suitable for its purpose were obtained from comparisons with results from other model concepts. Several verification studies have been made for

  5. Validation of body composition models for high school wrestlers.

    Science.gov (United States)

    Williford, H N; Smith, J F; Mansfield, E R; Conerly, M D; Bishop, P A

    1986-04-01

    This study investigates the utility of two equations for predicting minimum wrestling weight and three equations for predicting body density for the population of high school wrestlers. A sample of 54 wrestlers was assessed for body density by underwater weighing, residual volume by helium dilution, and selected anthropometric measures. The differences between observed and predicted responses were analyzed for the five models. Four statistical tests were used to validate the equations, including tests for the mean of the differences, the proportion of positive differences, the equality of standard errors from regression, and the equivalence of regression coefficients between the original and second-sample data. The Michael and Katch equation and two Forsyth and Sinning equations (FS1 and FS2) for body density did not predict as well as expected. The Michael and Katch equation tends to overpredict body density while FS1 underpredicts. The FS2 equation, consisting of a constant adjustment to FS1, predicts well near the mean but not at the ends of the sample range. The two Tcheng and Tipton equations produce estimates which slightly but consistently overpredict minimum wrestling weight, the long-form equation by 2.5 pounds and the short form by 3.8 pounds. As a result, the proportion of positive differences is less than would be expected. However, based on the tests for the standard errors and regression coefficients, the evidence does not uniformly reject these two equations.
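
    As an illustration of two of the four validation tests mentioned above (the mean of differences and the proportion of positive differences), the following sketch applies a one-sample t-test and an exact sign test to synthetic observed/predicted body densities. The data and effect sizes are invented for demonstration only.

```python
# Sketch: two of the four validation tests described — a t-test on the mean
# of (observed - predicted) differences and a sign test on the proportion of
# positive differences. All numbers are illustrative, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
observed = rng.normal(1.07, 0.01, size=54)            # body density, g/cm^3
predicted = observed - rng.normal(0.002, 0.005, 54)   # a slightly biased equation

diff = observed - predicted

# Test 1: is the mean of the differences zero? (one-sample t-test)
t_stat, p_mean = stats.ttest_1samp(diff, popmean=0.0)

# Test 2: is the proportion of positive differences 0.5? (exact sign test)
n_pos = int((diff > 0).sum())
p_sign = stats.binomtest(n_pos, n=len(diff), p=0.5).pvalue

print(f"mean bias {diff.mean():+.4f}, t-test p={p_mean:.3f}, sign-test p={p_sign:.3f}")
```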

  6. Non-residential water demand model validated with extensive measurements

    Directory of Open Access Journals (Sweden)

    E. J. Pieterse-Quirijns

    2012-08-01

    Existing guidelines related to the water demand of non-residential buildings are outdated and do not cover the hot water demand needed for the appropriate selection of hot water devices. Moreover, they generally overestimate the peak demand values required for the design of an efficient and reliable water system. Recently, a procedure was developed based on the end-use model SIMDEUM® to derive design rules for peak demand values of both cold and hot water, during various time steps, for several types and sizes of non-residential buildings, i.e. offices, hotels and nursing homes. In this paper, the design rules are validated with measurements of cold and hot water patterns on a per-second basis. The good correlation between the simulated and the measured patterns indicates that the basis of the design rules, the SIMDEUM-simulated standardised buildings, is solid. Moreover, the SIMDEUM-based rules give a better prediction of the measured peak values for cold water flow than the existing guidelines. Furthermore, the new design rules can predict hot water use well. This paper illustrates that the new design rules lead to reliable and improved designs of building installations and water heater capacity, resulting in more hygienic and economical installations.

  7. A community diagnostic tool for chemistry climate model validation

    Directory of Open Access Journals (Sweden)

    A. Gettelman

    2012-09-01

    This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, to ensemble members (simulations with the same model), and/or to many types of observations. The initial construction and application is for coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with the analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifying model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extension by different users for different applications and model types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimal coding.

  8. High-speed AMB machining spindle model updating and model validation

    Science.gov (United States)

    Wroblewski, Adam C.; Sawicki, Jerzy T.; Pesch, Alexander H.

    2011-04-01

    High-Speed Machining (HSM) spindles equipped with Active Magnetic Bearings (AMBs) have been envisioned to be capable of automated self-identification and self-optimization in an effort to accurately calculate parameters for stable high-speed machining operation. With this in mind, this work presents rotor model development accompanied by an automated model-updating methodology, followed by validation of the updated model. The model-updating methodology is developed to address the dynamic inaccuracies of the nominal open-loop plant model when compared with experimental open-loop transfer function data obtained from the built-in AMB sensors. The nominal open-loop model is altered by utilizing an unconstrained optimization algorithm to adjust only parameters that are a result of engineering assumptions and simplifications, in this case Young's modulus of selected finite elements. Minimizing the error of both resonance and anti-resonance frequencies simultaneously (between model and experimental data) takes into account rotor natural frequencies and mode shape information. To verify the predictive ability of the updated rotor model, its performance is assessed at the tool location, which is independent of the experimental transfer function data used in the model-updating procedure. Verification of the updated model is carried out with complementary temporal and spatial response comparisons, substantiating that the updating methodology is effective for the derivation of open-loop models for predictive use.

  9. Hydraulic fracture model comparison study: Complete results

    Energy Technology Data Exchange (ETDEWEB)

    Warpinski, N.R. [Sandia National Labs., Albuquerque, NM (United States)]; Abou-Sayed, I.S. [Mobil Exploration and Production Services (United States)]; Moschovidis, Z. [Amoco Production Co. (US)]; Parker, C. [CONOCO (US)]

    1993-02-01

    Large quantities of natural gas exist in low-permeability reservoirs throughout the US. Characteristics of these reservoirs, however, make production difficult and often uneconomical, and stimulation is required. Because of the diversity of application, hydraulic fracture design models must be able to account for widely varying rock properties, reservoir properties, in situ stresses, fracturing fluids, and proppant loads. As a result, fracture simulation has emerged as a highly complex endeavor that must be able to describe many different physical processes. The objective of this study was to develop a comparative study of hydraulic-fracture simulators in order to provide stimulation engineers with the necessary information to make rational decisions on the type of models most suited for their needs. This report compares the fracture modeling results of twelve different simulators, some of them run in different modes, for eight separate design cases. Comparisons of length, width, height, net pressure, maximum width at the wellbore, average width at the wellbore, and average width in the fracture have been made, both for the final geometry and as a function of time. For the models in this study, differences in fracture length, height and width are often greater than a factor of two. In addition, several comparisons of the same model with different options show a large variability in model output depending upon the options chosen. Two comparisons were made of the same model run by different companies; in both cases the agreement was good. 41 refs., 54 figs., 83 tabs.

  10. Certified reduced basis model validation: A frequentistic uncertainty framework

    OpenAIRE

    Huynh, Dinh Bao Phuong; Knezevic, David; Patera, Anthony T.

    2011-01-01

    We introduce a frequentistic validation framework for assessment — acceptance or rejection — of the consistency of a proposed parametrized partial differential equation model with respect to (noisy) experimental data from a physical system. Our method builds upon the Hotelling T² statistical hypothesis test for bias first introduced by Balci and Sargent in 1984 and subsequently extended by McFarland and Mahadevan (2008). Our approach introduces two new elements: a spectral repre...
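
    A minimal sketch of the Hotelling T² bias test at the core of this kind of framework, applied to synthetic model-minus-experiment residuals; the spectral representation and reduced-basis machinery of the paper are not reproduced here, and all data are invented.

```python
# Sketch: Hotelling T^2 test of the hypothesis that multivariate
# model-vs-experiment residuals have zero mean (no bias). Illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
residuals = rng.normal(0.0, 1.0, size=(30, 4))  # n experiments x p outputs

n, p = residuals.shape
mean = residuals.mean(axis=0)
cov = np.cov(residuals, rowvar=False)

t2 = n * mean @ np.linalg.solve(cov, mean)      # Hotelling T^2 statistic
f_stat = (n - p) / (p * (n - 1)) * t2           # F-distributed under H0
p_value = stats.f.sf(f_stat, p, n - p)

print(f"T^2={t2:.2f}, F={f_stat:.2f}, p={p_value:.3f}")  # small p would flag bias
```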

  11. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian

    2010-01-01

    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases ... across the boundary layer near the trailing edge and to a lesser extent by a smaller boundary layer displacement thickness.

  12. What every urologist should know about surgical trials Part I: Are the results valid?

    Directory of Open Access Journals (Sweden)

    Sohail Bajammal

    2008-01-01

    Surgical interventions have inherent benefits and associated risks. Before implementing a new therapy, we should ascertain its benefits and risks, and assure ourselves that the resources consumed by the intervention will not be exorbitant. Materials and Methods: We suggest a three-step approach to the critical appraisal of a clinical research study that addresses a question of therapy. Readers should ask themselves the following three questions: Are the study results valid? What are the results? And can I apply them to the care of an individual patient? This first review article on surgical trials addresses the question of whether a study can be considered valid. Results: Once the reader has found an article of interest on a urological intervention, it is necessary to assess the quality of the evidence. According to the hierarchy of evidence, a randomized controlled trial is the study design most likely to provide an unbiased estimate of the truth. Important methodological criteria which characterize a high-quality randomized trial include description of allocation concealment, blinding, intention-to-treat analysis, and completeness of follow-up. Failure of investigators to apply these principles may raise concerns about the validity of the study results, thereby making the findings irrelevant. Conclusion: Assessing the validity of a given study is a critical first step when evaluating a clinical research study. Making this process explicit, with guidelines to assess the strength of the available evidence, serves to improve patient care. It will also allow urologists to defend therapeutic interventions based on available evidence and not anecdotes.

  13. Performance results of HESP physical model

    Science.gov (United States)

    Chanumolu, Anantha; Thirupathi, Sivarani; Jones, Damien; Giridhar, Sunetra; Grobler, Deon; Jakobsson, Robert

    2017-02-01

    As a continuation of the published work on the model-based calibration technique with HESP (Hanle Echelle Spectrograph) as a case study, in this paper we present the performance results of the technique. We also describe how the open parameters were chosen in the model for optimization, the accuracy of the glass data, and the handling of discrepancies. It is observed through simulations that discrepancies in the glass data can be identified but not quantified. Having accurate glass data, which can be obtained from the glass manufacturers, is therefore important. The model's performance in various aspects is presented using the ThAr calibration frames from HESP during its pre-shipment tests. We discuss the accuracy of the model predictions, a comparison of its wavelength calibration with conventional empirical fitting, the behaviour of the open parameters during optimization, the model's ability to track instrumental drifts in the spectrum, and the performance of the double fibres. It is observed that the optimized model is able to predict to high accuracy the drifts in the spectrum arising from environmental fluctuations. It is also observed that the pattern in the spectral drifts across the 2D spectrum, which varies from image to image, is predictable with the optimized model. We will also discuss possible science cases where the model can contribute.

  14. Experimental validation of Swy-2 clay standard's PHREEQC model

    Science.gov (United States)

    Szabó, Zsuzsanna; Hegyfalvi, Csaba; Freiler, Ágnes; Udvardi, Beatrix; Kónya, Péter; Székely, Edit; Falus, György

    2017-04-01

    One of the challenges of the present century is to limit greenhouse gas emissions for the mitigation of climate change, which is possible, for example, through a transitional technology, CCS (Carbon Capture and Storage), and, among others, through an increase of the nuclear proportion in the energy mix. Clay minerals are considered to be responsible for the low permeability and sealing capacity of caprocks sealing off stored CO2, and they are also the main constituents of bentonite in high-level radioactive waste disposal facilities. Understanding clay behaviour in these deep geological environments is possible through laboratory batch experiments on well-known standards and coupled geochemical models. Such experimentally validated models are scarce, even though they allow deriving more precise long-term predictions of mineral reactions and of rock and bentonite degradation underground, therefore ensuring the safety of the above technologies and increasing their public acceptance. This ongoing work aims to create a kinetic geochemical model of the Na-montmorillonite standard Swy-2 in the widely used PHREEQC code, supported by solution and mineral composition results from batch experiments. Several four-day experiments have been carried out at a 1:35 rock:water ratio at atmospheric conditions, and with an inert or a supercritical CO2 phase at 100 bar and 80 °C, relevant for the potential Hungarian CO2 reservoir complex. Solution samples have been taken during and after the experiments and their compositions were measured by ICP-OES. The treated solid phase has been analysed by XRD and ATR-FTIR and compared to references (dried Swy-2) measured in parallel. Kinetic geochemical modelling of the experimental conditions has been performed with PHREEQC version 3, using equations and kinetic rate parameters from the USGS report of Palandri and Kharaka (2004). The visualization of the experimental and the numerous modelling results has been automated with R. Experiments and models show very fast

  15. Full-scale validation of a model of algal productivity.

    Science.gov (United States)

    Béchet, Quentin; Shilton, Andy; Guieysse, Benoit

    2014-12-02

    While modeling algal productivity outdoors is crucial to assess the economic and environmental performance of full-scale cultivation, most of the models hitherto developed for this purpose have not been validated under fully relevant conditions, especially with regard to temperature variations. The objective of this study was to independently validate a model of algal biomass productivity accounting for both light and temperature, constructed using parameters derived experimentally from short-term indoor experiments. To do this, the accuracy of a model developed for Chlorella vulgaris was assessed against data collected from photobioreactors operated outdoors (New Zealand) over different seasons, years, and operating conditions (temperature control/no temperature control, batch and fed-batch regimes). The model accurately predicted experimental productivities under all conditions tested, yielding an overall accuracy of ±8.4% over 148 days of cultivation. For the purpose of assessing the feasibility of full-scale algal cultivation, the use of the productivity model was therefore shown to markedly reduce uncertainty in the cost of biofuel production while also eliminating uncertainties in water demand, a critical element of environmental impact assessments. Simulations at five climatic locations demonstrated that temperature control in outdoor photobioreactors would require tremendous amounts of energy without a considerable increase in algal biomass. Prior assessments neglecting the impact of temperature variations on algal productivity in photobioreactors may therefore be erroneous.

  16. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aim to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we face the problem of comparing the results obtained using simulations with what the experiments actually observed. One way to address this problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal.
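
    A consistency test of the kind described can be as simple as a chi-square comparison of binned simulated and observed distributions. The sketch below is illustrative only and uses scipy rather than the compact C++ library mentioned in the record; the bin counts are invented.

```python
# Sketch: chi-square consistency test between a simulated and an observed
# histogram, the kind of check an automated validation suite might run.
import numpy as np
from scipy import stats

simulated = np.array([102, 230, 310, 240, 118])  # Monte Carlo counts per bin
observed = np.array([95, 221, 335, 229, 120])    # experimental counts per bin

# Scale the simulation to the observed total before comparing shapes
expected = simulated * observed.sum() / simulated.sum()
chi2, p = stats.chisquare(observed, f_exp=expected)
print(f"chi2={chi2:.2f}, p={p:.3f}")  # a small p would flag an inconsistency
```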

  17. Updated Results for the Wake Vortex Inverse Model

    Science.gov (United States)

    Robins, Robert E.; Lai, David Y.; Delisi, Donald P.; Mellman, George R.

    2008-01-01

    NorthWest Research Associates (NWRA) has developed an Inverse Model for inverting aircraft wake vortex data. The objective of the inverse modeling is to obtain estimates of the vortex circulation decay and crosswind vertical profiles, using time-history measurements of the lateral and vertical position of aircraft vortices. The Inverse Model performs iterative forward model runs using estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Iterations are performed until a user-defined criterion is satisfied. Outputs from an Inverse Model run are the best estimates of the time history of the vortex circulation derived from the observed data, the vertical crosswind profile, and several vortex parameters. The forward model, named SHRAPA, used in this inverse modeling is a modified version of the Shear-APA model; it is described in Section 2 of this document. Details of the Inverse Model are presented in Section 3. The Inverse Model was applied to lidar-observed vortex data at three airports: FAA-acquired data from San Francisco International Airport (SFO) and Denver International Airport (DEN), and NASA-acquired data from Memphis International Airport (MEM). The results are compared with observed data. This Inverse Model validation is documented in Section 4. A summary is given in Section 5. A user's guide for the inverse wake vortex model is presented in a separate NorthWest Research Associates technical report (Lai and Delisi, 2007a).
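
    The iterative forward-model loop described above follows a generic inverse-modeling pattern: guess parameters, run the forward model, and minimize the misfit with observations. The sketch below illustrates that pattern; the exponential-decay forward model and all parameter names are toy stand-ins, not the SHRAPA model.

```python
# Sketch: generic inverse modeling by repeated forward-model evaluation.
# The forward model here (linear advection + exponentially decaying
# circulation) is an invented toy, purely to show the fitting loop.
import numpy as np
from scipy.optimize import least_squares

def forward_model(params, t):
    gamma0, decay, crosswind = params
    lateral = crosswind * t                      # advection by mean crosswind
    circulation = gamma0 * np.exp(-decay * t)    # decaying circulation
    return np.concatenate([lateral, circulation])

t = np.linspace(0.0, 60.0, 30)                   # wake age, s
true = np.array([400.0, 0.02, 1.5])
observed = forward_model(true, t) + np.random.default_rng(3).normal(0, 2, 60)

def residuals(params):
    return forward_model(params, t) - observed

fit = least_squares(residuals, x0=[300.0, 0.05, 1.0])
print("estimated [Gamma0, decay rate, crosswind]:", fit.x.round(3))
```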

  18. Overview on calibration and validation activities and first results for ESA's Soil Moisture and Ocean Salinity Mission

    Science.gov (United States)

    Mecklenburg, Susanne; Bouzinac, Catherine; Delwart, Steven; Lopez-Baeza, Ernesto

    ... exercise to verify that the methodology proposed actually meets the foreseen performances. Other activities include the deployment of the ground-based, ESA-funded ELBARA radiometers. Also, in collaboration with the Technical University Vienna, ESA funds the establishment of a soil moisture network data hosting facility in support of the SMOS calibration and validation activities. The validation of sea surface salinity data products will be a challenging task requiring a highly accurate and stable instrument calibration. At local scales, the foreseen validation activities are focused on a better understanding of the emission of L-band radiation from the sea surface through dedicated airborne campaigns, whereas validation at global scales will rely on buoy networks and basin-scale ocean models. Close collaboration with the NASA Aquarius Team will further contribute to the validation of sea surface salinity data products. A variety of campaigns, such as DOMEX, CoSMOS, WISE, LOSAC, EUROSTARRS, FROG and SMOSREX, have been (and will be) performed to investigate uncertainties in the soil moisture and ocean salinity retrieval. The major aspects to investigate with regard to soil moisture are the influence of the various types of vegetation and their seasonal variability, as well as the influence of surface roughness. Over oceans, the impact of sea-surface state on the polarimetric radiometric signal is the main issue. The DOMEX campaigns will provide information for vicarious calibration over Antarctica. The presentation will provide an overview of the calibration and validation activities carried out in the SMOS commissioning phase as well as first results.

  19. Apar-T: code, validation, and physical interpretation of particle-in-cell results

    CERN Document Server

    Melzani, Mickaël; Walder, Rolf; Folini, Doris; Favre, Jean M; Krastanov, Stefan; Messmer, Peter

    2013-01-01

    We present the parallel particle-in-cell (PIC) code Apar-T and, more importantly, address the fundamental question of the relations between the PIC model, the Vlasov-Maxwell theory, and real plasmas. First, we present four validation tests: spectra from simulations of thermal plasmas, linear growth rates of the relativistic tearing instability and of the filamentation instability, and the non-linear filamentation merging phase. For the filamentation instability we show that the effective growth rates measured on the total energy can differ by more than 50% from the linear cold predictions and from the fastest modes of the simulation. Second, we detail a new method for initial loading of Maxwell-Jüttner particle distributions with relativistic bulk velocity and relativistic temperature, and explain why the traditional method with individual particle boosting fails. Third, we scrutinize the question of what description of physical plasmas is obtained by PIC models. These models rely on two building blocks: coarse...

  20. Statistical validation of high-dimensional models of growing networks

    CERN Document Server

    Medo, Matus

    2013-01-01

    The abundance of models of complex networks and the current insufficient validation standards make it difficult to judge which models are strongly supported by data and which are not. We focus here on likelihood maximization methods for models of growing networks with many parameters and compare their performance on artificial and real datasets. While the high dimensionality of the parameter space harms the performance of direct likelihood maximization on artificial data, this can be improved by introducing a suitable penalization term. Likelihood maximization on real data shows that the presented approach is able to discriminate among available network models. To make large-scale datasets accessible to this kind of analysis, we propose a subset sampling technique and show that it yields substantial model evidence in a fraction of the time necessary for the analysis of the complete data.
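
    The penalized likelihood maximization idea can be illustrated in miniature: the sketch below fits the exponent of a power-law attachment kernel by maximizing a penalized log-likelihood. The kernel, penalty weight, and data are assumptions chosen for demonstration, far simpler than the high-dimensional models studied in the paper.

```python
# Sketch: penalized maximum-likelihood fitting of a one-parameter
# attachment kernel k**alpha from synthetic "chosen degree" data.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
degrees = np.arange(1, 51)

def choice_probs(alpha):
    w = degrees.astype(float) ** alpha
    return w / w.sum()

true_alpha = 1.2
choices = rng.choice(degrees, size=2000, p=choice_probs(true_alpha))

def penalized_nll(alpha, lam=0.1):
    p = choice_probs(alpha)
    loglik = np.log(p[choices - 1]).sum()   # degrees 1..50 map to indices 0..49
    return -loglik + lam * alpha ** 2       # L2 penalty on the parameter

res = minimize_scalar(penalized_nll, bounds=(0.0, 3.0), method="bounded")
print(f"estimated alpha = {res.x:.2f} (true {true_alpha})")
```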

  1. Model selection, identification and validation in anaerobic digestion: a review.

    Science.gov (United States)

    Donoso-Bravo, Andres; Mailier, Johan; Martin, Cristina; Rodríguez, Jorge; Aceves-Lara, César Arturo; Vande Wouwer, Alain

    2011-11-01

    Anaerobic digestion enables waste (water) treatment and energy production in the form of biogas. The successful implementation of this process has led to increasing interest worldwide. However, anaerobic digestion is a complex biological process, in which hundreds of microbial populations are involved, and whose start-up and operation are delicate issues. In order to better understand the process dynamics and to optimize the operating conditions, the availability of dynamic models is of paramount importance. Such models have to be inferred from prior knowledge and experimental data collected from real plants. Modeling and parameter identification are vast subjects, offering a wealth of approaches and methods, which can be difficult to fully grasp for scientists and engineers dedicated to plant operation and improvement. This review article discusses existing modeling frameworks and methodologies for parameter estimation and model validation in the field of anaerobic digestion processes. The point of view is pragmatic, intentionally focusing on simple but efficient methods.

  2. Validation of a functional model for integration of safety into process system design

    DEFF Research Database (Denmark)

    Wu, J.; Lind, M.; Zhang, X.

    2015-01-01

    The qualitative modeling paradigm offers process systems engineering a potential for developing effective tools for handling issues related to process safety. A qualitative functional modeling environment can accommodate different levels of abstraction for capturing knowledge associated ... with the process system functionalities as required for the intended safety applications. To provide scientific rigor and facilitate the acceptance of qualitative modelling, this contribution focuses on developing a scientifically based validation method for functional models. The Multilevel Flow Modeling (MFM) ... behavior sufficiently well. With the reasoning capability provided by the MFM syntax and semantics, the validation procedure is illustrated on an MFM model of a three-phase separator system. The MFM model reasoning results compare successfully against analysis results from API RP 14C.

  3. Experimental results and validation of a method to reconstruct forces on the ITER test blanket modules

    Energy Technology Data Exchange (ETDEWEB)

    Zeile, Christian, E-mail: christian.zeile@kit.edu; Maione, Ivan A.

    2015-10-15

    Highlights: • An in-operation force measurement system for the ITER EU HCPB TBM has been developed. • The force reconstruction methods are based on strain measurements on the attachment system. • An experimental setup and a corresponding mock-up have been built. • A set of test cases representing ITER-relevant excitations has been used for validation. • The influence of modeling errors on the force reconstruction has been investigated. - Abstract: In order to reconstruct forces on the test blanket modules in ITER, two force reconstruction methods, the augmented Kalman filter and a model predictive controller, have been selected and developed to estimate the forces based on strain measurements on the attachment system. A dedicated experimental setup with a corresponding mock-up has been designed and built to validate these methods. A set of test cases has been defined to represent possible excitations of the system. It has been shown that the errors in the estimated forces mainly depend on the accuracy of the identified model used by the algorithms. Furthermore, it has been found that a minimum of 10 strain gauges is necessary to allow for a low error in the reconstructed forces.
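
    The quasi-static core of strain-based force reconstruction can be sketched as a least-squares inversion of a calibrated sensitivity matrix; the augmented Kalman filter and model predictive estimator in the record add system dynamics on top of this idea. All matrices below are randomly generated placeholders, not the TBM attachment model.

```python
# Sketch: quasi-static force reconstruction from strain gauges. A calibrated
# sensitivity matrix maps applied forces to gauge strains; noisy strain
# readings are inverted by least squares. All values are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n_gauges, n_forces = 10, 3                 # the record notes >= 10 gauges needed
S = rng.normal(size=(n_gauges, n_forces))  # strain-per-unit-force sensitivities

true_forces = np.array([1200.0, -300.0, 850.0])            # N
strains = S @ true_forces + rng.normal(0, 5.0, n_gauges)   # noisy readings

est, *_ = np.linalg.lstsq(S, strains, rcond=None)
print("reconstructed forces [N]:", est.round(1))
```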

  4. Modeling Malaysia's Energy System: Some Preliminary Results

    Directory of Open Access Journals (Sweden)

    Ahmad M. Yusof

    2011-01-01

    Problem statement: The current dynamic and fragile world energy environment necessitates the development of a new energy model that caters specifically to the analysis of Malaysia's energy scenarios. Approach: The model is a network flow model that traces the flow of energy carriers from their sources (import and mining), through conversion and transformation processes for the production of energy products, to their final destinations (energy demand sectors). The integration with the economic sectors is done exogenously by specifying the annual sectoral energy demand levels. The model in turn optimizes the energy variables for a specified objective function to meet those demands. Results: By minimizing the intertemporal petroleum product imports for the crude oil system, the annual extraction level of Tapis blend is projected at 579,600 barrels per day. The aggregate demand for petroleum products is projected to grow at 2.1% per year, while motor gasoline and diesel constitute 42 and 38% of the petroleum product demand mix, respectively, over the 5-year planning period. Petroleum product imports are expected to grow at 6.0% per year. Conclusion: The preliminary results indicate that the model performs as expected. Other types of energy carriers, such as natural gas, coal and biomass, will thus be added to the energy system for the overall development of the Malaysia energy model.

  5. Contribution to a dynamic wind turbine model validation from a wind farm islanding experiment

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Kaas; Pedersen, Knud Ole Helgesen; Poulsen, Niels Kjølstad;

    2003-01-01

    Measurements from an islanding experiment on the Rejsby Hede wind farm, Denmark, are used for the validation of the dynamic model of grid-connected, stall-controlled wind turbines equipped with induction generators. The simulated results are found to be in good agreement with the measurements ... and possible discrepancies are explained. The work on wind turbine model validation relates to dynamic stability investigations on the incorporation of a large amount of wind power in the Danish power grid, where the dynamic wind turbine model is applied.

  6. Parameterization and Validation of an Integrated Electro-Thermal LFP Battery Model

    Science.gov (United States)

    2012-01-01

    ... an equivalent circuit as seen in Fig. 1. The double RC model structure is a good choice for this battery chemistry, as shown in [25]. The two RC ... the average of the charge and discharge curves taken at very low current (C/20), since the LiFePO4 cell chemistry is known to yield a hysteresis effect ... condition. The electro-thermal model is implemented in Simulink to validate its performance under the UAC experiment

  7. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    Science.gov (United States)

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70 kg metabolic clearance, 0.698 (±0.358) renal clearance as a fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation-cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid
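
    The two summary statistics used for the external validation, MDPE (bias) and MDAPE (precision), are straightforward to compute; the sketch below uses invented serum levels purely to show the arithmetic.

```python
# Sketch: Median Prediction Error (MDPE, bias) and Median Absolute Prediction
# Error (MDAPE, precision) from predicted vs measured serum levels.
# All numbers are illustrative placeholders.
import numpy as np

measured = np.array([4.1, 7.9, 6.3, 10.2, 5.5])    # mg/L
predicted = np.array([4.0, 8.3, 6.0, 10.5, 5.3])   # model predictions, mg/L

pe = 100.0 * (predicted - measured) / measured      # prediction error, %
mdpe = np.median(pe)                                # bias
mdape = np.median(np.abs(pe))                       # precision

print(f"MDPE {mdpe:+.2f}%  MDAPE {mdape:.2f}%")
```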

  8. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because complete system knowledge, from input voltage to output sound pressure level, is required. ... There are, however, many advantages that could be harvested from such knowledge, such as size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures...

  9. Validation of a Business Model for Cultural Heritage Institutions

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2015-01-01

    The paper proposes a business model for optimizing the efficiency of the interaction between all actors involved in the cultural heritage sector, such as galleries, libraries, archives and museums (GLAM). The validation of the business model is the subject of analyses and implementations in a real environment carried out by different cultural institutions. The implementation of virtual exhibitions on mobile devices is described and analyzed as a key factor for increasing the visibility of cultural heritage. New perspectives on the development of virtual exhibitions for mobile devices are considered. A study on the number of visitors to cultural institutions is carried out and ways to increase the number of visitors are described.

  10. Reduced-complexity modeling of braided rivers: Assessing model performance by sensitivity analysis, calibration, and validation

    Science.gov (United States)

    Ziliani, L.; Surian, N.; Coulthard, T. J.; Tarantola, S.

    2013-12-01

    This paper addresses an important question of modeling stream dynamics: How may numerical models of braided stream morphodynamics be rigorously and objectively evaluated against a real case study? Using simulations from the Cellular Automaton Evolutionary Slope and River (CAESAR) reduced-complexity model (RCM) of a 33 km reach of a large gravel-bed river (the Tagliamento River, Italy), this paper aims to (i) identify a sound strategy for calibration and validation of RCMs, (ii) investigate the effectiveness of multiperformance model assessments, and (iii) assess the potential of using CAESAR at mesospatial and mesotemporal scales. The approach used has three main steps: first sensitivity analysis (using a screening method and a variance-based method), then calibration, and finally validation. This approach allowed us to analyze 12 input factors initially and then to focus calibration only on the factors identified as most important. Sensitivity analysis and calibration were performed on a 7.5 km subreach, using a hydrological time series of 20 months, while validation used the whole 33 km study reach over a period of 8 years (2001-2009). CAESAR was able to reproduce the macromorphological changes of the study reach and gave good results for annual bed load sediment estimates, which turned out to be consistent with measurements in other large gravel-bed rivers, but showed a poorer performance in reproducing the characteristics of the braided channel (e.g., braiding intensity). The approach developed in this study can be effectively applied in other similar RCM contexts, allowing the use of RCMs not only in an explorative manner but also for obtaining quantitative results and scenarios.
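
    The Nash-Sutcliffe efficiency used in this kind of assessment (and in the DRAINMOD study later in this list) compares squared model errors to the variance of the observations; a score of 1 is a perfect fit and 0 means the model is no better than the observed mean. A minimal sketch with invented data:

```python
# Sketch: Nash-Sutcliffe efficiency (NSE) for scoring a model against
# observations. The data arrays are illustrative placeholders.
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / \
                 np.sum((observed - observed.mean()) ** 2)

obs = [12.0, 18.5, 9.3, 25.1, 14.8]   # e.g. observed volumes per period
sim = [11.2, 19.4, 10.1, 23.8, 15.5]  # simulated counterparts
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")
```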

  11. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  13. Modelling rainfall erosion resulting from climate change

    Science.gov (United States)

    Kinnell, Peter

    2016-04-01

    It is well known that soil erosion leads to declines in agricultural productivity and contributes to declining water quality. The current widely used models for determining soil erosion for management purposes in agriculture focus on long-term (~20 years) average annual soil loss and are not well suited to determining variations that occur over short timespans and as a result of climate change. Soil loss resulting from rainfall erosion is directly dependent on the product of runoff and sediment concentration, both of which are likely to be influenced by climate change. This presentation demonstrates the capacity of models like the USLE, USLE-M and WEPP to predict variations in runoff and erosion associated with rainfall events eroding bare fallow plots in the USA, with a view to modelling rainfall erosion in areas subject to climate change.
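
    The USLE family of models referred to above has a simple multiplicative structure: average annual soil loss is the product of an erosivity term and several susceptibility factors. The factor values in the sketch below are illustrative, not calibrated to any site.

```python
# Sketch: the multiplicative structure of the USLE, A = R * K * LS * C * P.
# All factor values below are invented for illustration.
R = 1700.0   # rainfall erosivity, MJ mm / (ha h yr)
K = 0.032    # soil erodibility, t ha h / (ha MJ mm)
LS = 1.2     # slope length-steepness factor (dimensionless)
C = 0.25     # cover-management factor (dimensionless)
P = 1.0      # support practice factor (dimensionless)

A = R * K * LS * C * P   # average annual soil loss, t / (ha yr)
print(f"predicted average annual soil loss: {A:.1f} t/ha/yr")
```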

  14. Organic acid modeling and model validation: Workshop summary. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.J.; Eilers, J.M.

    1992-08-14

    A workshop was held in Corvallis, Oregon on April 9-10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  15. Organic acid modeling and model validation: Workshop summary

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.J.; Eilers, J.M.

    1992-08-14

    A workshop was held in Corvallis, Oregon on April 9-10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  16. Simulation Modeling of Radio Direction Finding Results

    Directory of Open Access Journals (Sweden)

    K. Pelikan

    1994-12-01

    It is sometimes difficult to determine analytically the error probabilities of direction finding results when evaluating algorithms of practical interest. Probabilistic simulation models are described in this paper that can be used to study the error performance of new direction finding systems or of geographical modifications to existing configurations.

  17. Validation of advanced NSSS simulator model for loss-of-coolant accidents

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.P.; Chang, S.K.; Huang, H.C. [Nuclear Training Branch, Northeast Utilities, Waterford, CT (United States)

    1995-09-01

    The replacement of the NSSS (Nuclear Steam Supply System) model on the Millstone 2 full-scope simulator has significantly increased its fidelity in simulating adverse conditions in the RCS. The new simulator NSSS model is a real-time derivative of the Nuclear Plant Analyzer by ABB. The thermal-hydraulic model is a five-equation, non-homogeneous model for water, steam, and non-condensible gases. The neutronic model is a three-dimensional nodal diffusion model. In order to certify the new NSSS model for operator training, an extensive validation effort has been performed by benchmarking the model performance against RELAP5/MOD2. This paper presents the validation results for the cases of small- and large-break loss-of-coolant accidents (LOCA). Detailed comparisons of the phenomena of reflux condensation, phase separation, and two-phase natural circulation are discussed.

  18. Implementation and Validation of IEC Generic Type 1A Wind Turbine Generator Model

    DEFF Research Database (Denmark)

    Zhao, Haoran; Wu, Qiuwei; Margaris, Ioannis

    2015-01-01

    This paper presents the implementation of the International Electrotechnical Commission (IEC) generic Type 1A wind turbine generator (WTG) model in Power Factory (PF) and the validation of the implemented model against field measurements. The IEC generic Type 1A WTG model structure is briefly described. The details are explained regarding how the two-mass mechanical model is implemented when the generator mass is included in the PF built-in generator model. In order to verify the IEC generic Type 1A WTG model, the model-to-field-measurement validation method was employed. The model to field ... the deviations between the simulation results and the measurements were calculated according to the voltage dip windows and the index definition specified in the IEC 61400-27-1 committee draft.
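
    The two-mass mechanical model mentioned above couples the turbine and generator inertias through shaft stiffness and damping. The sketch below integrates a per-unit version of those equations for a step in electrical torque; the 50 Hz base frequency and all parameter values are illustrative assumptions, not the IEC defaults.

```python
# Sketch: per-unit two-mass drivetrain model — turbine and generator inertias
# coupled by shaft stiffness and damping. Parameters are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

wb = 2 * np.pi * 50          # electrical base speed, rad/s (50 Hz grid assumed)
Ht, Hg = 4.0, 0.5            # turbine and generator inertia constants, s
Ks, Ds = 0.3, 0.01           # shaft stiffness (p.u./el. rad) and damping (p.u.)

def two_mass(t, y, T_aero, T_elec):
    w_t, w_g, theta = y                        # speed deviations and shaft twist
    T_shaft = Ks * theta + Ds * (w_t - w_g)    # torque transmitted by the shaft
    return [(T_aero - T_shaft) / (2 * Ht),     # turbine rotor acceleration
            (T_shaft - T_elec) / (2 * Hg),     # generator acceleration
            wb * (w_t - w_g)]                  # twist angle dynamics

# Start at equilibrium for 1.0 p.u. torque, then hold the electrical torque at
# 0.5 p.u. (a crude proxy for a voltage dip) and watch both masses accelerate.
sol = solve_ivp(two_mass, (0.0, 5.0), [0.0, 0.0, 1.0 / Ks],
                args=(1.0, 0.5), max_step=0.01)
print("speed deviations after 5 s (p.u.):", sol.y[0, -1], sol.y[1, -1])
```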

  19. Quantitative analysis of toxic and essential elements in human hair. Clinical validity of results.

    Science.gov (United States)

    Kosanovic, Melita; Jokanovic, Milan

    2011-03-01

    Over the last three decades, there has been increasing awareness of environmental and occupational exposure to toxic or potentially toxic trace elements. The evolution of biological monitoring includes knowledge of the kinetics of toxic and/or essential elements and of the adverse health effects related to their exposure. The debate over whether hair is a valid sample for biomonitoring continues to attract the attention of analysts, health care professionals, and environmentalists. Although researchers have found many correlations of essential elements with diseases, metabolic disorders, environmental exposures, and nutritional status, opponents of the concept of hair analysis object that hair samples are unreliable due to the influence of external factors. This review discusses the validity of hair as a sample for biomonitoring of essential and toxic elements, with emphasis on pre-analytical, analytical, and post-analytical factors influencing results.

  20. Validation of spectral gas radiation models under oxyfuel conditions. Part A: Gas cell experiments

    DEFF Research Database (Denmark)

    Becher, Valentin; Clausen, Sønnik; Fateev, Alexander;

    2011-01-01

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard CFD spectral gas radiation models for air combustion are outside their validity range. This series of three articles provides a common spectral basis for the validation...

  1. Experimental Validation of a Mathematical Model for Seabed Liquefaction Under Waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen

    2012-01-01

    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d50 = 0.070 mm) in a wave flume with a soil pit. Waves with wave heights in the range of 7.7-18 cm, a 55 cm water depth and a 1.6 s wave period enabled us to study pore water pressure buildup in both the liquefaction and no-liquefaction regimes. The experimental data were used to validate the model. A numerical example

  2. Experimental validation of a mathematical model for seabed liquefaction in waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen

    2011-01-01

    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d50 = 0.070 mm) in a wave flume with a soil pit. Waves with wave heights in the range 7.7-18 cm, a water depth of 55 cm and a wave period of 1.6 s enabled us to study pore water pressure buildup in both the liquefaction and no-liquefaction regimes. The experimental data were used to validate the model. A numerical...

  3. Towards Automatic Validation and Healing of CityGML Models for Geometric and Semantic Consistency

    Science.gov (United States)

    Alam, N.; Wagner, D.; Wewetzer, M.; von Falkenhausen, J.; Coors, V.; Pries, M.

    2013-09-01

    A steadily growing number of application fields for large 3D city models have emerged in recent years. As in many other domains, data quality is recognized as a key factor for successful business, and quality management is mandatory in the production chain nowadays. Automated domain-specific tools are widely used for validation of business-critical data, but common standards defining correct geometric modeling are still not precise enough to provide a sound basis for data validation of 3D city models. Although the workflow for 3D city models is well established from data acquisition to processing, analysis and visualization, quality management is not yet a standard part of this workflow. Processing data sets with unclear specification leads to erroneous results and application defects. We show that this problem persists even if data are standard compliant. Validation results of real-world city models are presented to demonstrate the potential of the approach. A tool to repair the errors detected during the validation process is under development; first results are presented and discussed. The goal is to heal defects of the models automatically and export a corrected CityGML model.

  4. Model calibration and validation of an impact test simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, F. M. (François M.); Wilson, A. C. (Amanda C.); Havrilla, G. N. (George N.)

    2001-01-01

    This paper illustrates the methodology being developed at Los Alamos National Laboratory for the validation of numerical simulations for engineering structural dynamics. The application involves the transmission of a shock wave through an assembly that consists of a steel cylinder and a layer of elastomeric (hyper-foam) material. The assembly is mounted on an impact table to generate the shock wave. The input acceleration and three output accelerations are measured. The main objective of the experiment is to develop a finite element representation of the system capable of reproducing the test data with acceptable accuracy. Foam layers of various thicknesses and several drop heights are considered during impact testing. Each experiment is replicated several times to estimate the experimental variability. Instead of focusing on the calibration of input parameters for a single configuration, the numerical model is validated for its ability to predict the response of three different configurations (various combinations of foam thickness and drop height). Design of Experiments is implemented to perform parametric and statistical variance studies. Surrogate models are developed to replace the computationally expensive numerical simulation. Variables of the finite element model are separated into calibration variables and control variables. The models are calibrated to provide numerical simulations that correctly reproduce the statistical variation of the test configurations. The calibration step also provides inference for the parameters of a high strain-rate dependent material model of the hyper-foam. After calibration, the validity of the numerical simulation is assessed through its ability to predict the response of a fourth test setup.

  5. Calibration and validation of DRAINMOD to model bioretention hydrology

    Science.gov (United States)

    Brown, R. A.; Skaggs, R. W.; Hunt, W. F.

    2013-04-01

    Previous field studies have shown that the hydrologic performance of bioretention cells varies greatly because of factors such as underlying soil type, physiographic region, drainage configuration, surface storage volume, drainage area to bioretention surface area ratio, and media depth. To more accurately describe bioretention hydrologic response, a long-term hydrologic model that generates a water balance is needed. Some current bioretention models lack the ability to perform long-term simulations and others have never been calibrated from field monitored bioretention cells with underdrains. All peer-reviewed models lack the ability to simultaneously perform both of the following functions: (1) model an internal water storage (IWS) zone drainage configuration and (2) account for soil-water content using the soil-water characteristic curve. DRAINMOD, a widely-accepted agricultural drainage model, was used to simulate the hydrologic response of runoff entering a bioretention cell. The concepts of water movement in bioretention cells are very similar to those of agricultural fields with drainage pipes, so many bioretention design specifications corresponded directly to DRAINMOD inputs. Detailed hydrologic measurements were collected from two bioretention field sites in Nashville and Rocky Mount, North Carolina, to calibrate and test the model. Each field site had two sets of bioretention cells with varying media depths, media types, drainage configurations, underlying soil types, and surface storage volumes. After 12 months, one of these characteristics was altered - surface storage volume at Nashville and IWS zone depth at Rocky Mount. At Nashville, during the second year (post-repair period), the Nash-Sutcliffe coefficients for drainage and exfiltration/evapotranspiration (ET) both exceeded 0.8 during the calibration and validation periods. During the first year (pre-repair period), the Nash-Sutcliffe coefficients for drainage, overflow, and exfiltration
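
    The Nash-Sutcliffe coefficient reported above is easy to reproduce. Below is a minimal Python sketch with made-up monthly drainage volumes; the values are illustrative only and are not the study's data.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit; 0.0 means the
    model predicts no better than the mean of the observations."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical monthly drainage volumes (mm), for illustration only.
obs = [12.0, 30.5, 8.2, 41.0, 25.3, 18.7]
sim = [10.8, 28.9, 9.5, 43.2, 24.1, 20.0]
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")  # the study reports values > 0.8
```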

  6. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    Science.gov (United States)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged it is often not among the most popular tasks in ocean modelling. In order to ease the validation work a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time-series at stations, vertical profiles, surface fields or along track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, features all parts of the validation process - ranging from read-in procedures of datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. read-in procedures, forms a module in which all available functions of this particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks, new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned for each validation task in user specific settings, which are externally stored in so-called namelists and gather all information of the used datasets as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre. Hereby the performance of any new product

  7. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony gives rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's levels of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately-sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and to circadian phase. We will calculate the sensitivity and specificity of each model prediction under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.
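
    Sensitivity and specificity of such a model reduce to confusion-matrix counts once the predictions and the PVT lapses are binarized. The sketch below uses invented flags purely for illustration; the thresholds and scheduling conditions of the actual study are not represented.

```python
import numpy as np

def sensitivity_specificity(predicted_fatigued, observed_lapse):
    """Confusion-matrix rates for binary fatigue predictions vs. PVT lapses."""
    p = np.asarray(predicted_fatigued, dtype=bool)
    o = np.asarray(observed_lapse, dtype=bool)
    tp = np.sum(p & o)    # model predicted fatigue, lapse occurred
    tn = np.sum(~p & ~o)  # model predicted alertness, no lapse
    fn = np.sum(~p & o)   # missed lapse
    fp = np.sum(p & ~o)   # false alarm
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical flags for eight duty periods.
sens, spec = sensitivity_specificity([1, 1, 0, 1, 0, 0, 1, 0],
                                     [1, 0, 0, 1, 0, 1, 1, 0])
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```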

  8. Experimental testing procedures and dynamic model validation for vanadium redox flow battery storage system

    Science.gov (United States)

    Baccino, Francesco; Marinelli, Mattia; Nørgård, Per; Silvestro, Federico

    2014-05-01

    The paper aims at characterizing the electrochemical and thermal parameters of a 15 kW/320 kWh vanadium redox flow battery (VRB) installed in the SYSLAB test facility of the DTU Risø Campus and experimentally validating the proposed dynamic model realized in Matlab-Simulink. The adopted testing procedure consists of analyzing the voltage and current values during a power reference step-response and evaluating the relevant electrochemical parameters such as the internal resistance. The results of different tests are presented and used to define the electrical characteristics and the overall efficiency of the battery system. The test procedure is general and could also be used for other storage technologies. The proposed storage model is suitable for electrical studies and has general validity. Finally, the model simulation outputs are compared with experimental measurements during a discharge-charge sequence.
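
    The internal-resistance estimate at the heart of the testing procedure can be illustrated by a simple calculation: the instantaneous terminal-voltage jump across a current step divided by the step size. The numbers below are invented and are not SYSLAB measurements.

```python
def internal_resistance(v_before, v_after, i_before, i_after):
    """Estimate internal resistance from the instantaneous voltage jump
    that accompanies a current step: R = -dV/dI (discharge positive)."""
    return -(v_after - v_before) / (i_after - i_before)

# Hypothetical terminal measurements around a discharge power step (V, A).
r_int = internal_resistance(v_before=48.2, v_after=46.9,
                            i_before=20.0, i_after=60.0)
print(f"R_internal = {r_int * 1000:.1f} mOhm")
```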

  9. Multicomponent aerosol dynamics model UHMA: model development and validation

    Directory of Open Access Journals (Sweden)

    H. Korhonen

    2004-01-01

    A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H2SO4-H2O and ternary H2SO4-NH3-H2O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and atmospheric transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3-4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.

  11. Multicomponent aerosol dynamics model UHMA: model development and validation

    Science.gov (United States)

    Korhonen, H.; Lehtinen, K. E. J.; Kulmala, M.

    2004-05-01

    A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H2SO4-H2O and ternary H2SO4-NH3-H2O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and atmospheric transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3-4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.

  12. Solar Module Modeling, Simulation And Validation Under Matlab / Simulink

    Directory of Open Access Journals (Sweden)

    M.Diaw

    2016-09-01

    Solar modules are systems which convert sunlight into electricity using the physics of semiconductors. Mathematical modeling of these systems uses weather data such as irradiance and temperature as inputs and provides the current, voltage or power as outputs, which allows plotting the characteristic of current I as a function of voltage V for photovoltaic cells. In this work, we have developed a one-diode model of a photovoltaic module under the Matlab/Simulink environment. From this model, we have plotted the I-V and P-V characteristic curves of the solar cell for different values of temperature and sunlight. The validation was done by comparing the experimental power curve of a HORONYA 20 W solar panel with the one obtained by the model.
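
    A standard way to generate such I-V and P-V curves is the single-diode equation. The abstract does not list the module parameters, so the sketch below uses generic values for a 36-cell module of roughly 20 W, not the HORONYA panel's datasheet.

```python
import numpy as np

def pv_current(v, i_ph=1.25, i_0=1e-9, n=1.0, t_cell=298.15, n_cells=36):
    """Single-diode module model, series/shunt resistances neglected:
    I = I_ph - I_0 * (exp(q*V / (n*k*T*N_cells)) - 1)."""
    q, k = 1.602e-19, 1.381e-23
    v_t = n * k * t_cell / q * n_cells  # thermal voltage of the cell string
    return i_ph - i_0 * np.expm1(np.asarray(v) / v_t)

v = np.linspace(0.0, 20.0, 400)        # module voltage sweep (V)
i = np.clip(pv_current(v), 0.0, None)  # clip negative currents beyond V_oc
p = v * i
print(f"approximate maximum power point: {p.max():.1f} W at {v[p.argmax()]:.1f} V")
```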

  13. Model development and validation of a solar cooling plant

    Energy Technology Data Exchange (ETDEWEB)

    Zambrano, Darine; Garcia-Gabin, Winston [Escuela de Ingenieria Electrica, Facultad de Ingenieria, Universidad de Los Andes, La Hechicera, Merida 5101 (Venezuela); Bordons, Carlos; Camacho, Eduardo F. [Departamento de Ingenieria de Sistemas y Automatica, Escuela Superior de Ingenieros, Universidad de Sevilla, Camino de Los Descubrimientos s/n, Sevilla 41092 (Spain)

    2008-03-15

    This paper describes the dynamic model of a solar cooling plant that has been built for demonstration purposes using market-available technology and has been successfully operational since 2001. The plant uses hot water coming from a field of solar flat collectors which feed a single-effect absorption chiller of 35 kW nominal cooling capacity. The work includes model development based on first principles and model validation with a set of experiments carried out on the real plant. The simulation model has been built in a modular way and can be adapted to other solar cooling plants, since the main modules (solar field, absorption machine, accumulators and auxiliary heater) can be easily replaced. This simulator is a powerful tool for solar cooling systems, both during the design phase, when it can be used for component selection, and for the development and testing of control strategies.

  14. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    Science.gov (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent with the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  15. Deviatoric constitutive model: domain of strain rate validity

    Energy Technology Data Exchange (ETDEWEB)

    Zocher, Marvin A [Los Alamos National Laboratory

    2009-01-01

    A case is made for using an enhanced methodology in determining the parameters that appear in a deviatoric constitutive model. Predictability rests on our ability to solve a properly posed initial boundary value problem (IBVP), which incorporates an accurate reflection of material constitutive behavior. That reflection is provided through the constitutive model. Moreover, the constitutive model is required for mathematical closure of the IBVP. Common practice in the shock physics community is to divide the Cauchy tensor into spherical and deviatoric parts, and to develop separate models for spherical and deviatoric constitutive response. Our focus shall be on the Cauchy deviator and deviatoric constitutive behavior. Discussions related to the spherical part of the Cauchy tensor are reserved for another time. A number of deviatoric constitutive models have been developed for use in the solution of IBVPs of interest to those working in the field of shock physics. All of these models are phenomenological and contain a number of parameters that must be determined in light of experimental data. The methodology employed in determining these parameters dictates the loading regime over which the model can be expected to be accurate. The focus of this paper is the methodology employed in determining model parameters and the consequences of that methodology as it relates to the domain of strain rate validity. We shall begin by describing the methodology that is typically employed. We shall discuss limitations imposed upon predictive capability by the typically employed methodology. We shall propose a modification to the typically employed methodology that significantly extends the domain of strain rate validity.

  16. Modeling, Robust Control, and Experimental Validation of a Supercavitating Vehicle

    Science.gov (United States)

    Escobar Sanabria, David

    This dissertation considers the mathematical modeling, control under uncertainty, and experimental validation of an underwater supercavitating vehicle. By traveling inside a gas cavity, a supercavitating vehicle reduces hydrodynamic drag, increases speed, and minimizes power consumption. The attainable speed and power efficiency make these vehicles attractive for undersea exploration, high-speed transportation, and defense. However, the benefits of traveling inside a cavity come with difficulties in controlling the vehicle dynamics. The main challenge is the nonlinear force that arises when the back-end of the vehicle pierces the cavity. This force, referred to as planing, leads to oscillatory motion and instability. Control technologies that are robust to planing and suited for practical implementation need to be developed. To enable these technologies, a low-order vehicle model that accounts for inaccuracy in the characterization of planing is required. Additionally, an experimental method to evaluate possible pitfalls in the models and controllers is necessary before undersea testing. The major contribution of this dissertation is a unified framework for mathematical modeling, robust control synthesis, and experimental validation of a supercavitating vehicle. First, we introduce affordable experimental methods for mathematical modeling and controller testing under planing and realistic flow conditions. Then, using experimental observations and physical principles, we create a low-order nonlinear model of the longitudinal vehicle motion. This model quantifies the planing uncertainty and is suitable for robust controller synthesis. Next, based on the vehicle model, we develop automated tools for synthesizing controllers that deliver a certificate of performance in the face of nonlinear and uncertain planing forces. We demonstrate theoretically and experimentally that the proposed controllers ensure higher performance when the uncertain planing dynamics are

  17. Validation of thermal models for a prototypical MEMS thermal actuator.

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

    2008-09-01

    This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the

  18. Cultural adaptation and validation of an instrument on barriers for the use of research results.

    Science.gov (United States)

    Ferreira, Maria Beatriz Guimarães; Haas, Vanderlei José; Dantas, Rosana Aparecida Spadoti; Felix, Márcia Marques Dos Santos; Galvão, Cristina Maria

    2017-03-02

    To culturally adapt The Barriers to Research Utilization Scale and to analyze the metric validity and reliability properties of its Brazilian Portuguese version. Methodological research was conducted by means of the cultural adaptation process (translation and back-translation), face and content validity, construct validity (dimensionality and known groups) and reliability analysis (internal consistency and test-retest). The sample consisted of 335 nurses, of whom 43 participated in the retest phase. The validity of the adapted version of the instrument was confirmed. The scale investigates the barriers to the use of research results in clinical practice. Confirmatory factor analysis demonstrated that the Brazilian Portuguese version of the instrument adequately fits the dimensional structure the scale authors originally proposed. Statistically significant differences were observed among nurses holding a Master's or Doctoral degree, with characteristics favorable to Evidence-Based Practice, and working at an institution with an organizational culture oriented toward this approach. Reliability showed a strong correlation (r ranging between 0.77 and 0.84, p < 0.001) and internal consistency was adequate (Cronbach's alpha ranging between 0.77 and 0.82). The Brazilian Portuguese version of The Barriers Scale proved valid and reliable in the group studied.

  19. A new validation-assessment tool for health-economic decision models

    NARCIS (Netherlands)

    Mauskopf, J.; Vemer, P.; Voorn, van G.A.K.; Corro Ramos, I.

    2014-01-01

    A validation-assessment tool is being developed for decision makers to transparently and consistently evaluate the validation status of different health-economic decision models. It is designed as a list of validation techniques covering all relevant aspects of model validation to be filled in by

  20. Error Modelling and Experimental Validation for a Planar 3-PPR Parallel Manipulator

    DEFF Research Database (Denmark)

    Wu, Guanglei; Bai, Shaoping; Kepler, Jørgen Asbøl

    2011-01-01

    In this paper, the positioning error of a 3-PPR planar parallel manipulator is studied with an error model and experimental validation. First, the displacement and workspace are analyzed. An error model considering both configuration errors and joint clearance errors is established. Using this model, the maximum positioning error was estimated for a U-shape PPR planar manipulator, the results being compared with experimental measurements. It is found that the error distribution from the simulation approximates that of the measurements.

  2. Community-wide validation of geospace model local K-index predictions to support model transition to operations

    Science.gov (United States)

    Glocer, A.; Rastätter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.; Weigel, R. S.; McCollough, J.; Wing, S.

    2016-07-01

    We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.
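
    As an example of the contingency-table metrics mentioned above, the Heidke Skill Score can be computed from the four cell counts of predicted versus observed threshold crossings. The counts below are hypothetical, and the report does not state that this particular score was among those used.

```python
def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
    """Heidke Skill Score for a 2x2 contingency table of predicted vs.
    observed K-index threshold crossings: 1 = perfect, 0 = chance."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n  # chance agreement
    return (a + d - expected) / (n - expected)

# Hypothetical counts for one model at one station over six events.
hss = heidke_skill_score(hits=18, false_alarms=5, misses=7, correct_negatives=120)
print(f"HSS = {hss:.2f}")
```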

  3. Validation of symptom validity tests using a "child-model" of adult cognitive impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P.E.J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children’s cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  4. MOLECULAR VALIDATED MODEL FOR ADSORPTION OF PROTONATED DYE ON LDH

    Directory of Open Access Journals (Sweden)

    B. M. Braga

    Hydrotalcite-like compounds are anionic clays of scientific and technological interest for their use as ion exchange materials, catalysts and modified electrodes. Surface phenomena are important for all these applications. Although conventional analytical methods have enabled progress in understanding the behavior of anionic clays in solution, an evaluation at the atomic scale of the dynamics of their ionic interactions has never been performed. Molecular simulation has become an extremely useful tool to provide this perspective. Our purpose is to validate a simplified model for the adsorption of 5-benzoyl-4-hydroxy-2-methoxy-benzenesulfonic acid (MBSA), a prototype molecule of anionic dyes, onto a hydrotalcite surface. Monte Carlo simulations were performed in the canonical ensemble with MBSA ions and a pore model of hydrotalcite, using the UFF and ClayFF force fields. The proposed molecular model has allowed us to reproduce experimental data from atomic force microscopy. Influences of protonation during the adsorption process are also presented.

  5. Experimental Validation of a Dynamic Model for Lightweight Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Gasparetto

    2013-03-01

    Nowadays, one of the main topics in robotics research is dynamic performance improvement by means of a lightening of the overall system structure. Effective motion and control of these lightweight robotic systems requires suitable motion planning and control processes. To this end, model-based approaches can be adopted, exploiting accurate dynamic models that take into account the inertial and elastic terms that are usually neglected in a heavy rigid-link configuration. In this paper, an effective method for modelling spatial lightweight industrial robots based on an Equivalent Rigid Link System approach is considered from an experimental validation perspective. A dynamic simulator implementing the formulation is used and an experimental test bench is set up. Experimental tests are carried out with a benchmark L-shape mechanism.

  6. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    Science.gov (United States)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2015-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and also compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations to be completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and in the future will support NASA mission planning for Stirling-based power systems.

  7. Precise orbit determination for quad-constellation satellites at Wuhan University: strategy, result validation, and comparison

    Science.gov (United States)

    Guo, Jing; Xu, Xiaolong; Zhao, Qile; Liu, Jingnan

    2016-02-01

    This contribution summarizes the strategy used by Wuhan University (WHU) to determine precise orbit and clock products for the Multi-GNSS Experiment (MGEX) of the International GNSS Service (IGS). In particular, the satellite attitude, phase center corrections, and solar radiation pressure model developed and used for BDS satellites are addressed. In addition, this contribution analyzes the orbit and clock quality of the quad-constellation products from MGEX Analysis Centers (ACs) for a common time period of 1 year (2014). With IGS final GPS and GLONASS products as the reference, the Multi-GNSS products of WHU (indicated by WUM) show the best agreement among the products from all MGEX ACs in both accuracy and stability. 3D Day Boundary Discontinuities (DBDs) range from 8 to 27 cm for Galileo-IOV satellites among all ACs' products, the WUM ones being the largest (about 26.2 cm). Among the three types of BDS satellites, MEOs show the smallest DBDs, from 10 to 27 cm, whereas the DBDs of all ACs' products are at the decimeter-to-meter level for GEOs and one to three decimeters for IGSOs, respectively. As to the satellite laser ranging (SLR) validation for Galileo-IOV satellites, the accuracy evaluated by SLR residuals is at the one-decimeter level, with the well-known systematic bias of about -5 cm for all ACs. For BDS satellites, the accuracy reaches the decimeter level, one-decimeter level, and centimeter level for GEOs, IGSOs, and MEOs, respectively. However, there is a noticeable bias in the GEO SLR residuals. In addition, systematic errors dependent on orbit angle, related to mismodeled solar radiation pressure (SRP), are present for BDS GEOs and IGSOs. The results of Multi-GNSS combined kinematic PPP demonstrate that the best position accuracy and fastest convergence speed are achieved using WUM products, particularly in the Up direction. Furthermore, the accuracy of static BDS-only PPP degrades when the BDS IGSO and MEO satellites switch to orbit-normal orientation

  8. Prospective Study of One Million Deaths in India: Rationale, Design, and Validation Results.

    Directory of Open Access Journals (Sweden)

    2005-12-01

    Full Text Available BACKGROUND: Over 75% of the annual estimated 9.5 million deaths in India occur in the home, and the large majority of these do not have a certified cause. India and other developing countries urgently need reliable quantification of the causes of death. They also need better epidemiological evidence about the relevance of physical (such as blood pressure and obesity, behavioral (such as smoking, alcohol, HIV-1 risk taking, and immunization history, and biological (such as blood lipids and gene polymorphisms measurements to the development of disease in individuals or disease rates in populations. We report here on the rationale, design, and implementation of the world's largest prospective study of the causes and correlates of mortality. METHODS AND FINDINGS: We will monitor nearly 14 million people in 2.4 million nationally representative Indian households (6.3 million people in 1.1 million households in the 1998-2003 sample frame and 7.6 million people in 1.3 million households in the 2004-2014 sample frame for vital status and, if dead, the causes of death through a well-validated verbal autopsy (VA instrument. About 300,000 deaths from 1998-2003 and some 700,000 deaths from 2004-2014 are expected; of these about 850,000 will be coded by two physicians to provide causes of death by gender, age, socioeconomic status, and geographical region. Pilot studies will evaluate the addition of physical and biological measurements, specifically dried blood spots. Preliminary results from over 35,000 deaths suggest that VA can ascertain the leading causes of death, reduce the misclassification of causes, and derive the probable underlying cause of death when it has not been reported. VA yields broad classification of the underlying causes in about 90% of deaths before age 70. In old age, however, the proportion of classifiable deaths is lower. By tracking underlying demographic denominators, the study permits quantification of absolute mortality rates

  9. Prospective study of one million deaths in India: rationale, design, and validation results.

    Directory of Open Access Journals (Sweden)

    Prabhat Jha

    2006-02-01

    Over 75% of the annual estimated 9.5 million deaths in India occur in the home, and the large majority of these do not have a certified cause. India and other developing countries urgently need reliable quantification of the causes of death. They also need better epidemiological evidence about the relevance of physical (such as blood pressure and obesity), behavioral (such as smoking, alcohol, HIV-1 risk taking, and immunization history), and biological (such as blood lipids and gene polymorphisms) measurements to the development of disease in individuals or disease rates in populations. We report here on the rationale, design, and implementation of the world's largest prospective study of the causes and correlates of mortality. We will monitor nearly 14 million people in 2.4 million nationally representative Indian households (6.3 million people in 1.1 million households in the 1998-2003 sample frame and 7.6 million people in 1.3 million households in the 2004-2014 sample frame) for vital status and, if dead, the causes of death through a well-validated verbal autopsy (VA) instrument. About 300,000 deaths from 1998-2003 and some 700,000 deaths from 2004-2014 are expected; of these about 850,000 will be coded by two physicians to provide causes of death by gender, age, socioeconomic status, and geographical region. Pilot studies will evaluate the addition of physical and biological measurements, specifically dried blood spots. Preliminary results from over 35,000 deaths suggest that VA can ascertain the leading causes of death, reduce the misclassification of causes, and derive the probable underlying cause of death when it has not been reported. VA yields broad classification of the underlying causes in about 90% of deaths before age 70. In old age, however, the proportion of classifiable deaths is lower. By tracking underlying demographic denominators, the study permits quantification of absolute mortality rates. Household case-control, proportional

  10. Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression

    Directory of Open Access Journals (Sweden)

    Belzung Catherine

    2011-11-01

    Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we draw the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of

  11. Modeling Methane Emission from Rice Paddy Soils:Ⅱ.Model Validation and Application

    Institute of Scientific and Technical Information of China (English)

    HUANG Yao; R. L. SASS; et al.

    1999-01-01

    A simulation model developed by the authors (Huang et al., 1999) was validated against independent field measurements of methane emission from rice paddy soils in Texas (USA), Tuzu (China) and Vercelli (Italy). A simplified version of the simulation model was further validated against methane emission measurements from various regions of the world including Italy, China, Indonesia, the Philippines and the United States. Model validation suggested that the seasonal variation of methane emission is mainly regulated by rice growth and development, and that methane emission can be predicted from rice net productivity, cultivar character, soil texture and temperature, and organic matter amendments. Model simulations in general agreed with the observations. The comparison between computed and measured methane emission resulted in correlation coefficients (r2) ranging from 0.450 to 0.952, significant at the 0.01-0.001 probability level. On the basis of available information on rice cultivated area, growth duration, grain yield, soil texture and temperature, methane emission from rice paddy soils of mainland China was estimated for 28 rice-cultivating provinces/municipal cities by employing the validated model. The calculated daily methane emission rates, on a provincial scale, ranged from 0.12 to 0.71 g m-2 with an average of 0.26 g m-2. A total of 7.92 Tg CH4 per year, ranging from 5.89 to 11.17 Tg year-1, was estimated to be released from Chinese rice paddy soils. Of the total, 45% was emitted during the single-rice growing season, and 19% and 36% were from the early-rice and late-rice growing seasons, respectively. Approximately 70% of the total was emitted in the region located between latitudes 25° and 32°N. The emissions from rice fields in Sichuan and Hunan provinces were calculated to be 2.34 Tg year-1, accounting for approximately 30% of the total.
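
    The agreement statistic used above, the r2 between computed and measured emissions, is straightforward to reproduce; the values below are illustrative and are not the paper's data.

```python
import numpy as np

# Hypothetical provincial emission rates (g CH4 m-2 d-1): computed vs. measured.
computed = np.array([0.14, 0.22, 0.31, 0.45, 0.58, 0.66])
measured = np.array([0.12, 0.25, 0.28, 0.48, 0.55, 0.71])

r = np.corrcoef(computed, measured)[0, 1]
print(f"r2 = {r ** 2:.3f}")  # the paper reports r2 between 0.450 and 0.952
```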

  12. Application of a mixed DEA model to evaluate relative efficiency validity

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The data envelopment analysis (DEA) model is widely used to evaluate the relative efficiency of producers. It is an objective, multi-index decision method. However, the two basic models most frequently used at present, the C2R model and the C2GS2 model, have limitations when used alone, often resulting in unsatisfactory evaluations. In order to solve this problem, a mixed DEA model is built and used to evaluate the validity of the business efficiency of listed companies. An explanation of how to use this mixed DEA model is offered and its feasibility is verified.
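
    For readers unfamiliar with DEA, the efficiency of each decision-making unit (DMU) in the input-oriented C2R (CCR) model is the optimum of a small linear program. The sketch below solves the multiplier form with scipy for invented company data; the C2GS2 part of the mixed model is not shown.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o` (multiplier form).
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (s entries), then input weights v (m).
    c = np.concatenate([-Y[o], np.zeros(m)])         # maximize u . y_o
    A_ub = np.hstack([Y, -X])                        # u . Y_j - v . X_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v . x_o == 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Hypothetical listed companies: inputs (assets, staff), output (profit).
X = np.array([[4.0, 120.0], [7.0, 180.0], [5.5, 150.0]])
Y = np.array([[2.1], [2.6], [2.5]])
for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```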

  13. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    KAUST Repository

    Miki, Kenji

    2015-10-01

    The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.

  14. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    Energy Technology Data Exchange (ETDEWEB)

    Miki, Kenji [NASA Glenn Research Center, OAI, 22800 Cedar Point Rd, Cleveland, OH 44142 (United States); Panesi, Marco, E-mail: mpanesi@illinois.edu [Department of Aerospace Engineering, University of Illinois at Urbana-Champaign, 306 Talbot Lab, 104 S. Wright St., Urbana, IL 61801 (United States); Prudhomme, Serge [Département de mathématiques et de génie industriel, Ecole Polytechnique de Montréal, C.P. 6079, succ. Centre-ville, Montréal, QC, H3C 3A7 (Canada)

    2015-10-01

    The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.
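
    The Bayesian updating step common to both records above can be illustrated with a toy random-walk Metropolis sampler. The forward model and data below are synthetic stand-ins, not the EAST radiance measurements or the actual thermochemical models.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, t):
    """Toy forward model standing in for a thermochemical code:
    exponential relaxation toward equilibrium with rate theta."""
    return np.exp(-theta * t)

# Synthetic 'radiance' data with noise (illustrative, not EAST data).
t = np.linspace(0.1, 2.0, 20)
data = model(1.5, t) + rng.normal(0.0, 0.02, t.size)

def log_posterior(theta, sigma=0.02):
    if theta <= 0.0:  # flat prior restricted to theta > 0
        return -np.inf
    resid = data - model(theta, t)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampling of the posterior of theta.
theta, samples = 1.0, []
lp = log_posterior(theta)
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.05)
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:  # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples[1000:])  # discard burn-in
print(f"theta = {post.mean():.3f} +/- {post.std():.3f}")
```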

  15. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

  16. Recent Progress Validating the HADES Model of LLNL's HEAF MicroCT Measurements

    Energy Technology Data Exchange (ETDEWEB)

    White, W. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bond, K. C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lennox, K. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Aufderheide, M. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Seetho, I. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Roberson, G. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-07-17

    This report compares recent HADES calculations of x-ray linear attenuation coefficients to previous MicroCT measurements made at Lawrence Livermore National Laboratory's High Explosives Applications Facility (HEAF). The chief objective is to investigate what impact recent changes in HADES modeling have on validation results. We find that these changes have no obvious effect on the overall accuracy of the model. Detailed comparisons between recent and previous results are presented.

  17. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens;

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology, which shows promising results for systems composed of reconfigurable underwater modules. The purpose of the model is to enable the design of control strategies for cooperative reconfigurable underwater systems.

  18. The ASCAT Soil Moisture Product: A Review of its Specifications, Validation Results, and Emerging Applications

    Directory of Open Access Journals (Sweden)

    Wolfgang Wagner

    2013-02-01

    applications. To provide a comprehensive overview of the major characteristics and caveats of the ASCAT soil moisture product, this paper describes the ASCAT instrument and the soil moisture processor and near-real-time distribution service implemented by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). A review of the most recent validation studies shows that the quality of the ASCAT soil moisture product is - with the exception of arid environments - comparable to, and over some regions (e.g. Europe) even better than, currently available soil moisture data derived from passive microwave sensors. Further, a review of application studies shows that the use of the ASCAT soil moisture product is particularly advanced in the fields of numerical weather prediction and hydrologic modelling. First progress can also be noted in other application areas such as yield monitoring, epidemiologic modelling, and societal risk assessment. Considering the generally positive evaluation results, it is expected that the ASCAT soil moisture product will increasingly be used by a growing number of rather diverse land applications.

  19. Electro-thermal modeling of high power IGBT module short-circuits with experimental validation

    DEFF Research Database (Denmark)

    Wu, Rui; Iannuzzo, Francesco; Wang, Huai

    2015-01-01

    A novel Insulated Gate Bipolar Transistor (IGBT) electro-thermal modeling approach involving PSpice and ANSYS/Icepak, with both high accuracy and high simulation speed, has been presented to study short circuits of a 1.7 kV/1 kA commercial IGBT module. The approach successfully predicts the current and temperature distribution inside the chips of power IGBT modules. The simulation result is further validated using a 6 kA/1.1 kV non-destructive tester. The experimental validation demonstrates the modeling approach's capability for reliable design of high power IGBT power modules given electrical...

  20. Numerical simulation and experimental validation of aircraft ground deicing model

    Directory of Open Access Journals (Sweden)

    Bin Chen

    2016-05-01

    Aircraft ground deicing plays an important role in guaranteeing aircraft safety. In practice, most airports use as much deicing fluid as possible to remove ice, which wastes fluid and pollutes the environment. A model of aircraft ground deicing is therefore needed as a foundation for subsequent research, such as optimization of deicing fluid consumption. In this article, the heat balance of the deicing process is described, and a dynamic model of the process is derived from an analysis of the deicing mechanism. In the dynamic model, the surface temperature of the deicing fluid and the ice thickness are the state parameters, while the fluid flow rate, the initial temperature, and the injection time of the deicing fluid are the control parameters. Ignoring heat exchange between the deicing fluid and the environment yields a simplified model. The rationality of the simplified model is verified by numerical simulation, and the impacts of the flow rate, the initial temperature, and the injection time on the deicing process are investigated. To verify the model, a semi-physical experiment system was established, consisting of a low-temperature test chamber, an ice simulation system, a deicing fluid heating and spraying system, a simulated wing, test sensors, and a computer measurement and control system. The test data confirm the validity of the dynamic model and the accuracy of the simulation analysis.
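
    As an illustration of the kind of lumped heat-balance model described, the sketch below integrates a two-state system (fluid film temperature, ice thickness) under the stated simplification that heat exchange with the environment is ignored. All coefficients and the specific balance equations are hypothetical placeholders, not values from the article:

    ```python
    # Minimal sketch of a simplified deicing heat balance (illustrative only).
    # State: fluid film temperature T_f [degC] and ice thickness h_ice [m].
    # Controls: fluid flow rate q [kg/s] and injection temperature T_in [degC].
    # All coefficients below are hypothetical placeholders, not from the paper.
    RHO_ICE = 917.0      # ice density [kg/m^3]
    L_FUSION = 334e3     # latent heat of fusion [J/kg]
    C_FLUID = 3500.0     # fluid specific heat [J/(kg K)]
    H_CONV = 500.0       # fluid-to-ice heat transfer coefficient [W/(m^2 K)]
    AREA = 1.0           # wetted area [m^2]
    M_FILM = 2.0         # fluid film mass held on the surface [kg]

    def step(T_f, h_ice, q, T_in, dt):
        """Advance the lumped model one time step with explicit Euler."""
        q_melt = H_CONV * AREA * max(T_f, 0.0)        # heat flux into the ice [W]
        dh = -q_melt / (RHO_ICE * L_FUSION * AREA)    # ice thinning rate [m/s]
        # Film energy balance: hot fluid in, heat lost to the ice
        # (heat exchange with the environment ignored, as in the simplified model).
        dT = (q * C_FLUID * (T_in - T_f) - q_melt) / (M_FILM * C_FLUID)
        return T_f + dT * dt, max(h_ice + dh * dt, 0.0)

    T_f, h_ice, dt = 5.0, 2e-3, 0.1
    for _ in range(int(120 / dt)):        # a 2-minute spray
        T_f, h_ice = step(T_f, h_ice, q=0.2, T_in=60.0, dt=dt)
    print(f"ice left after 120 s: {h_ice * 1e3:.2f} mm")
    ```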

  1. Evaluation and cross-validation of Environmental Models

    Science.gov (United States)

    Lemaire, Joseph

    Before scientific models (statistical or empirical models based on experimental measurements; physical or mathematical models) can be proposed and selected as ISO Environmental Standards, a commission of professional experts appointed by an established international union or association (e.g. IAGA for Geomagnetism and Aeronomy) should have been able to study, document, evaluate and validate the best alternative models available at a given epoch. Examples will be given indicating that different values for the Earth radius have been employed in different data processing laboratories, institutes or agencies to process, analyse or retrieve series of experimental observations. Furthermore, invariant magnetic coordinates like B and L, commonly used in the study of the Earth's radiation belt fluxes and for their mapping, differ from one space mission data center to another, from team to team, and from country to country. Worse, users of empirical models generally fail to use the original magnetic model which had been employed to compile B and L, and thus to build these environmental models. These are just some flagrant examples of inconsistencies and misuses identified so far; there are probably more to be uncovered by careful, independent examination and benchmarking. Consider, by analogy, the metre prototype: the standard unit of length was determined on 20 May 1875, during the Diplomatic Conference of the Metre, and deposited at the BIPM (Bureau International des Poids et Mesures). By the same token, to coordinate and safeguard progress in the field of Space Weather, similar initiatives need to be undertaken to prevent wild, uncontrolled dissemination of pseudo environmental models and standards. Indeed, unless validation tests have been performed, there is no guarantee, a priori, that all models on the market place have been built consistently with the same system of units, or that they are based on identical definitions of the coordinate systems, etc. Therefore…

  2. Simulation of plasma turbulence in scrape-off layer conditions: the GBS code, simulation results and code validation

    Science.gov (United States)

    Ricci, P.; Halpern, F. D.; Jolliet, S.; Loizu, J.; Mosetto, A.; Fasoli, A.; Furno, I.; Theiler, C.

    2012-12-01

    Based on the drift-reduced Braginskii equations, the Global Braginskii Solver (GBS) is able to model scrape-off layer (SOL) plasma turbulence in terms of the interplay between the plasma outflow from the tokamak core, turbulent transport, and losses at the vessel. The model equations, the GBS numerical algorithm, and GBS simulation results are described. GBS was first developed to model turbulence in basic plasma physics devices, such as linear and simple magnetized toroidal devices, which contain some of the main elements of SOL turbulence in a simplified setting. In this paper we summarize the findings obtained from the simulations carried out in these configurations and report the first simulations of SOL turbulence. We also discuss the validation project that has been carried out together with the GBS development.

  3. Experimental validation of a numerical model for subway induced vibrations

    Science.gov (United States)

    Gupta, S.; Degrande, G.; Lombaert, G.

    2009-04-01

    This paper presents the experimental validation of a coupled periodic finite element-boundary element model for the prediction of subway induced vibrations. The model fully accounts for the dynamic interaction between the train, the track, the tunnel and the soil. The periodicity or invariance of the tunnel and the soil in the longitudinal direction is exploited using the Floquet transformation, which allows for an efficient formulation in the frequency-wavenumber domain. A general analytical formulation is used to compute the response of three-dimensional invariant or periodic media that are excited by moving loads. The numerical model is validated by means of several experiments that have been performed at a site in Regent's Park on the Bakerloo line of London Underground. Vibration measurements have been performed on the axle boxes of the train, on the rail, the tunnel invert and the tunnel wall, and in the free field, both at the surface and at a depth of 15 m. Prior to these vibration measurements, the dynamic soil characteristics and the track characteristics have been determined. The Bakerloo line tunnel of London Underground has been modelled using the coupled periodic finite element-boundary element approach and free field vibrations due to the passage of a train at different speeds have been predicted and compared to the measurements. The correspondence between the predicted and measured response in the tunnel is reasonably good, although some differences are observed in the free field. The discrepancies are explained on the basis of various uncertainties involved in the problem. The variation in the response with train speed is similar for the measurements as well as the predictions. This study demonstrates the applicability of the coupled periodic finite element-boundary element model to make realistic predictions of the vibrations from underground railways.

  4. Planck Intermediate Results. IV. The XMM-Newton validation programme for new Planck clusters

    CERN Document Server

    Ade, P A R; Arnaud, M; Ashdown, M; Aumont, J; Baccigalupi, C; Balbi, A; Banday, A J; Barreiro, R B; Bartlett, J G; Battaner, E; Benabed, K; Benoît, A; Bernard, J -P; Bikmaev, I; Böhringer, H; Bonaldi, A; Borgani, S; Borrill, J; Bouchet, F R; Brown, M L; Burigana, C; Butler, R C; Cabella, P; Carvalho, P; Catalano, A; Cayón, L; Chamballu, A; Chary, R -R; Chiang, L -Y; Chon, G; Christensen, P R; Clements, D L; Colafrancesco, S; Colombi, S; Crill, B P; Cuttaia, F; Da Silva, A; Dahle, H; Davis, R J; de Bernardis, P; de Gasperis, G; de Zotti, G; Delabrouille, J; Démoclès, J; Désert, F -X; Diego, J M; Dolag, K; Dole, H; Donzelli, S; Doré, O; Douspis, M; Dupac, X; Enßlin, T A; Eriksen, H K; Finelli, F; Flores-Cacho, I; Frailis, M; Franceschi, E; Frommert, M; Galeotta, S; Ganga, K; Génova-Santos, R T; Giraud-Héraud, Y; González-Nuevo, J; González-Riestra, R; Górski, K M; Gregorio, A; Hansen, F K; Harrison, D; Henrot-Versillé, S; Hernández-Monteagudo, C; Herranz, D; Hildebrandt, S R; Hivon, E; Hobson, M; Holmes, W A; Hornstrup, A; Hovest, W; Huffenberger, K M; Hurier, G; Jaffe, A H; Jagemann, T; Jones, W C; Juvela, M; Kneissl, R; Knoche, J; Knox, L; Kunz, M; Kurki-Suonio, H; Lagache, G; Lamarre, J -M; Lasenby, A; Lawrence, C R; Jeune, M Le; Leach, S; Leonardi, R; Liddle, A; Lilje, P B; Linden-Vornle, M; López-Caniego, M; Luzzi, G; Macías-Pérez, J F; Maino, D; Mann, R; Marleau, F; Marshall, D J; Martínez-González, E; Masi, S; Massardi, M; Matarrese, S; Mazzotta, P; Mei, S; Melchiorri, A; Melin, J -B; Mendes, L; Mennella, A; Mitra, S; Miville-Deschênes, M -A; Moneti, A; Morgante, G; Mortlock, D; Munshi, D; Naselsky, P; Nati, F; Norgaard-Nielsen, H U; Noviello, F; Osborne, S; Pajot, F; Paoletti, D; Perdereau, O; Perrotta, F; Piacentini, F; Piat, M; Pierpaoli, E; Piffaretti, R; Plaszczynski, S; Platania, P; Pointecouteau, E; Polenta, G; Popa, L; Poutanen, T; Pratt, G W; Prunet, S; Puget, J -L; Reinecke, M; Remazeilles, M; Renault, C; Ricciardi, S; Rocha, G; Rosset, C; Rossetti, M; Rubiño-Martín, J A; Rusholme, B; Sandri, M; Savini, G; Scott, D; Smoot, G F; Stanford, A; Stivoli, F; Sudiwala, R; Sunyaev, R; Sutton, D; Sygnet, J -F; Tauber, J A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Valenziano, L; Van Tent, B; Vielva, P; Villa, F; Vittorio, N; Wade, L A; Wandelt, B D; Welikala, N; Weller, J; White, S D M; Yvon, D; Zacchei, A; Zonca, A

    2012-01-01

    We present the final results from the XMM-Newton validation follow-up of new Planck cluster candidates. We observed 15 new candidates, detected with signal-to-noise ratios between 4.0 and 6.1 in the 15.5-month nominal Planck survey. The candidates were selected using ancillary data flags derived from the ROSAT All Sky Survey (RASS) and Digitized Sky Survey all-sky maps, with the aim of pushing into the low SZ flux, high-z regime and testing RASS flags as indicators of candidate reliability. 14 new clusters were detected by XMM-Newton: 10 single clusters and 2 double systems. Redshifts from X-ray spectroscopy lie in the range 0.2 to 0.9, with six clusters at z > 0.5. Estimated M500 ranges from 2.5×10^14 to 8×10^14 Msun. We discuss our results in the context of the full XMM validation programme, in which 51 new clusters have been detected. This includes 4 double and 2 triple systems, some of which are chance projections on the sky of clusters at different redshifts. Association with a source from the RASS-Br…

  5. Planck Intermediate Results. I. Further validation of new Planck clusters with XMM-Newton

    CERN Document Server

    Aghanim, N; Ashdown, M; Atrio-Barandela, F; Aumont, J; Baccigalupi, C; Balbi, A; Banday, A J; Barreiro, R B; Bartlett, J G; Battaner, E; Bernard, J -P; Böhringer, H; Bonaldi, A; Borrill, J; Bourdin, H; Brown, M L; Burigana, C; Butler, R C; Cabella, P; Cardoso, J -F; Carvalho, P; Catalano, A; Cayón, L; Chamballu, A; Chary, R -R; Chiang, L -Y; Chon, G; Christensen, P R; Clements, D L; Colafrancesco, S; Colombi, S; Coulais, A; Crill, B P; Cuttaia, F; Da Silva, A; Dahle, H; Davis, R J; de Bernardis, P; de Gasperis, G; de Zotti, G; Delabrouille, J; Démoclés, J; Désert, F -X; Diego, J M; Dolag, K; Dole, H; Donzelli, S; Doré, O; Douspis, M; Dupac, X; Enßlin, T A; Eriksen, H K; Finelli, F; Flores-Cacho, I; Forni, O; Fosalba, P; Frailis, M; Fromenteau, S; Galeotta, S; Ganga, K; Génova-Santos, R T; Giard, M; González-Nuevo, J; González-Riestra, R; Gruppuso, A; Hansen, F K; Harrison, D; Hempel, A; Hernández-Monteagudo, C; Herranz, D; Hildebrandt, S R; Hornstrup, A; Huffenberger, K M; Hurier, G; Jasche, J; Juvela, M; Keihänen, E; Kisner, T S; Kneissl, R; Knoche, J; Knox, L; Kurki-Suonio, H; Lähteenmäki, A; Lamarre, J -M; Lasenby, A; Leonardi, R; Liddle, A; Lilje, P B; López-Caniego, M; Luzzi, G; Macías-Pérez, J F; Maino, D; Mann, R; Marleau, F; Marshall, D J; Martínez-González, E; Masi, S; Massardi, M; Matarrese, S; Matthai, F; Mazzotta, P; Meinhold, P R; Melchiorri, A; Melin, J -B; Mendes, L; Mennella, A; Miville-Deschênes, M -A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Munshi, D; Naselsky, P; Natoli, P; Nørgaard-Nielsen, H U; Noviello, F; Osborne, S; Pasian, F; Patanchon, G; Perdereau, O; Perrotta, F; Piacentini, F; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Poutanen, T; Pratt, G W; Rachen, J P; Rebolo, R; Reinecke, M; Remazeilles, M; Renault, C; Ricciardi, S; Riller, T; Ristorcelli, I; Rosset, C; Rossetti, M; Rubiño-Martín, J A; Rusholme, B; Sandri, M; Savini, G; Schaefer, B M; Scott, D; Smoot, G F; Starck, J -L; Stivoli, F; Sutton, D; Sygnet, J -F; Tauber, J A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Valenziano, L; Van Tent, B; Vielva, P; Villa, F; Vittorio, N; Wandelt, B D; Weller, J; Yvon, D; Zacchei, A

    2011-01-01

    We present further results from the ongoing XMM-Newton validation follow-up of Planck cluster candidates, detailing X-ray observations of eleven candidates detected at signal-to-noise ratios of about 4.5, together with the ensuing validation results. Ten of the candidates are found to be bona fide clusters lying below the RASS flux limit. Redshift estimates are available for all confirmed systems via X-ray Fe-line spectroscopy; they lie in the redshift range 0.19 to …

  6. The use of reconstructed human epidermis for skin absorption testing: Results of the validation study.

    Science.gov (United States)

    Schäfer-Korting, Monika; Bock, Udo; Diembeck, Walter; Düsing, Hans-Jürgen; Gamer, Armin; Haltner-Ukomadu, Eleonore; Hoffmann, Christine; Kaca, Monika; Kamp, Hennicke; Kersen, Silke; Kietzmann, Manfred; Korting, Hans Christian; Krächter, Hans-Udo; Lehr, Claus-Michael; Liebsch, Manfred; Mehling, Annette; Müller-Goymann, Christel; Netzlaff, Frank; Niedorf, Frank; Rübbelke, Maria K; Schäfer, Ulrich; Schmidt, Elisabeth; Schreiber, Sylvia; Spielmann, Horst; Vuia, Alexander; Weimer, Michaela

    2008-05-01

    A formal validation study was performed, in order to investigate whether the commercially-available reconstructed human epidermis (RHE) models, EPISKIN, EpiDerm and SkinEthic, are suitable for in vitro skin absorption testing. The skin types currently recommended in the OECD Test Guideline 428, namely, ex vivo human epidermis and pig skin, were used as references. Based on the promising outcome of the prevalidation study, the panel of test substances was enlarged to nine substances, covering a wider spectrum of physicochemical properties. The substances were tested under both infinite-dose and finite-dose conditions, in ten laboratories, under strictly controlled conditions. The data were subjected to independent statistical analyses. Intra-laboratory and inter-laboratory variability contributed almost equally to the total variability, which was in the same range as that in preceding studies. In general, permeation of the RHE models exceeded that of human epidermis and pig skin (the SkinEthic RHE was found to be the most permeable), yet the ranking of substance permeation through the three tested RHE models and the pig skin reflected the permeation through human epidermis. In addition, both infinite-dose and finite-dose experiments are feasible with RHE models. The RHE models did not show the expected significantly better reproducibility, as compared to excised skin, despite a tendency toward lower variability of the data. Importantly, however, the permeation data showed a sufficient correlation between all the preparations examined. Thus, the RHE models, EPISKIN, EpiDerm and SkinEthic, are appropriate alternatives to human and pig skin, for the in vitro assessment of the permeation and penetration of substances when applied as aqueous solutions.

  7. Empirical validation of the InVEST water yield ecosystem service model at a national scale.

    Science.gov (United States)

    Redhead, J W; Stratford, C; Sharps, K; Jones, L; Ziv, G; Clarke, D; Oliver, T H; Bullock, J M

    2016-11-01

    A variety of tools have emerged with the goal of mapping the current delivery of ecosystem services and quantifying the impact of environmental changes. An important and often overlooked question is how accurate the outputs of these models are in relation to empirical observations. In this paper we validate a hydrological ecosystem service model (the InVEST Water Yield Model) using widely available data. We modelled annual water yield in 22 UK catchments with widely varying land cover, population and geology, and compared model outputs with gauged river flow data from the UK National River Flow Archive. Values for input parameters were selected from the existing literature to reflect conditions in the UK and were subjected to sensitivity analyses. We also compared model performance between precipitation and potential evapotranspiration data sourced from global- and UK-scale datasets. We then tested the transferability of the results within the UK by additional validation in a further 20 catchments. The model performed only moderately with global-scale data (linear regression of modelled total water yield against empirical data: slope = 0.763, intercept = 54.45, R² = 0.963), with wide variation in performance between catchments, but much better with UK-scale input data, giving a closer fit to the observed data (slope = 1.07, intercept = 3.07, R² = 0.990). With UK data the modelled water yield agreed closely with observations in the majority of catchments, although with a minor but consistent overestimate per hectare (86 m³/ha/year). Additional validation on a further 20 UK catchments was similarly robust, indicating that these results are transferable within the UK. These results suggest that relatively simple models can give accurate measures of ecosystem services. However, the choice of input data is critical and there is a need for further validation in other parts of the world.
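
    The validation statistic reported here, an ordinary least-squares regression of modelled against gauged water yield, is easy to reproduce. A minimal sketch with synthetic data standing in for the InVEST outputs and the NRFA gauge records:

    ```python
    import numpy as np

    # Hypothetical validation check: regress modelled annual water yield against
    # gauged observations and report slope, intercept and R^2, as in the paper.
    rng = np.random.default_rng(1)
    observed = rng.uniform(100.0, 900.0, 22)                   # gauged yield, mm/yr (synthetic)
    modelled = 1.05 * observed + 5.0 + rng.normal(0, 25, 22)   # synthetic model output

    slope, intercept = np.polyfit(observed, modelled, 1)
    r2 = np.corrcoef(observed, modelled)[0, 1] ** 2
    print(f"slope={slope:.3f}, intercept={intercept:.2f}, R^2={r2:.3f}")
    # A slope near 1, an intercept near 0 and a high R^2 indicate good agreement;
    # a slope consistently above 1 would signal the systematic overestimate noted.
    ```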

  8. The impact of school leadership on school level factors: validation of a causal model

    NARCIS (Netherlands)

    Krüger, M.L.; Witziers, B.; Sleegers, P.

    2007-01-01

    This study aims to contribute to a better understanding of the antecedents and effects of educational leadership, and of the influence of the principal's leadership on intervening and outcome variables. A path analysis was conducted to test and validate a causal model. The results show no direct or

  10. Validity and reliability of patient reported outcomes used in Psoriasis: results from two randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Koo John

    2003-10-01

    Background: Two Phase III randomized controlled clinical trials were conducted to assess the efficacy, safety, and tolerability of weekly subcutaneous administration of efalizumab for the treatment of psoriasis. Patient-reported measures of psoriasis-related functionality, health-related quality of life, and psoriasis-related symptoms were included as part of the trials. Objective: To assess the reliability, validity, and responsiveness of the patient-reported outcome measures that were used in the trials: the Dermatology Life Quality Index (DLQI), the Psoriasis Symptom Assessment (PSA) Scale, and two itch measures, a Visual Analog Scale (VAS) and the National Psoriasis Foundation (NPF) itch measure. Methods: Subjects aged 18 to 70 years with moderate to severe psoriasis for at least 6 months were recruited into the two clinical trials (n = 1095). Internal consistency reliability was evaluated for all patient-reported outcomes at baseline and at 12 weeks. Construct validity was evaluated by relations among the different patient-reported outcomes and between the patient-reported outcomes and the clinical assessments (Psoriasis Area and Severity Index; Overall Lesion Severity Scale; Physician's Global Assessment of Change) assessed at baseline and at 12 weeks, as was the change over the course of the 12-week portion of the trial. Results: Internal consistency reliability ranged from 0.86 to 0.95 for the patient-reported outcome measures. The patient-reported outcome measures were all shown to have significant construct validity with respect to each other and with respect to the clinical assessments. The four measures also demonstrated significant responsiveness to change in the underlying clinical status of the patients over the course of the trial, as measured by the independently assessed clinical outcomes. Conclusions: The DLQI, the PSA, the VAS, and the NPF itch measure are considered useful tools for the measurement of dermatology…

  11. Low Altitude Validation of Geomagnetic Cutoff Models Using SAMPEX Data

    Science.gov (United States)

    Young, S. L.; Kress, B. T.

    2011-12-01

    Single event upsets (SEUs) caused by MeV protons are a concern for satellite operators, so AFRL is working to create a tool that can specify and/or forecast SEU probabilities. An important component of the tool's SEU probability calculation will be the local energetic ion spectrum. The portion of that spectrum due to the trapped energetic ion population is relatively stable and predictable; it is more difficult to account for transient solar energetic particles (SEPs). These particles, which can be ejected from the solar atmosphere during a solar flare or filament eruption, or energized by coronal mass ejection (CME) driven shocks, can penetrate the Earth's magnetosphere into regions not normally populated by energetic protons. The magnetosphere provides energy-dependent shielding that also depends on its magnetic configuration. During magnetic storms that configuration is modified, and the SEP cutoff latitude for a given particle energy can be suppressed by up to ~15 degrees equatorward, exposing normally shielded regions. As a first step toward creating the satellite SEU prediction tool, we are comparing the Smart et al. (Advances in Space Research, 2006) and CISM-Dartmouth (Kress et al., Space Weather, 2010) geomagnetic cutoff tools. While their authors have provided validations in the papers noted, our validation is done consistently between models, allowing us to compare the models more directly.

  12. A geomagnetically induced current warning system: model development and validation

    Science.gov (United States)

    McKay, A.; Clarke, E.; Reay, S.; Thomson, A.

    Geomagnetically Induced Currents (GIC), which can flow in technological systems at the Earth's surface, are a consequence of magnetic storms and Space Weather. A well-documented practical problem for the power transmission industry is that GIC can affect the lifetime and performance of transformers within the power grid. Operational mitigation is widely considered to be one of the best strategies to manage the Space Weather and GIC risk. Therefore, in the UK, a magnetic storm warning and GIC monitoring and analysis programme has been under development by the British Geological Survey and Scottish Power plc (the power grid operator for Central Scotland) since 1999. Under the auspices of the European Space Agency's service development activities, BGS is developing the capability to meet two key user needs that have been identified: firstly, a near real-time solar wind shock/geomagnetic storm warning based on L1 solar wind data, and secondly, an integrated surface geo-electric field and power grid network model that should allow prediction of GIC throughout the power grid in near real time. While the final goal is a 'seamless package', the components of the package utilise diverse scientific techniques. We review progress to date with particular regard to the validation of the individual components of the package. The response of the Scottish power grid to the October 2003 magnetic storms is also discussed, and model and validation data are presented.

  13. Physiologically Based Modelling of Dioxins. I. Validation of a rodent toxicokinetic model

    NARCIS (Netherlands)

    Zeilmaker MJ; Slob W

    1993-01-01

    In this report a rodent Physiologically Based PharmacoKinetic (PBPK) model for 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) is described. Validation studies, in which model simulations of TCDD disposition were compared with in vivo TCDD disposition in rodents exposed to TCDD, showed that the model adequately predicts in vivo TCDD disposition.

  14. Validation of numerical models for flow simulation in labyrinth seals

    Science.gov (United States)

    Frączek, D.; Wróblewski, W.

    2016-10-01

    CFD results were compared with experimental results for flow through a labyrinth seal. RANS turbulence models (k-epsilon, k-omega, SST and SST-SAS) were selected for the study, and both steady and transient results were analyzed. ANSYS CFX was used for the numerical computations. The analysis included flow through a sealing section with a honeycomb land. Leakage flows and velocity profiles in the seal were compared. In addition to the comparison of computational models, the discrepancy between the numerical and experimental results was quantified, and guidelines for modelling such problems were formulated.

  15. A Validation Process for the Groundwater Flow and Transport Model of the Faultless Nuclear Test at Central Nevada Test Area

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan

    2003-01-01

    Many sites of groundwater contamination rely heavily on complex numerical models of flow and transport to develop closure plans. This has created a need for tools and approaches that can be used to build confidence in model predictions and make it apparent to regulators, policy makers, and the public that these models are sufficient for decision making. This confidence building is a long-term iterative process, and it is this process that should be termed "model validation". Model validation is a process, not an end result. That is, the process of model validation cannot always assure acceptable prediction or quality of the model; rather, it provides a safeguard against faulty models or inadequately developed and tested models. Therefore, development of a systematic approach for evaluating and validating subsurface predictive models and guiding field activities for data collection and long-term monitoring is strongly needed. This report presents a review of model validation studies that pertain to groundwater flow and transport modeling. Definitions, literature debates, previously proposed validation strategies, and conferences and symposia that focused on subsurface model validation are reviewed and discussed. The review is general in nature, but the focus of the discussion is on site-specific, predictive groundwater models used for making decisions regarding remediation activities and site closure. An attempt is made to compile most of the published studies on groundwater model validation and assemble what has been proposed or used for validating subsurface models. The aim is to provide a reasonable starting point to aid the development of the validation plan for the groundwater flow and transport model of the Faultless nuclear test conducted at the Central Nevada Test Area (CNTA). The review of previous studies on model validation shows that there does not exist a set of specific procedures and tests that can be easily adapted and…

  16. Satellite information of sea ice for model validation

    Science.gov (United States)

    Saheed, P. P.; Mitra, Ashis K.; Momin, Imranali M.; Mahapatra, Debasis K.; Rajagopal, E. N.

    2016-05-01

    The emergence of extensively large computational facilities has enabled the scientific world to use earth system models for understanding the prevailing dynamics of the earth's atmosphere, ocean and cryosphere and their interrelations. Sea ice in the Arctic and the Antarctic has been identified as one of the main proxies for studying climate change. The rapid sea-ice melting in the Arctic and the disappearance of multi-year sea ice have become a matter of concern. Earth system models couple the ocean, atmosphere and sea ice in order to bring out the possible interconnections between these three very important components and their role in the changing climate. The Indian monsoon appears to have undergone nonlinear changes in recent years, and the rapid melt of Arctic sea ice is apparently linked to changes in the weather and climate of the Indian subcontinent. Recent findings suggest a relation between extreme events over the Indian subcontinent and Arctic sea-ice melt episodes. Coupled models are being used to study the depth of these relations; however, the models have to be validated extensively using measured parameters. Satellite measurements of sea ice date back to 1979, and many data sets have become available since then. In this study, an evaluation of the existing data sets is conducted. There are some uncertainties in these data sets, which could be associated with the absence of a single sensor over a long period of time and with the absence of accurate in-situ measurements with which to validate the satellite measurements.

  17. Development and validation of a liquid composite molding model

    Science.gov (United States)

    Bayldon, John Michael

    2007-12-01

    In composite manufacturing, Vacuum Assisted Resin Transfer Molding (VARTM) is becoming increasingly important as a cost-effective manufacturing method for structural composites. In this process the dry preform (reinforcement) is placed on a rigid tool and covered by a flexible film to form an airtight vacuum bag. Liquid resin is drawn under vacuum through the preform inside the vacuum bag. Modeling of this process relies on a good understanding of closely coupled phenomena: the resin flow depends on the preform permeability, which in turn depends on the local fluid pressure and the preform compaction behavior. VARTM models for predicting the flow rate in this process do exist; however, they are not able to properly predict the flow for all classes of reinforcement material. In this thesis, the continuity equation used in VARTM models is re-examined and a modified form proposed. In addition, the compaction behavior of the preform in both saturated and dry states is studied in detail and new models are proposed for the compaction behavior. To assess the validity of the proposed models, the shadow moiré method was adapted and used to perform full-field measurement of the preform thickness during infusion, in addition to the usual measurements of flow-front position. A new method was developed and evaluated for the analysis of the moiré data related to the VARTM process; the method has wider applicability to other full-field thickness measurements. The use of this measurement method demonstrated that although the new compaction models work well in the characterization tests, they do not properly describe all the preform features required for modeling the process. In particular, the effect of varying saturation on the preform's behavior requires additional study. The flow models developed did, however, improve the prediction of the flow rate for the more compliant preform material tested, and the experimental techniques have shown where additional test methods…
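
    The coupling described (resin flow governed by Darcy's law through a compacting preform) reduces, in the simplest constant-property 1D case, to a square-root-of-time flow-front law. A minimal sketch with assumed constants; the thesis's point is precisely that permeability and porosity are not constant in practice:

    ```python
    import numpy as np

    # 1D Darcy sketch of VARTM filling: with constant permeability K, porosity
    # phi, viscosity mu and pressure drop dP, the flow front advances as
    # x_f(t) = sqrt(2 K dP t / (mu phi)). All constants are assumed values.
    K = 2e-10      # preform permeability [m^2] (assumed)
    PHI = 0.5      # porosity (assumed)
    MU = 0.2       # resin viscosity [Pa s] (assumed)
    DP = 9e4       # vacuum-driven pressure drop [Pa] (assumed)

    def flow_front(t):
        """Flow-front position [m] at time t [s] for constant properties."""
        return np.sqrt(2.0 * K * DP * t / (MU * PHI))

    for t in (60.0, 300.0, 900.0):
        print(f"t = {t:5.0f} s: flow front at {flow_front(t):.3f} m")
    ```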

  18. Validation of coastal oceanographic models at Forsmark. Site descriptive modelling SDM-Site Forsmark

    Energy Technology Data Exchange (ETDEWEB)

    Engqvist, Anders (A och I Engqvist Konsult HB, Vaxholm (SE)); Andrejev, Oleg (Finnish Inst. of Marine Research, Helsinki (FI))

    2008-01-15

    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterisation at two different locations, the Forsmark and the Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. The characterisation work is divided into an initial site investigation phase and a complete site investigation phase. In this context, the water exchange of the coastal zone is one link in the chain of possible nuclide transport mechanisms that must be assessed in the site description of potential repository areas. For the purpose of validating the pair of nested 3D-models employed to simulate the water exchange in the near-shore coastal zone in the Forsmark area, an encompassing measurement program entailing six stations has been performed. The design of this program was first to assess to what degree the forcing of the fine-resolution (FR) model of the Forsmark study area, at its interfacial boundary to the coarse-resolution (CR) model of the entire Baltic, was reproduced. In addition, it is of particular interest how the time-varying density-determining properties, salinity and temperature, at the borders are propagated into the FR-domain, since this corresponds to the most efficient mode of water exchange. An important part of the validation process has been to carefully evaluate which measurement data can be considered reliable. The result was that several periods of mainly near-surface salinity data had to be discarded due to growth of algae on the conductivity sensors; lack of thorough absolute calibration of the salinity meters also necessitated dismissal of some measurement data. Relative to the data assessed as adequate, the outcome of the validation can be summarized in five points: (i) the surface-most salinity of the CR-model drifts downward by a little less than one practical salinity unit (psu) per year, requiring that the ensuing correlation analysis be subdivided into periods of a…

  19. Using sensitivity analysis to validate the predictions of a biomechanical model of bite forces.

    Science.gov (United States)

    Sellers, William Irvin; Crompton, Robin Huw

    2004-02-01

    Biomechanical modelling has become a very popular technique for investigating functional anatomy. Modern computer simulation packages make producing such models straightforward, and it is tempting to take the results produced at face value. However, the predictions of a simulation are only valid when both the model and the input parameters are accurate, and little work has been done to verify this. In this paper a model of the human jaw is produced and a sensitivity analysis is performed to validate the results. The model is built using the ADAMS multibody dynamic simulation package, incorporating the major occlusive muscles of mastication (temporalis, masseter, medial and lateral pterygoids) as well as a highly mobile temporomandibular joint (TMJ). This model is used to predict the peak three-dimensional bite forces at each tooth location, the joint reaction forces, and the contributions made by each individual muscle. The results for occlusive bite force (1080 N at M1) match those previously published, suggesting the model is valid. The sensitivity analysis was performed by sampling the input parameters from likely ranges and running the simulation many times, rather than using single best-estimate values. This analysis shows that the magnitudes of the peak retractive forces on the lower teeth are highly sensitive to the chosen origin (and hence fibre direction) of the temporalis and masseter muscles, as well as to the laxity of the TMJ. Peak protrusive force was also sensitive to the masseter origin. These results show that the model is insufficiently complex to estimate these values reliably, although the much lower sensitivity values obtained for the bite forces in the other directions, and for the joint reaction forces, suggest that those predictions are sound. Without the sensitivity analysis it would not have been possible to identify these weaknesses, which strongly supports the use of sensitivity analysis as a validation technique for biomechanical modelling.
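
    The sampling-based sensitivity analysis described is straightforward to sketch. The toy function below stands in for the ADAMS jaw model (which is not reproducible from the abstract), and the parameter ranges are hypothetical:

    ```python
    import numpy as np

    # Sketch of a sampling-based sensitivity analysis (the paper's approach),
    # using a toy bite-force function in place of the ADAMS jaw model.
    rng = np.random.default_rng(0)
    N = 2000
    # Hypothetical input ranges: muscle origin offset [mm], fibre angle [deg],
    # TMJ laxity [mm].
    ranges = {"origin": (-5.0, 5.0), "angle": (60.0, 80.0), "laxity": (0.0, 2.0)}
    samples = {k: rng.uniform(lo, hi, N) for k, (lo, hi) in ranges.items()}

    def toy_bite_force(origin, angle, laxity):
        # Stand-in model: NOT the published simulation, just a smooth function
        # so the sensitivity machinery has something to act on.
        return 1080.0 + 40.0 * origin - 0.6 * (angle - 70.0) ** 2 - 90.0 * laxity

    out = toy_bite_force(samples["origin"], samples["angle"], samples["laxity"])
    for name, x in samples.items():
        r = np.corrcoef(x, out)[0, 1]
        print(f"{name:>7s}: correlation with peak force = {r:+.2f}")
    # Inputs with |correlation| near 1 dominate the output variance; predictions
    # that depend strongly on poorly known inputs should be treated with caution.
    ```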

  20. Development and Validation of a Materials Preparation Model from the Perspective of Transformative Pedagogy

    Directory of Open Access Journals (Sweden)

    Hamed Barjesteh

    2015-12-01

    This study reports on the design, development, and validation of a model within the main tenets of critical pedagogy (CP), with a view to its implementation in education in general and in applied linguistics in particular. To develop a transformative L2 materials preparation (TLMP) model, the researchers drew on Crawford's (1978) principles of CP as a springboard. These principles provide the theoretical framework of an ELT program in general, but they need to be adapted to the specific features of L2 materials development. To this end, Nation and Macalister's (2010) model of materials development was used to inform the different aspects of materials preparation. The newly developed model yielded 22 principles, which were validated through a stepwise process: the model was administered among 110 participants in 15 cities of Iran, and exploratory and confirmatory factor analyses were performed. The results indicated a high level of internal consistency and satisfactory construct validity. The TLMP model could be useful for language policy makers, ELT professionals, and materials and curriculum developers. Keywords: critical pedagogy, materials development, transformative model, ELT community, development, validation

  1. Solar energy hot water heating and electric utilities. A model validation

    Science.gov (United States)

    1981-10-01

    TRNSYS is a residential solar simulation program designed to provide detailed simulations of individual solar systems composed of almost any presently used residential solar technology. The model is described and a validation of the model is presented using a group of domestic solar hot water systems in the metropolitan Philadelphia area. The collection and reduction of the data used is discussed, and the TRNSYS modeling of the systems is presented. The model results are given and a sensitivity analysis of the models was performed to determine the effect of input changes on the electric auxiliary backup consumption.

  2. A More Refined Thermal Model of IGBT Devices: Development and Experimental Validation

    Directory of Open Access Journals (Sweden)

    Nacereddine Benamrouche

    2015-06-01

    Electro-thermal and thermo-mechanical effects are becoming more and more important in power electronic systems as industry seeks to shrink packaging and increase power densities, so the demand for faster and more accurate thermal models is increasing. This paper proposes an improved thermal IGBT model based on a lumped-parameter modelling method. The experimental setup, testing, and analysis of the results are then addressed in order to validate the developed model. The key features of this model are that it takes into account the temperature dependence of the different materials constituting the IGBT, and its accurate discretisation.
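
    As an illustration of the lumped-parameter approach, the sketch below simulates a three-stage Cauer-style RC thermal network with a crude temperature-dependent junction resistance. The network topology, parameter values and temperature law are all assumptions for illustration, not the paper's model:

    ```python
    import numpy as np

    # Minimal lumped-parameter (Cauer-style) thermal network for an IGBT stack:
    # junction -> solder -> baseplate -> ambient. Values are hypothetical, and
    # the temperature dependence of the first stage is a simple illustrative law.
    C = np.array([0.02, 0.20, 1.50])       # thermal capacitances [J/K]
    R0 = np.array([0.05, 0.10, 0.30])      # nominal thermal resistances [K/W]
    T_AMB = 25.0

    def resistances(T):
        """Scale the junction-side resistance with temperature (assumed law)."""
        r = R0.copy()
        r[0] *= (T[0] + 273.15) / 298.15
        return r

    def simulate(p_loss, t_end, dt=1e-4):
        T = np.full(3, T_AMB)
        for _ in range(int(t_end / dt)):
            r = resistances(T)
            q_in = np.array([p_loss, 0.0, 0.0])           # losses at the junction
            T_next = np.append(T[1:], T_AMB)              # node downstream of each stage
            q_out = (T - T_next) / r                      # conduction to next node
            q_prev = np.concatenate(([0.0], q_out[:-1]))  # conduction from upstream
            T = T + dt * (q_in + q_prev - q_out) / C
        return T

    print(simulate(p_loss=150.0, t_end=5.0))  # node temperatures after 5 s [degC]
    ```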

  3. Stable Eutectoid Transformation in Nodular Cast Iron: Modeling and Validation

    Science.gov (United States)

    Carazo, Fernando D.; Dardati, Patricia M.; Celentano, Diego J.; Godoy, Luis A.

    2016-10-01

    This paper presents a new microstructural model of the stable eutectoid transformation in a spheroidal cast iron. The model takes into account the nucleation and growth of ferrite grains and the growth of graphite spheroids. Different laws are assumed for the growth of both phases during and below the intercritical stable eutectoid. At a microstructural level, the initial conditions for the phase transformations are obtained from the microstructural simulation of solidification of the material, which considers the divorced eutectic and the subsequent growth of graphite spheroids up to the initiation of the stable eutectoid transformation. The temperature field is obtained by solving the energy equation by means of finite elements. The microstructural (phase change) and macrostructural (energy balance) models are coupled by a sequential multiscale procedure. Experimental validation of the model is achieved by comparison with measured values of fractions and radius of 2D view of ferrite grains. Agreement with such experiments indicates that the present model is capable of predicting ferrite phase fraction and grain size with reasonable accuracy.

  5. MODEL IMPROVEMENT AND EXPERI-MENT VALIDATION OF PNEUMATIC ARTIFICIAL MUSCLES

    Institute of Scientific and Technical Information of China (English)

    Zhou Aiguo; Shi Guanglin; Zhong Tingxiu

    2004-01-01

    Owing to deficiencies of existing models of pneumatic artificial muscles (PAM), a serial model is built up from the PAM's essential working principle using elastic theory, and is validated against quasi-static and dynamic experimental results obtained from two experimental systems. The experimental and simulation results show that the serial model describes the force characteristics of the PAM more precisely than Chou's model. Based on an analysis of the experimental results, a compensation term accounting for the braid's elasticity and Coulomb damping is added to the serial model. The dynamic experiments show that the viscous damping of the PAM can be ignored, which simplifies the model. Finally, an improved serial model of the PAM is obtained.
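
    The serial model itself is not specified in the abstract, but the baseline it improves upon, Chou's static model for McKibben-type PAMs, is standard. A minimal sketch, assuming the usual Chou-Hannaford geometric relation and hypothetical muscle geometry:

    ```python
    import numpy as np

    # Chou-Hannaford static force model for a McKibben-type PAM -- the baseline
    # ("Chou's model") that the serial model in the paper improves upon.
    # Geometry values are hypothetical.
    D0 = 0.010                   # resting inner diameter [m] (assumed)
    THETA0 = np.deg2rad(23.0)    # resting braid angle (assumed)

    def pam_force(P, eps):
        """Static contraction force [N] at gauge pressure P [Pa], strain eps."""
        a = 3.0 / np.tan(THETA0) ** 2
        b = 1.0 / np.sin(THETA0) ** 2
        return (np.pi * D0 ** 2 * P / 4.0) * (a * (1.0 - eps) ** 2 - b)

    for eps in (0.0, 0.1, 0.2):
        print(f"eps={eps:.1f}: F = {pam_force(500e3, eps):7.1f} N")
    # The serial model adds braid elasticity and a Coulomb-damping compensation
    # term on top of this ideal geometric relation.
    ```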

  6. VALIDATION OF SIMULATION MODELS FOR DIFFERENTLY DESIGNED HEAT-PIPE EVACUATED TUBULAR COLLECTORS

    DEFF Research Database (Denmark)

    Fan, Jianhua; Dragsted, Janne; Furbo, Simon

    2007-01-01

    Differently designed heat-pipe evacuated tubular collectors have been investigated theoretically and experimentally. The theoretical work has included development of two TRNSYS [1] simulation models for heat-pipe evacuated tubular collectors utilizing solar radiation from all directions. One model … coating on both sides. The input to the models is thus not a simple collector efficiency expression but the actual collector geometry. In this study, the TRNSYS models are validated with measurements for four differently designed heat-pipe evacuated tubular collectors. The collectors are produced … In all cases, a good degree of similarity between measured and calculated results is found. With these validated models, detailed parameter analyses and collector design optimization are now possible. Key words: evacuated tubular collector, heat pipe, thermal performance, TRNSYS simulation.

  7. DEVELOPMENT AND VALIDATION OF A MULTIFIELD MODEL OF CHURN-TURBULENT GAS/LIQUID FLOWS

    Energy Technology Data Exchange (ETDEWEB)

    Elena A. Tselishcheva; Steven P. Antal; Michael Z. Podowski; Donna Post Guillen

    2009-07-01

    The accuracy of numerical predictions for gas/liquid two-phase flows using Computational Multiphase Fluid Dynamics (CMFD) methods strongly depends on the formulation of models governing the interaction between the continuous liquid field and bubbles of different sizes. The purpose of this paper is to develop, test and validate a multifield model of adiabatic gas/liquid flows at intermediate gas concentrations (e.g., churn-turbulent flow regime), in which multiple-size bubbles are divided into a specified number of groups, each representing a prescribed range of sizes. The proposed modeling concept uses transport equations for the continuous liquid field and for each bubble field. The overall model has been implemented in the NPHASE-CMFD computer code. The results of NPHASE-CMFD simulations have been validated against the experimental data from the TOPFLOW test facility. Also, a parametric analysis on the effect of various modeling assumptions has been performed.

  8. Summarising and validating test accuracy results across multiple studies for use in clinical practice.

    Science.gov (United States)

    Riley, Richard D; Ahmed, Ikhlaaq; Debray, Thomas P A; Willis, Brian H; Noordzij, J Pieter; Higgins, Julian P T; Deeks, Jonathan J

    2015-06-15

    Following a meta-analysis of test accuracy studies, the translation of summary results into clinical practice is potentially problematic. The sensitivity, specificity and positive (PPV) and negative (NPV) predictive values of a test may differ substantially from the average meta-analysis findings, because of heterogeneity. Clinicians thus need more guidance: given the meta-analysis, is a test likely to be useful in new populations, and if so, how should test results inform the probability of existing disease (for a diagnostic test) or future adverse outcome (for a prognostic test)? We propose ways to address this. Firstly, following a meta-analysis, we suggest deriving prediction intervals and probability statements about the potential accuracy of a test in a new population. Secondly, we suggest strategies on how clinicians should derive post-test probabilities (PPV and NPV) in a new population based on existing meta-analysis results and propose a cross-validation approach for examining and comparing their calibration performance. Application is made to two clinical examples. In the first example, the joint probability that both sensitivity and specificity will be >80% in a new population is just 0.19, because of a low sensitivity. However, the summary PPV of 0.97 is high and calibrates well in new populations, with a probability of 0.78 that the true PPV will be at least 0.95. In the second example, post-test probabilities calibrate better when tailored to the prevalence in the new population, with cross-validation revealing a probability of 0.97 that the observed NPV will be within 10% of the predicted NPV.
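
    The second suggestion, tailoring post-test probabilities to the prevalence of the new population, follows directly from Bayes' theorem. A minimal sketch with illustrative numbers (not taken from the paper):

    ```python
    # Tailoring post-test probabilities (PPV, NPV) to the prevalence of a new
    # population from meta-analysis summary sensitivity and specificity, via
    # Bayes' theorem. The numbers below are illustrative assumptions.
    def post_test_probs(sens, spec, prev):
        ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
        npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
        return ppv, npv

    summary_sens, summary_spec = 0.65, 0.95   # hypothetical summary estimates
    for prev in (0.05, 0.20, 0.50):
        ppv, npv = post_test_probs(summary_sens, summary_spec, prev)
        print(f"prevalence={prev:.2f}: PPV={ppv:.3f}, NPV={npv:.3f}")
    # As the paper proposes, the calibration of such tailored estimates in new
    # populations can then be examined by cross-validation across studies.
    ```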

  9. Model-Based Verification and Validation of Spacecraft Avionics

    Science.gov (United States)

    Khan, Mohammed Omair

    2012-01-01

    Our simulation was able to mimic the results of 30 tests on the actual hardware, showing that simulations have the potential to enable early design validation, well before actual hardware exists. Although the simulations focused on data processing procedures at the subsystem and device level, they can also be applied to system-level analysis to simulate mission scenarios and consumable tracking (e.g. power, propellant, etc.). Simulation engine plug-in developments are continually improving the product, but the handling of time-sensitive operations (like those of the remote engineering unit and bus controller) can be cumbersome.

  10. Validation and modeling of earthquake strong ground motion using a composite source model

    Science.gov (United States)

    Zeng, Y.

    2001-12-01

    Zeng et al. (1994) have proposed a composite source model for synthetic strong ground motion prediction. In that model, the source is taken as a superposition of circular subevents with a constant stress drop. The number of subevents and their radii follow a power-law distribution equivalent to the Gutenberg-Richter magnitude-frequency relation for seismicity. The heterogeneous nature of the composite source model is characterized by its maximum subevent size and subevent stress drop. As rupture propagates through each subevent, it radiates a Brune pulse or a Sato-Hirasawa circular crack pulse. The method has proved successful in generating realistic strong motion seismograms, in comparison with observations from earthquakes in California, the eastern US, Guerrero in Mexico, Turkey and India. The model has since been improved by including scattering waves from the small-scale heterogeneity structure of the earth, site-specific ground motion prediction using weak-motion site amplification, and nonlinear soil response using geotechnical engineering models. Last year, I introduced an asymmetric circular rupture to improve the subevent source radiation and to provide a rupture model consistent between the overall fault rupture process and its subevents. In this study, I revisit the Landers, Loma Prieta, Northridge, Imperial Valley and Kobe earthquakes using the improved source model. The results show that the improved subevent ruptures provide a better rendering of rupture directivity than our previous studies. Additional validation includes comparison of synthetic strong ground motions with the observed ground accelerations from the Chi-Chi, Taiwan and Izmit, Turkey earthquakes. Since the method has evolved considerably since it was first proposed, I will also compare results between each major modification of the model and demonstrate its backward compatibility with its earlier simulation procedures.
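
    The subevent statistics described here are concrete enough to sketch. The following is a minimal illustration, not Zeng's code: it draws subevent radii from a truncated power-law number-size distribution by inverse-transform sampling and sums circular-crack moments under a constant stress drop; all parameter values are assumed.

    ```python
    import numpy as np

    # Sketch: draw composite-source subevent radii from a truncated power-law
    # (fractal) number-size distribution N(>R) ~ R^-D, consistent with a
    # Gutenberg-Richter magnitude-frequency relation. Parameters are assumed.
    rng = np.random.default_rng(42)
    D = 2.0                        # fractal dimension (assumed)
    R_MIN, R_MAX = 0.1e3, 5.0e3    # subevent radius bounds [m] (assumed)

    def sample_radii(n):
        """Inverse-transform sampling of the truncated Pareto pdf ~ R^-(D+1)."""
        u = rng.uniform(size=n)
        a, b = R_MIN ** -D, R_MAX ** -D
        return (a - u * (a - b)) ** (-1.0 / D)

    radii = sample_radii(500)
    # With constant stress drop, each circular subevent's seismic moment scales
    # as R^3 (M0 = 16/7 * stress_drop * R^3); summing gives the total moment
    # the subevents must reproduce.
    stress_drop = 3e6   # [Pa], assumed
    moments = (16.0 / 7.0) * stress_drop * radii ** 3
    print(f"largest R = {radii.max():.0f} m, total moment = {moments.sum():.3e} N m")
    ```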

  11. CMS standard model Higgs boson results

    Directory of Open Access Journals (Sweden)

    Garcia-Abia Pablo

    2013-11-01

    In July 2012 CMS announced the discovery of a new boson with properties resembling those of the long-sought Higgs boson. The analysis of the proton-proton collision data recorded by the CMS detector at the LHC, corresponding to integrated luminosities of 5.1 fb−1 at √s = 7 TeV and 19.6 fb−1 at √s = 8 TeV, confirms the Higgs-like nature of the new boson, with a signal strength associated with vector bosons and fermions consistent with the expectations for a standard model (SM) Higgs boson, and spin-parity clearly favouring the scalar nature of the new boson. In this note I review the updated results of the CMS experiment.

  12. Validation of coastal oceanographic models at Laxemar-Simpevarp. Site descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    Engqvist, Anders (A och I Engqvist Konsult HB, Vaxholm (SE)); Andrejev, Oleg (Finnish Inst. of Marine Research, Helsinki (FI))

    2008-12-15

    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterization at two different locations, the Forsmark and the Laxemar-Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. The characterization work is divided into an initial site investigation phase and a complete site investigation phase. In this context, the water exchange of the coastal zone is one link in the chain of possible nuclide transport mechanisms that must be assessed in the site description of potential repository areas. For the purpose of validating the pair of nested 3D-models and the coupled discrete basin (CDB) model employed to simulate the water exchange in the near-shore coastal zone in the Laxemar-Simpevarp area, an encompassing measurement program entailing data from six stations (of which two are close together) has been performed. The design of this program was first to assess to what degree the forcing of the fine-resolution (FR) model of the Laxemar-Simpevarp study area, at its interfacial boundary to the coarse-resolution (CR) model of the entire Baltic, was reproduced. In addition, it is of particular interest how the time-varying density-determining properties, salinity and temperature, at the borders are propagated into the FR-domain and further influence the water exchange with the interior, more secluded basins. An important part of the validation process has been to carefully evaluate which measurement data can be considered reliable. The result was that some periods of mainly near-surface salinity data had to be discarded due to growth of algae on the conductivity sensors; interference from ship traffic and lack of absolute calibration of the salinity meters necessitated dismissal of some measurement data too. In this study, so-called Mesan data have been consistently used for the meteorological forcing of the 3D-models. Relative to the data assessed as adequate, the outcome of the…

  13. Validation of Nonlinear Bipolar Transistor Model by Small-Signal Measurements

    DEFF Research Database (Denmark)

    Vidkjær, Jens; Porra, V.; Zhu, J.;

    1992-01-01

    A new method for the validity analysis of nonlinear transistor models is presented, based on DC and small-signal S-parameter measurements and realistic consideration of the measurement and de-embedding errors and of singularities of the small-signal equivalent circuit. As an example, analysis results for an extended Gummel-Poon model are presented for a UHF bipolar power transistor.

  14. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    Energy Technology Data Exchange (ETDEWEB)

    Weathers, J.B. [Shock, Noise, and Vibration Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: James.Weathers@ngc.com; Luck, R. [Department of Mechanical Engineering, Mississippi State University, 210 Carpenter Engineering Building, P.O. Box ME, Mississippi State, MS 39762-5925 (United States)], E-mail: Luck@me.msstate.edu; Weathers, J.W. [Structural Analysis Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: Jeffrey.Weathers@ngc.com

    2009-11-15

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions of the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant-probability contours associated with comparison-error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
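
    The covariance construction lends itself to a small illustration. The sketch below is a toy version under stated assumptions (two quantities of interest, one systematic error shared by both, Gaussian uncertainties with made-up magnitudes), not the paper's procedure verbatim:

    ```python
    import numpy as np

    # Sketch of the Monte Carlo procedure described: propagate random and
    # systematic uncertainties through the comparison, estimate the covariance
    # of the comparison error E = D - S, and inspect its off-diagonal structure.
    rng = np.random.default_rng(7)
    N = 20000
    d_meas = np.array([10.0, 4.0])      # measured quantities of interest (toy)
    s_pred = np.array([9.5, 4.4])       # model predictions of the same quantities

    sig_rand = np.array([0.30, 0.15])   # random uncertainties (assumed)
    sig_syst = 0.20                     # systematic uncertainty shared by both

    # Each realisation: one shared systematic draw plus independent random
    # draws, giving correlated perturbations of the comparison error.
    syst = rng.normal(0.0, sig_syst, size=(N, 1))
    rand = rng.normal(0.0, 1.0, size=(N, 2)) * sig_rand
    E = (d_meas - s_pred) + syst + rand

    cov = np.cov(E, rowvar=False)       # covariance matrix of the comparison error
    print("covariance matrix:\n", cov)
    # The off-diagonal terms come from the shared systematic error; ignoring
    # them (univariate statistics) misstates the 95% noise level.
    ```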

  15. Validation of PV-RPM Code in the System Advisor Model.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Geoffrey Taylor [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
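
    A minimal sketch of the failure-and-repair sampling idea is given below: a component alternates between operation and outage, with durations drawn from probability distributions, and Monte Carlo repetition yields expected failure counts over the simulation period. The exponential distributions and all parameter values are illustrative stand-ins, not the SNL-fitted PV-RPM distributions.

```python
import numpy as np

def simulate_component(mtbf_days, mttr_days, horizon_days, rng):
    """Alternating renewal process: exponential time-to-failure and
    time-to-repair as stand-ins for fitted failure/repair distributions.
    Returns the number of failures and the total downtime (days)."""
    t, failures, downtime = 0.0, 0, 0.0
    while True:
        t += rng.exponential(mtbf_days)          # run until next failure
        if t >= horizon_days:
            break
        failures += 1
        repair = rng.exponential(mttr_days)      # outage duration
        downtime += min(repair, horizon_days - t)
        t += repair
    return failures, downtime

rng = np.random.default_rng(1)
# One inverter-like component over a 25-year simulation, 5000 MC runs
runs = [simulate_component(3650.0, 7.0, 25 * 365.0, rng) for _ in range(5000)]
print("mean failures:", np.mean([f for f, _ in runs]),
      "mean downtime (days):", np.mean([d for _, d in runs]))
```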

  16. Modelling and validation of spectral reflectance for the colon

    Science.gov (United States)

    Hidovic-Rowe, Dzena; Claridge, Ela

    2005-03-01

    The spectral reflectance of the colon is known to be affected by malignant and pre-malignant changes in the tissue. As part of long-term research on the derivation of diagnostically important parameters characterizing colon histology, we have investigated the effects of the normal histological variability on the remitted spectra. This paper presents a detailed optical model of the normal colon comprising mucosa, submucosa and the smooth muscle layer. Each layer is characterized by five variable histological parameters: the volume fraction of blood, the haemoglobin saturation, the size of the scattering particles, including collagen, the volume fraction of the scattering particles and the layer thickness, and three optical parameters: the anisotropy factor, the refractive index of the medium and the refractive index of the scattering particles. The paper specifies the parameter ranges corresponding to normal colon tissue, including some previously unpublished ones. Diffuse reflectance spectra were modelled using the Monte Carlo method. Validation of the model-generated spectra against measured spectra demonstrated that good correspondence was achieved between the two. The analysis of the effect of the individual histological parameters on the behaviour of the spectra has shown that the spectral variability originates mainly from changes in the mucosa. However, the submucosa and the muscle layer must be included in the model as they have a significant constant effect on the spectral reflectance above 600 nm. The nature of variations in the spectra also suggests that it may be possible to carry out model inversion and to recover parameters characterizing the colon from multi-spectral images. A preliminary study, in which the mucosal blood and collagen parameters were modified to reflect histopathological changes associated with colon cancer, has shown that the spectra predicted by our model resemble measured spectral reflectance of adenocarcinomas. This suggests that

  17. Preliminary results of the Geoid Slope Validation Survey 2014 in Iowa

    Science.gov (United States)

    Wang, Y. M.; Becker, C.; Breidenbach, S.; Geoghegan, C.; Martin, D.; Winester, D.; Hanson, T.; Mader, G. L.; Eckl, M. C.

    2014-12-01

    The National Geodetic Survey conducted a second Geoid Slope Validation Survey in the summer of 2014 (GSVS14). The survey took place in Iowa along U.S. Route 30. The survey line is approximately 200 miles (325 km) long, extending from Denison, IA to Cedar Rapids, IA. There are over 200 official survey bench marks. A leveling survey was performed, conforming to 1st order, class II specifications. A GPS survey was performed using 24 to 48 hour occupations. Absolute gravity, relative gravity, and gravity gradient measurements were also collected during the survey. In addition, deflections of the vertical were acquired at 200 eccentric survey benchmarks using the Compact Digital Astrometric Camera (CODIAC). This paper presents the preliminary results of the survey, including the accuracy analysis of the leveling data, the GPS ellipsoidal heights, and the deflections of the vertical, which serve as an independent data set in addition to the GPS/leveling-implied geoid heights.

  18. Validation of elastic cross section models for space radiation applications

    Science.gov (United States)

    Werneth, C. M.; Xu, X.; Norman, R. B.; Ford, W. P.; Maung, K. M.

    2017-02-01

    The space radiation field is composed of energetic particles that pose both acute and long-term risks for astronauts in low earth orbit and beyond. In order to estimate radiation risk to crew members, the fluence of particles and biological response to the radiation must be known at tissue sites. Given that the spectral fluence at the boundary of the shielding material is characterized, radiation transport algorithms may be used to find the fluence of particles inside the shield and body, and the radio-biological response is estimated from experiments and models. The fidelity of the radiation spectrum inside the shield and body depends on radiation transport algorithms and the accuracy of the nuclear cross sections. In a recent study, self-consistent nuclear models based on multiple scattering theory that include the option to study relativistic kinematics were developed for the prediction of nuclear cross sections for space radiation applications. The aim of the current work is to use uncertainty quantification to ascertain the validity of the models as compared to a nuclear reaction database and to identify components of the models that can be improved in future efforts.

  19. Development and validation of a railgun hydrogen pellet injector model

    Energy Technology Data Exchange (ETDEWEB)

    King, T.L. [Univ. of Houston, TX (United States). Dept. of Electrical and Computer Engineering; Zhang, J.; Kim, K. [Univ. of Illinois, Urbana, IL (United States). Dept. of Electrical and Computer Engineering

    1995-12-31

    A railgun hydrogen pellet injector model is presented and its predictions are compared with the experimental data. High-speed hydrogenic ice injection is the dominant refueling method for magnetically confined plasmas used in controlled thermonuclear fusion research. As experimental devices approach the scale of power-producing fusion reactors, the fueling requirements become increasingly difficult to meet since, due to the large size and the high electron densities and temperatures of the plasma, hypervelocity pellets of a substantial size will need to be injected into the plasma continuously and at high repetition rates. Advanced technologies, such as the railgun pellet injector, are being developed to address this demand. Despite the apparent potential of electromagnetic launchers to produce hypervelocity projectiles, physical effects that were neither anticipated nor well understood have made it difficult to realize this potential. Therefore, it is essential to understand not only the theory behind railgun operation, but also the primary loss mechanisms. Analytic tools have been used by many researchers to design and optimize railguns and analyze their performance. This has led to a greater understanding of railgun behavior and opened the door for further improvement. A railgun hydrogen pellet injector model has been developed. The model is based upon a pellet equation of motion that accounts for the dominant loss mechanisms, inertial and viscous drag. The model has been validated using railgun pellet injectors developed by the Fusion Technology Research Laboratory at the University of Illinois at Urbana-Champaign.
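
    The kind of pellet equation of motion described, an electromagnetic driving force opposed by frictional and viscous (aerodynamic) drag, can be sketched as a simple time-stepping integration. The force model and every parameter value below are illustrative assumptions, not the authors' validated injector model.

```python
def railgun_pellet(current, L_prime, mass, mu_friction, C_drag,
                   rho_gas, area, barrel_length, dt=1e-6):
    """Integrate a simplified pellet equation of motion,
        m dv/dt = 0.5*L'*I^2 - F_friction - 0.5*rho*Cd*A*v^2,
    with friction taken as a fixed fraction of the driving force."""
    x = v = t = 0.0
    while x < barrel_length:
        f_em = 0.5 * L_prime * current**2            # rail driving force
        f_drag = 0.5 * rho_gas * C_drag * area * v**2
        a = ((1.0 - mu_friction) * f_em - f_drag) / mass
        v += a * dt
        x += v * dt
        t += dt
    return v, t

# 50 mg hydrogen pellet with hypothetical railgun parameters
v_exit, t_exit = railgun_pellet(current=20e3, L_prime=0.4e-6, mass=50e-6,
                                mu_friction=0.05, C_drag=0.5, rho_gas=0.1,
                                area=1e-5, barrel_length=1.0)
print(f"muzzle velocity ~ {v_exit:.0f} m/s after {t_exit*1e3:.2f} ms")
```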

  20. Using of Structural Equation Modeling Techniques in Cognitive Levels Validation

    Directory of Open Access Journals (Sweden)

    Natalija Curkovic

    2012-10-01

    Full Text Available When constructing knowledge tests, cognitive level is usually one of the dimensions comprising the test specifications, with each item assigned to measure a particular level. Recently used taxonomies of the cognitive levels most often represent some modification of the original Bloom’s taxonomy. There are many concerns in the current literature about the existence of predefined cognitive levels. The aim of this article is to investigate whether structural equation modeling techniques can confirm the existence of different cognitive levels. For the purpose of the research, a Croatian final high-school Mathematics exam was used (N = 9626). Confirmatory factor analysis and structural regression modeling were used to test three different models. Structural equation modeling techniques did not support the existence of different cognitive levels in this case. There is more than one possible explanation for that finding. Some other techniques that take into account nonlinear behaviour of the items, as well as qualitative techniques, might be more useful for the purpose of cognitive level validation. Furthermore, it seems that cognitive levels were not efficient descriptors of the items, and so improvements are needed in describing the cognitive skills measured by items.

  1. Assessment of Galileo modal test results for mathematical model verification

    Science.gov (United States)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity, discovered in the course of the test, necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test-verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  2. Validation Testing of a Peridynamic Impact Damage Model Using NASA's Micro-Particle Gun

    Science.gov (United States)

    Baber, Forrest E.; Zelinski, Brian J.; Guven, Ibrahim; Gray, Perry

    2017-01-01

    Through a collaborative effort between the Virginia Commonwealth University and Raytheon, a peridynamic model for sand impact damage has been developed [1-3]. Model development has focused on simulating impacts of sand particles on ZnS traveling at velocities consistent with aircraft take-off and landing speeds. The model reproduces common features of impact damage including pit and radial cracks, and, under some conditions, lateral cracks. This study focuses on a preliminary validation exercise in which simulation results from the peridynamic model are compared to a limited experimental data set generated by NASA's recently developed micro-particle gun (MPG). The MPG facility measures the dimensions and incoming and rebound velocities of the impact particles. It also links each particle to a specific impact site and its associated damage. In this validation exercise parameters of the peridynamic model are adjusted to fit the experimentally observed pit diameter, average length of radial cracks and rebound velocities for 4 impacts of 300 µm glass beads on ZnS. Results indicate that a reasonable fit of these impact characteristics can be obtained by suitable adjustment of the peridynamic input parameters, demonstrating that the MPG can be used effectively as a validation tool for impact modeling and that the peridynamic sand impact model described herein possesses not only a qualitative but also a quantitative ability to simulate sand impact events.

  3. Full-Waveform Validation of a 3D Seismic Model for Western US

    Science.gov (United States)

    Maceira, M.; Larmat, C. S.; Ammon, C. J.; Chai, C.; Herrmann, R. B.

    2014-12-01

    Since the initiation of tomographic studies in the 1970s, geoscientists have advanced the art of inferring 3D variations in the subsurface using collections of geophysical (primarily seismic) observables recorded at or near Earth's surface. Advances have come from improvement and enhancement of the available data and from research on theoretical and computational improvements to tomographic and generalized inverse methods. In the last decade, utilizing dense array datasets, these efforts have led to unprecedented 3D images of the subsurface. Understandably, less effort has been expended on model validation to provide an absolute assessment of model uncertainty. Generally, models constructed with different data sets and independent computational codes are assessed for geological reasonability and compared with other models to gain confidence. The question of "How good is a particular 3D geophysical model at representing the Earth's true nature?" remains largely unaddressed at a time when 3D Earth models are used for both societal and energy security. In the last few years, opportunities have arisen in earth-structure imaging, including the advent of new methods in computational seismology and statistical sciences. We use the unique and extensive High Performance Computing resources available at Los Alamos National Laboratory to explore approaches to realistic model validation. We present results from a study focused on validating a 3D model for the western United States generated using a joint inversion simultaneously fitting interpolated teleseismic P-wave receiver functions, Rayleigh-wave group-velocity estimates between 7 and 250 s period, and high-wavenumber-filtered Bouguer gravity observations. Validation of the obtained model is performed through systematic comparison of observed and predicted seismograms generated using the Spectral Element Method, which is a direct numerical solution for full waveform modeling in 3D models, with the accuracy of spectral methods.

  4. Validation of coastal oceanographic models at Laxemar-Simpevarp. Site descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    Engqvist, Anders (A och I Engqvist Konsult HB, Vaxholm (SE)); Andrejev, Oleg (Finnish Inst. of Marine Research, Helsinki (FI))

    2008-12-15

    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterization at two different locations, the Forsmark and the Laxemar-Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. The characterization work is divided into an initial site investigation phase and a complete site investigation phase. In this context, the water exchange of the coastal zone is one link of the chain of possible nuclide transport mechanisms that must be assessed in the site description of potential repository areas. For the purpose of validating the pair of nested 3D-models and the coupled discrete basin (CDB-) model employed to simulate the water exchange in the near-shore coastal zone in the Laxemar-Simpevarp area, an encompassing measurement program entailing data from six stations (of which two are close) has been performed. The design of this program was to first assess to what degree the forcing of the fine resolution (FR-) model of the Laxemar-Simpevarp study area at its interfacial boundary to the coarse resolution (CR-) model of the entire Baltic was reproduced. In addition to this, it is of particular interest how the time-varying density-determining properties, salinity and temperature, at the borders are propagated into the FR-domain and further influence the water exchange with the interior, more secluded, basins. An important part of the validation process has been to carefully evaluate which measurement data can be considered reliable. The result was that some periods of foremost near-surface salinity data had to be discarded due to growth of algae on the conductivity sensors. Interference with ship traffic and lack of absolute calibration of the salinity meters also necessitated dismissal of measurement data. In this study so-called Mesan data have been consistently used for the meteorological forcing of the 3D-models. Relative to the assessed data that can be accepted as adequate, the outcome of the

  5. Is my model good enough? Best practices for verification and validation of musculoskeletal models and simulations of movement.

    Science.gov (United States)

    Hicks, Jennifer L; Uchida, Thomas K; Seth, Ajay; Rajagopal, Apoorva; Delp, Scott L

    2015-02-01

    Computational modeling and simulation of neuromusculoskeletal (NMS) systems enables researchers and clinicians to study the complex dynamics underlying human and animal movement. NMS models use equations derived from physical laws and biology to help solve challenging real-world problems, from designing prosthetics that maximize running speed to developing exoskeletal devices that enable walking after a stroke. NMS modeling and simulation has proliferated in the biomechanics research community over the past 25 years, but the lack of verification and validation standards remains a major barrier to wider adoption and impact. The goal of this paper is to establish practical guidelines for verification and validation of NMS models and simulations that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies. In particular, we review a general process for verification and validation applied to NMS models and simulations, including careful formulation of a research question and methods, traditional verification and validation steps, and documentation and sharing of results for use and testing by other researchers. Modeling the NMS system and simulating its motion involves methods to represent neural control, musculoskeletal geometry, muscle-tendon dynamics, contact forces, and multibody dynamics. For each of these components, we review modeling choices and software verification guidelines; discuss variability, errors, uncertainty, and sensitivity relationships; and provide recommendations for verification and validation by comparing experimental data and testing robustness. We present a series of case studies to illustrate key principles. In closing, we discuss challenges the community must overcome to ensure that modeling and simulation are successfully used to solve the broad spectrum of problems that limit human mobility.

  6. Use of Synchronized Phasor Measurements for Model Validation in ERCOT

    Science.gov (United States)

    Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill

    2013-05-01

    This paper discusses experiences in the use of synchronized phasor measurement technology in the Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMUs) have been installed in the ERCOT grid in recent years, phasor data at a resolution of 30 samples per second are being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. The real-time monitoring software "RTDMS"® enables ERCOT to analyze small signal stability conditions by monitoring the phase angles and oscillations. The recorded phasor data enable ERCOT to validate the existing dynamic models of conventional and wind generators.

  7. Multi-criteria validation of artificial neural network rainfall-runoff modeling

    Directory of Open Access Journals (Sweden)

    R. Modarres

    2008-12-01

    Full Text Available In this study we propose a comprehensive multi-criteria validation test for rainfall-runoff modeling by artificial neural networks. This study applies 17 global statistics and 3 additional non-parametric tests to evaluate the ANNs. The weakness of global statistics for validation of ANNs is demonstrated by rainfall-runoff modeling of the Plasjan Basin in the western region of the Zayandehrud watershed, Iran. Although the global statistics showed that the multi-layer perceptron with 4 hidden layers (MLP4) is the best ANN for the basin compared with other MLP networks and an empirical regression model, the non-parametric tests illustrate that neither the ANNs nor the regression model are able to reproduce the probability distribution of observed runoff in the validation phase. However, the MLP4 network is the best network to reproduce the mean and variance of the observed runoff based on non-parametric tests. The performance of the ANNs and the empirical model was also examined for low-medium and high flows. Although the MLP4 network gives the best performance among the ANNs for low-medium and high flows based on different statistics, the empirical model shows better results. However, none of the models is able to simulate the frequency distribution of low-medium and high flows according to the non-parametric tests. This study illustrates that modelers should select appropriate and relevant evaluation measures from the set of existing metrics based on the particular requirements of each individual application.
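
    The study's central point, that good global statistics can coexist with a failed distribution test, is easy to reproduce: compute a few common global statistics (RMSE, Nash-Sutcliffe efficiency, correlation) alongside a two-sample Kolmogorov-Smirnov test on observed and simulated runoff. The synthetic series below is illustrative, not the Plasjan Basin data.

```python
import numpy as np
from scipy import stats

def validation_metrics(obs, sim):
    """Global statistics plus one non-parametric distribution test."""
    err = sim - obs
    rmse = float(np.sqrt(np.mean(err**2)))
    nse = 1.0 - np.sum(err**2) / np.sum((obs - obs.mean())**2)  # Nash-Sutcliffe
    r = float(np.corrcoef(obs, sim)[0, 1])
    ks_stat, ks_p = stats.ks_2samp(obs, sim)   # do the distributions match?
    return {"RMSE": rmse, "NSE": float(nse), "r": r,
            "KS": float(ks_stat), "KS_p": float(ks_p)}

rng = np.random.default_rng(2)
obs = rng.lognormal(mean=1.0, sigma=0.8, size=500)    # skewed "runoff" series
sim = 0.9 * obs + rng.normal(0.0, 0.3, size=500)      # slightly biased model
print(validation_metrics(obs, sim))  # high NSE and r, yet KS may still reject
```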

  8. Numerical modelling of transdermal delivery from matrix systems: parametric study and experimental validation with silicone matrices.

    Science.gov (United States)

    Snorradóttir, Bergthóra S; Jónsdóttir, Fjóla; Sigurdsson, Sven Th; Másson, Már

    2014-08-01

    A model is presented for transdermal drug delivery from single-layered silicone matrix systems. The work is based on our previous results that, in particular, extend the well-known Higuchi model. Recently, we have introduced a numerical transient model describing matrix systems where the drug dissolution can be non-instantaneous. Furthermore, our model can describe complex interactions within a multi-layered matrix and the matrix to skin boundary. The power of the modelling approach presented here is further illustrated by allowing the possibility of a donor solution. The model is validated by a comparison with experimental data, as well as validating the parameter values against each other, using various configurations with donor solution, silicone matrix and skin. Our results show that the model is a good approximation to real multi-layered delivery systems. The model offers the ability of comparing drug release for ibuprofen and diclofenac, which cannot be analysed by the Higuchi model because the dissolution in the latter case turns out to be limited. The experiments and numerical model outlined in this study could also be adjusted to more general formulations, which enhances the utility of the numerical model as a design tool for the development of drug-loaded matrices for trans-membrane and transdermal delivery.
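
    As a rough sketch of the kind of computation involved, the following solves 1-D Fickian diffusion out of a drug-loaded slab with an explicit finite-difference scheme (perfect sink on the release side, no flux at the backing). It is a generic diffusion example with hypothetical parameter values; the authors' model additionally treats non-instantaneous dissolution, multi-layer coupling, and a donor solution.

```python
import numpy as np

def matrix_release(D, L, c0, t_end, nx=101, safety=0.4):
    """Fraction of drug released from a slab of thickness L (m) with
    diffusivity D (m^2/s) and uniform initial loading c0, solving
    dc/dt = D d2c/dx2 by explicit finite differences."""
    dx = L / (nx - 1)
    dt = safety * dx**2 / D                    # explicit stability limit
    c = np.full(nx, float(c0))
    c[-1] = 0.0                                # perfect sink at x = L
    released, t = 0.0, 0.0
    while t < t_end:
        released += D * c[-2] / dx * dt        # flux through the sink face
        new = c.copy()
        new[1:-1] += dt * D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
        new[0] += dt * D * 2.0 * (c[1] - c[0]) / dx**2   # no-flux at x = 0
        new[-1] = 0.0
        c, t = new, t + dt
    return released / (c0 * L)

# Order-of-magnitude values for a drug in a 1 mm silicone slab
print(f"fraction released after 24 h: {matrix_release(1e-11, 1e-3, 1.0, 86400.0):.2f}")
```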

  9. Quality of life and hormone use: new validation results of MRS scale

    Directory of Open Access Journals (Sweden)

    Heinemann Lothar AJ

    2006-05-01

    Full Text Available Abstract Background The Menopause Rating Scale (MRS) is a health-related quality of life scale developed in the early 1990s and validated step by step since then. Recently the MRS scale was validated as an outcome measure for hormone therapy. The suspicion, however, was expressed that the data were too optimistic due to methodological problems of the study. A new study became available to check how well founded this suspicion was. Method An open post-marketing study of 3282 women with pre- and post-treatment data of the self-administered version of the MRS scale was analyzed to evaluate the capacity of the scale to detect hormone treatment related effects. The main results were then compared with the old study, where the interview-based version of the MRS scale was used. Results The hormone-therapy related improvement of complaints relative to the baseline score was about 30% or less in total or domain scores, whereas it exceeded 30% improvement in the old study. Similarly, the relative improvement after therapy, stratified by the degree of severity at baseline, was lower in the new than in the old study, but had the same slope. Although we cannot exclude different treatment effects with the study method used, this supports our hypothesis that the individual MRS interviews performed by the physician biased the results towards over-estimation of the treatment effects. This hypothesis is underlined by the degree of concordance between the physician's assessment and the patient's perception of treatment success: sensitivity (correct prediction of a positive assessment by the treating physician) and specificity (correct prediction of a negative assessment by the physician) of the MRS were lower than the results obtained with the interview-based MRS scale in the previous publication. Conclusion The study confirmed evidence for the capacity of the MRS scale to measure treatment effects on quality of life across the full range of severity of

  10. Results from the radiometric validation of Sentinel-3 optical sensors using natural targets

    Science.gov (United States)

    Fougnie, Bertrand; Desjardins, Camille; Besson, Bruno; Bruniquel, Véronique; Meskini, Naceur; Nieke, Jens; Bouvet, Marc

    2016-09-01

    The recently launched SENTINEL-3 mission measures sea surface topography, sea/land surface temperature, and ocean/land surface colour with high accuracy. The mission provides data continuity with the ENVISAT mission through acquisitions by multiple sensing instruments. Two of them, OLCI (Ocean and Land Colour Imager) and SLSTR (Sea and Land Surface Temperature Radiometer), are optical sensors designed to provide continuity with Envisat's MERIS and AATSR instruments. During the commissioning, in-orbit calibration and validation activities are conducted. The instruments are in-flight calibrated and characterized primarily using on-board devices, which include diffusers and a black body. Afterwards, vicarious calibration methods are used in order to validate the OLCI and SLSTR radiometry for the reflective bands. The calibration can be checked over dedicated natural targets such as Rayleigh scattering, sunglint, desert sites, Antarctica, and tentatively deep convective clouds. Tools have been developed and/or adapted (S3ETRAC, MUSCLE) to extract and process Sentinel-3 data. Based on these matchups, it is possible to provide an accurate check of many radiometric aspects such as the absolute and interband calibrations, the trending correction, and the calibration consistency within the field-of-view; more generally, this will provide an evaluation of the radiometric consistency for various types of targets. Another important aspect will be the checking of cross-calibration against many other instruments such as MERIS and AATSR (bridge between ENVISAT and Sentinel-3), MODIS (bridge to the GSICS radiometric standard), as well as Sentinel-2 (bridge between Sentinel missions). The early results, based on the available OLCI and SLSTR data, will be presented and discussed.

  11. Validating a dance-specific screening test for balance: preliminary results from multisite testing.

    Science.gov (United States)

    Batson, Glenna

    2010-09-01

    Few dance-specific screening tools adequately capture balance. The aim of this study was to administer and modify the Star Excursion Balance Test (oSEBT) to examine its utility as a balance screen for dancers. The oSEBT involves standing on one leg while lightly targeting with the opposite foot to the farthest distance along eight spokes of a star-shaped grid. This task simulates dance in the spatial pattern and movement quality of the gesturing limb. The oSEBT was validated for distance on athletes with a history of ankle sprain. Thirty-three dancers (age 20.1 +/- 1.4 yrs) participated from two contemporary dance conservatories (UK and US), with or without a history of lower extremity injury. Dancers were verbally instructed (without physical demonstration) to execute the oSEBT and four modifications (mSEBT): timed (speed), timed with cognitive interference (answering questions aloud), and sensory disadvantaging (foam mat). Stepping strategies were tracked and performance strategies video-recorded. Unlike the oSEBT results, distances reached were not significant statistically (p = 0.05) or descriptively (i.e., shorter) for either group. Performance styles varied widely, despite sample homogeneity and instructions to control for strategy. Descriptive analysis of the mSEBT showed an increased number of near-falls and decreased timing on the injured limb. Dancers appeared to employ variable strategies to maintain balance during this test. Quantitative analysis is warranted to define balance strategies and to further validate the SEBT modifications and determine their utility as a balance screening tool.

  12. Prediction Models and Their External Validation Studies for Mortality of Patients with Acute Kidney Injury: A Systematic Review

    Science.gov (United States)

    Ohnuma, Tetsu; Uchino, Shigehiko

    2017-01-01

    Objectives To systematically review AKI outcome prediction models and their external validation studies, to describe the discrepancy in reported accuracy between the results of internal and external validations, and to identify variables frequently included in the prediction models. Methods We searched the MEDLINE and Web of Science electronic databases (until January 2016). Studies were eligible if they derived a model to predict mortality of AKI patients or externally validated at least one of the prediction models, and presented the area under the receiver-operating characteristic curve (AUROC) to assess model discrimination. Studies were excluded if they described only results of logistic regression without reporting a scoring system, or if a prediction model was generated from a specific cohort. Results A total of 2204 potentially relevant articles were found and screened, of which 12 articles reporting original prediction models for hospital mortality in AKI patients and nine articles assessing external validation were selected. Among the 21 studies of AKI prediction models and their external validation, 12 were single-center (57%), and only three included more than 1,000 patients (14%). The definition of AKI was not uniform and none used recently published consensus criteria for AKI. Although good performance was reported in their internal validation, most of the prediction models had poor discrimination with an AUROC below 0.7 in the external validation studies. There were 10 common non-renal variables that were reported in more than three prediction models: mechanical ventilation, age, gender, hypotension, liver failure, oliguria, sepsis/septic shock, low albumin, consciousness and low platelet count. Conclusions Information in this systematic review should be useful for future prediction model derivation by providing potential candidate predictors, and for future external validation by listing the published prediction models. PMID:28056039
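
    The external-validation step the review describes, applying a published score to an independent cohort and reporting the AUROC, reduces to a few lines of code. The additive score and simulated cohort below are hypothetical, built only from predictor names the review lists as common; as noted above, real external AUROCs were mostly below 0.7.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def external_auroc(score_fn, cohort):
    """Discrimination of a published risk score on an external cohort:
    `cohort` is a list of (patient_record, died) pairs."""
    scores = np.array([score_fn(rec) for rec, _ in cohort])
    outcomes = np.array([died for _, died in cohort])
    return roc_auc_score(outcomes, scores)

def toy_score(rec):
    """Hypothetical additive score over predictors the review found common."""
    return (2.0 * rec["mech_vent"] + 0.03 * rec["age"]
            + rec["hypotension"] + rec["oliguria"] + 2.0 * rec["sepsis"])

rng = np.random.default_rng(3)
cohort = []
for _ in range(400):
    rec = {"mech_vent": int(rng.integers(0, 2)), "age": int(rng.integers(20, 90)),
           "hypotension": int(rng.integers(0, 2)), "oliguria": int(rng.integers(0, 2)),
           "sepsis": int(rng.integers(0, 2))}
    p = 1.0 / (1.0 + np.exp(-(toy_score(rec) - 4.5)))   # simulated true risk
    cohort.append((rec, int(rng.random() < p)))
print(f"external AUROC: {external_auroc(toy_score, cohort):.2f}")
```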

  13. Radiative transfer model for contaminated slabs : experimental validations

    CERN Document Server

    Andrieu, François; Schmitt, Bernard; Douté, Sylvain; Brissaud, Olivier

    2015-01-01

    This article presents a set of spectro-goniometric measurements of different water ice samples and the comparison with an approximated radiative transfer model. The experiments were done using the spectro-radiogoniometer described in Brissaud et al. (2004). The radiative transfer model assumes an isotropization of the flux after the second interface and is fully described in Andrieu et al. (2015). Two kinds of experiments were conducted. First, the specular spot was closely investigated, at high angular resolution, at the wavelength of 1.5 µm, where ice behaves as a very absorbing medium. Second, the bidirectional reflectance was sampled at various geometries, including low phase angles, on 61 wavelengths ranging from 0.8 µm to 2.0 µm. In order to validate the model, we made a qualitative test to demonstrate the relative isotropization of the flux. We also conducted quantitative assessments by using a Bayesian inversion method in order to estimate the parameters (e.g. sampl...

  14. Comparison of fully coupled hydroelastic computation and segmented model test results for slamming and whipping loads

    Directory of Open Access Journals (Sweden)

    Jung-Hyun Kim

    2014-12-01

    Full Text Available This paper presents a numerical analysis of slamming and whipping using a fully coupled hydroelastic model. The coupled model uses a 3-D Rankine panel method, a 1-D or 3-D finite element method, and a 2-D Generalized Wagner Model (GWM), which are strongly coupled in the time domain. First, the GWM is validated against results of free drop tests of wedges. Second, the fully coupled method is validated against model test results for a 10,000 twenty-foot equivalent unit (TEU) containership. Slamming pressures and whipping responses to regular waves are compared. A spatial distribution of local slamming forces is measured using 14 force sensors in the model test, and it is compared with the integration of the pressure distribution from the computation. Furthermore, in the computational results, the pressure is decomposed into added mass, impact, and hydrostatic components. The validity and characteristics of the numerical model are discussed.

  15. [Development and validation of a finite element model of human knee joint for dynamic analysis].

    Science.gov (United States)

    Li, Haiyan; Gu, Yulong; Ruan, Shijie; Cui, Shihai

    2012-02-01

    Based on the biomechanical response of the human knee joint to frontal impact in occupant accidents, a finite element (FE) model of the human knee joint was developed using computer simulation techniques. The model reproduces the human anatomical structure, including the femoral condyle, tibial condyle, fibular head, patella, cartilage, menisci and primary ligaments. The model was validated by comparing the results of the FE model with experiments on the knee joint under axial loading conditions. Furthermore, this study provides data on the mechanics of human knee joint injury, and is helpful for the design and optimization of vehicle protective devices.

  16. Identification of reduced-order thermal therapy models using thermal MR images: theory and validation.

    Science.gov (United States)

    Niu, Ran; Skliar, Mikhail

    2012-07-01

    In this paper, we develop and validate a method to identify computationally efficient site- and patient-specific models of ultrasound thermal therapies from MR thermal images. The models of the specific absorption rate of the transduced energy and the temperature response of the therapy target are identified in the reduced basis of proper orthogonal decomposition of thermal images, acquired in response to a mild thermal test excitation. The method permits dynamic reidentification of the treatment models during the therapy by recursively utilizing newly acquired images. Such adaptation is particularly important during high-temperature therapies, which are known to substantially and rapidly change tissue properties and blood perfusion. The developed theory was validated for the case of focused ultrasound heating of a tissue phantom. The experimental and computational results indicate that the developed approach produces accurate low-dimensional treatment models despite temporal and spatial noises in MR images and slow image acquisition rate.
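
    The reduced-basis identification rests on proper orthogonal decomposition of the thermal image snapshots, which can be sketched as a truncated SVD of the mean-centered snapshot matrix. The synthetic "images" and the 99% energy threshold below are illustrative assumptions, not the paper's experimental data.

```python
import numpy as np

def pod_basis(snapshots, energy=0.99):
    """POD of image snapshots (one flattened image per column): truncated
    SVD of the mean-centered snapshot matrix, keeping enough modes to
    capture the requested fraction of snapshot energy."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cum, energy)) + 1
    return mean, U[:, :r]

def project(mean, basis, image):
    """Reduced coordinates of a new image in the POD basis."""
    return basis.T @ (image - mean.ravel())

# 50 noisy 64x64 "thermal images" of a drifting Gaussian hot spot
rng = np.random.default_rng(4)
x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
frames = [np.exp(-((x - 0.3 * np.sin(k / 8.0))**2 + y**2) / 0.1).ravel()
          + 0.01 * rng.standard_normal(64 * 64) for k in range(50)]
mean, basis = pod_basis(np.column_stack(frames))
coeffs = project(mean, basis, frames[0])     # low-dimensional state
print(f"retained {basis.shape[1]} of {len(frames)} modes")
```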

  17. Calibration and validation of a model describing complete autotrophic nitrogen removal in a granular SBR system

    DEFF Research Database (Denmark)

    Vangsgaard, Anna Katrine; Mutlu, Ayten Gizem; Gernaey, Krist

    2013-01-01

    -up. RESULTS: A model was calibrated using a step-wise procedure customized for the specific needs of the system. The important steps in the procedure were initialization, steady-state and dynamic calibration, and validation. A fast and effective initialization approach was developed to approximate pseudo... screening of the parameter space proposed by Sin et al. (2008) to find the best fit of the model to dynamic data. Finally, the calibrated model was validated with an independent data set. CONCLUSION: The presented calibration procedure is the first customized procedure for this type of system... and is expected to contribute to achieving a fast and effective model calibration, an important enabling tool for various biochemical engineering design, control and operation problems.

  18. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Full Text Available Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  19. Development and experimental validation of a mechanistic model of in vitro DNA recombination.

    Science.gov (United States)

    Bowyer, Jack; Jia Zhao; Rosser, Susan; Colloms, Sean; Bates, Declan

    2015-08-01

    Engineering cellular memory is a key area of research in which Synthetic Biology has already begun to make significant impacts. Recent work elucidating transcriptional memory devices has paved the way for the creation of bistable genetic switches based on DNA recombination. Attempts to experimentally design and build synthetic systems using recombinases have thus far been hindered by a lack of validated computational models that capture the mechanistic basis of DNA recombination. The predictive capabilities of such models could be exploited by Synthetic Biologists to reduce the number of iterative cycles required to align experimental results with design performance requirements. Here, we develop and validate the first detailed mechanistic model of DNA recombination, with a focus on how efficiently recombination can occur, and the model features required to replicate and predict experimental data.

  20. Groundwater Model Validation for the Project Shoal Area, Corrective Action Unit 447

    Energy Technology Data Exchange (ETDEWEB)

    None

    2008-05-19

    Stoller has examined newly collected water level data in multiple wells at the Shoal site. On the basis of these data and information presented in the report, we are currently unable to confirm that the model is successfully validated. Most of our concerns regarding the model stem from two findings: (1) measured water level data do not provide clear evidence of a prevailing lateral flow direction; and (2) the groundwater flow system has been and continues to be in a transient state, which contrasts with assumed steady-state conditions in the model. The results of DRI's model validation efforts and observations made regarding water level behavior are discussed in the following sections. A summary of our conclusions and recommendations for a path forward are also provided in this letter report.

  1. Transient validation of RELAP5 model with the DISS facility in once through operation mode

    Science.gov (United States)

    Serrano-Aguilera, J. J.; Valenzuela, L.

    2016-05-01

    The thermal-hydraulic code RELAP5 has been used to model a solar Direct Steam Generation (DSG) system. Experimental data from the DISS facility located at the Plataforma Solar de Almería are compared with the numerical results of the RELAP5 model in order to validate it. Both the model and the experimental set-up are in once-through operation mode, where no injection or active control is applied. Time-dependent boundary conditions are taken into account. This work is a preliminary step in further research that will be carried out in order to achieve a thorough validation of RELAP5 models in the context of DSG in line-focus solar collectors.

  2. Revisiting Runoff Model Calibration: Airborne Snow Observatory Results Allow Improved Modeling Results

    Science.gov (United States)

    McGurk, B. J.; Painter, T. H.

    2014-12-01

    Deterministic snow accumulation and ablation simulation models are widely used by runoff managers throughout the world to predict runoff quantities and timing. Model fitting is typically based on matching modeled runoff volumes and timing with observed flow time series at a few points in the basin. In recent decades, sparse networks of point measurements of the mountain snowpacks have been available to compare with modeled snowpack, but the comparability of results from a snow sensor or course to model polygons of 5 to 50 sq. km is suspect. However, snowpack extent, depth, and derived snow water equivalent have been produced by the NASA/JPL Airborne Snow Observatory (ASO) mission for spring of 2013 and 2014 in the Tuolumne River basin above Hetch Hetchy Reservoir. These high-resolution snowpack data have exposed the weakness in a model calibration based on runoff alone. The U.S. Geological Survey's Precipitation Runoff Modeling System (PRMS) calibration that was based on 30 years of inflow to Hetch Hetchy produces reasonable inflow results, but modeled spatial snowpack location and water quantity diverged significantly from the weekly measurements made by ASO during the two ablation seasons. The reason is that the PRMS model has many flow paths, storages, and water transfer equations, and a calibrated outflow time series can be right for many wrong reasons. The addition of detailed knowledge of snow extent and water content constrains the model so that it is a better representation of the actual watershed hydrology. The mechanics of recalibrating PRMS to the ASO measurements will be described, and comparisons of observed versus modeled flow for both a small subbasin and the entire Hetch Hetchy basin will be shown. The recalibrated model provided a better fit to the snowmelt recession, a key factor for water managers as they balance declining inflows with demand for power generation and ecosystem releases during the final months of snowmelt runoff.

  3. Evaluating the validity of spectral calibration models for quantitative analysis following signal preprocessing.

    Science.gov (United States)

    Chen, Da; Grant, Edward

    2012-11-01

    When paired with high-powered chemometric analysis, spectrometric methods offer great promise for the high-throughput analysis of complex systems. Effective classification or quantification often relies on signal preprocessing to reduce spectral interference and optimize the apparent performance of a calibration model. However, less frequently addressed by systematic research is the effect of preprocessing on the statistical accuracy of a calibration result. The present work demonstrates the effectiveness of two criteria for validating the performance of signal preprocessing in multivariate models in the important dimensions of bias and precision. To assess the extent of bias, we explore the applicability of the elliptic joint confidence region (EJCR) test and devise a new means to evaluate precision by a bias-corrected root mean square error of prediction. We show how these criteria can effectively gauge the success of signal pretreatments in suppressing spectral interference while providing a straightforward means to determine the optimal level of model complexity. This methodology offers a graphical diagnostic by which to visualize the consequences of pretreatment on complex multivariate models, enabling optimization with greater confidence. To demonstrate the application of the EJCR criterion in this context, we evaluate the validity of representative calibration models using standard pretreatment strategies on three spectral data sets. The results indicate that the proposed methodology facilitates the reliable optimization of a well-validated calibration model, thus improving the capability of spectrophotometric analysis.
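
    A compact sketch of the two criteria is possible under standard formulations: the EJCR test checks whether the joint confidence ellipse for the (intercept, slope) regression of nominal on predicted values contains the ideal point (0, 1), and precision is summarized by an RMSEP computed after subtracting the mean error. The data below are synthetic, and the statistic follows the usual least-squares EJCR form rather than anything specific to this paper.

```python
import numpy as np
from scipy import stats

def ejcr_contains_ideal(nominal, predicted, alpha=0.05):
    """EJCR test: regress nominal on predicted values and check whether
    the joint (intercept, slope) confidence ellipse,
        (b - b0)^T X^T X (b - b0) <= 2 s^2 F_{1-alpha}(2, n-2),
    contains the ideal point b0 = (0, 1), i.e. no systematic bias."""
    X = np.column_stack([np.ones_like(predicted), predicted])
    beta, res, *_ = np.linalg.lstsq(X, nominal, rcond=None)
    n = len(nominal)
    s2 = float(res[0]) / (n - 2)                       # residual variance
    d = beta - np.array([0.0, 1.0])
    return d @ (X.T @ X) @ d / (2.0 * s2) <= stats.f.ppf(1 - alpha, 2, n - 2)

def bias_corrected_rmsep(nominal, predicted):
    """RMSEP after removing the mean comparison error (the bias)."""
    err = predicted - nominal
    return float(np.sqrt(np.mean((err - err.mean())**2)))

rng = np.random.default_rng(5)
nominal = rng.uniform(1.0, 10.0, 40)
predicted = 0.97 * nominal + 0.1 + rng.normal(0.0, 0.2, 40)
print(ejcr_contains_ideal(nominal, predicted),
      round(bias_corrected_rmsep(nominal, predicted), 3))
```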

  4. Psychometric validation of the consensus five-factor model of the Positive and Negative Syndrome Scale.

    Science.gov (United States)

    Fong, Ted C T; Ho, Rainbow T H; Wan, Adrian H Y; Siu, Pantha Joey C Y; Au-Yeung, Friendly S W

    2015-10-01

    The Positive and Negative Syndrome Scale (PANSS) is widely used for clinical assessment of symptoms in schizophrenia. Instead of the traditional pyramidal model, recent literature supports the pentagonal model for the dimensionality of the PANSS. The present study aimed to validate the consensus five-factor model of the PANSS and evaluate its convergent validity. Participants were 146 Chinese chronic schizophrenic patients who completed diagnostic interviews and cognitive assessments. Exploratory structural equation modeling (ESEM) was performed to investigate the dimensionality of the PANSS. Covariates (age, sex, and education level) and concurrent outcomes (perceived stress, memory, daily living functions, and motor deficits) were added in the ESEM model. The results supported the consensus 5-factor underlying structure, which comprised 20 items categorized into positive, negative, excitement, depression, and cognitive factors with acceptable reliability (α=.69-.85) and strong factor loadings (λ=.41-.93). The five factors, especially the cognitive factor, showed evident convergent validity with the covariates and concurrent outcomes. The results support the consensus five-factor structure of the PANSS as a robust measure of symptoms in schizophrenia. Future studies could explore the clinical and practical utility of the consensus five-factor model.

  5. Development and validation of a two-dimensional fast-response flood estimation model

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory]; Mcpherson, Timothy N [Los Alamos National Laboratory]; Burian, Steven J [UNIV OF UTAH]

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimations of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
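
    To make the governing equations concrete, here is a minimal first-order scheme for the 1-D shallow water equations in conservative form, using a local Lax-Friedrichs (upwind-type) interface flux on a dam-break test. It is a generic sketch under illustrative values, not the paper's two-dimensional formulation.

```python
import numpy as np

g = 9.81

def swe_flux(h, hu):
    """Physical flux of the 1-D shallow water equations."""
    u = hu / np.maximum(h, 1e-8)
    return np.array([hu, hu * u + 0.5 * g * h**2])

def step(h, hu, dx, dt):
    """One first-order step with a local Lax-Friedrichs interface flux."""
    U = np.array([h, hu])
    F = swe_flux(h, hu)
    a = np.abs(hu / np.maximum(h, 1e-8)) + np.sqrt(g * np.maximum(h, 1e-8))
    amax = np.maximum(a[:-1], a[1:])                   # interface wave speed
    Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * amax * (U[:, 1:] - U[:, :-1])
    U[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])
    return U[0], U[1]

# Dam-break test: 2 m of water upstream, 0.5 m downstream
nx, dx = 400, 0.5
h = np.where(np.arange(nx) < nx // 2, 2.0, 0.5)
hu = np.zeros(nx)
t = 0.0
while t < 20.0:
    dt = 0.4 * dx / np.max(np.abs(hu / np.maximum(h, 1e-8)) + np.sqrt(g * h))
    h, hu = step(h, hu, dx, dt)
    t += dt
print(f"max depth {h.max():.2f} m, max velocity {np.max(hu / h):.2f} m/s")
```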

  6. Stem volume Models and Validation for Cryptomeria japonica in Jeju Island, Korea

    Science.gov (United States)

    Seo, YeonOk; Jung, Sung Cheol; Lumbres, Roscinto Ian; Jeon, Chul Hyun; Kim, Chan Soo

    2016-04-01

    This study was carried out to fit different volume equations for Cryptomeria japonica trees in the Jeju Experimental Forests, Jeju Island, Korea. A total of 120 Cryptomeria japonica trees were measured and randomly split into two datasets: one for initial model development (80% of the data) and the other for model validation (20% of the data). The two datasets were then combined for the final model development. Coefficient of determination (R2), root mean square error (RMSE), mean difference (MD), and absolute mean difference (AMD) were used as evaluation statistics to evaluate the performance of the different models. Results showed that volume models with two independent variables (DBH and total height) performed better than models with only one (DBH). The result of model evaluation and validation showed that model 6 (V = aD^bH^c) was considered best based on the rank analysis among the candidate models. It is hoped that the results of this study will help forest managers to easily predict the total volume of Cryptomeria japonica, which is important in carbon stock assessment of the different Cryptomeria japonica forests in Jeju Island, Korea.
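
    Fitting and scoring model 6 can be sketched directly: estimate (a, b, c) in V = aD^bH^c by nonlinear least squares and report the four evaluation statistics named in the abstract. The tree data below are synthetic stand-ins for the 120-tree sample.

```python
import numpy as np
from scipy.optimize import curve_fit

def volume_model(X, a, b, c):
    """Model 6 of the study: V = a * D^b * H^c."""
    D, H = X
    return a * D**b * H**c

def fit_and_evaluate(D, H, V):
    (a, b, c), _ = curve_fit(volume_model, (D, H), V, p0=[1e-4, 2.0, 1.0])
    pred = volume_model((D, H), a, b, c)
    resid = V - pred
    return (a, b, c), {
        "R2": float(1.0 - np.sum(resid**2) / np.sum((V - V.mean())**2)),
        "RMSE": float(np.sqrt(np.mean(resid**2))),
        "MD": float(np.mean(resid)),              # mean difference
        "AMD": float(np.mean(np.abs(resid)))}     # absolute mean difference

# Synthetic stand-in for the 120-tree sample (DBH in cm, height in m)
rng = np.random.default_rng(6)
D = rng.uniform(10.0, 60.0, 120)
H = 1.3 + 25.0 * (1.0 - np.exp(-0.05 * D)) + rng.normal(0.0, 1.0, 120)
V = 6e-5 * D**1.9 * H * rng.lognormal(0.0, 0.05, 120)    # "true" volumes, m^3
params, metrics = fit_and_evaluate(D, H, V)
print(params, metrics)
```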

  7. Modeling Malaysia's Energy System: Some Preliminary Results

    OpenAIRE

    Ahmad M. Yusof

    2011-01-01

    Problem statement: The current dynamic and fragile world energy environment necessitates the development of a new energy model that solely caters to analyzing Malaysia's energy scenarios. Approach: The model is a network flow model that traces the flow of energy carriers from their sources (import and mining) through some conversion and transformation processes for the production of energy products to final destinations (energy demand sectors). The integration to the economic sectors is done exogene...

  8. Validation of buoyancy driven spectral tensor model using HATS data

    DEFF Research Database (Denmark)

    Chougule, A.; Mann, Jakob; Kelly, Mark C.

    2016-01-01

    We present a homogeneous spectral tensor model for wind velocity and temperature fluctuations, driven by mean vertical shear and mean temperature gradient. Results from the model, including one-dimensional velocity and temperature spectra and the associated co-spectra, are shown in this paper. Th...

  9. Engineering Glass Passivation Layers -Model Results

    Energy Technology Data Exchange (ETDEWEB)

    Skorski, Daniel C.; Ryan, Joseph V.; Strachan, Denis M.; Lepry, William C.

    2011-08-08

    The immobilization of radioactive waste into glass waste forms is a baseline process of nuclear waste management not only in the United States, but worldwide. The rate of radionuclide release from these glasses is a critical measure of the quality of the waste form. Over long-term tests and using extrapolations of ancient analogues, it has been shown that well-designed glasses exhibit a dissolution rate that quickly decreases to a slow residual rate for the lifetime of the glass. The mechanistic cause of this decreased corrosion rate is a subject of debate, with one of the major theories suggesting that the decrease is caused by the formation of corrosion products in such a manner as to present a diffusion barrier on the surface of the glass. Although there is much evidence of this type of mechanism, there has been no attempt to engineer the effect to maximize the passivating qualities of the corrosion products. This study represents the first attempt to engineer the creation of passivating phases on the surface of glasses. Our approach utilizes interactions between the dissolving glass and elements from the disposal environment to create impermeable capping layers. By drawing from other corrosion studies in areas where passivation layers have been successfully engineered to protect the bulk material, we present here a report on mineral phases that are likely to have a morphological tendency to encrust the surface of the glass. Our modeling has focused on using the AFCI glass system in a carbonate-, sulfate-, and phosphate-rich environment. We evaluate the minerals predicted to form to determine the likelihood of the formation of a protective layer on the surface of the glass. We have also modeled individual ions in solution vs. pH and the addition of aluminum and silicon. These results allow us to understand the pH and ion concentration dependence of mineral formation. We have determined that iron minerals are likely to form a complete incrustation layer and we plan

  10. Validation of SCS CN Method for Runoff Estimation with Field Observed Regression Analysis Results in Venna Basin, Central India.

    Science.gov (United States)

    Katpatal, Y. B.; Paranjpe, S. V.; Kadu, M.

    2014-12-01

    Effective watershed management requires authentic data on surface runoff potential, for which several methods and models are in use. Generally, non-availability of field data calls for techniques based on remote observations. The Soil Conservation Service Curve Number (SCS CN) method is an important method which utilizes information generated from remote sensing for estimation of runoff. Several attempts have been made to validate the runoff values generated from the SCS CN method by comparing the results obtained from other methods. In the present study, runoff estimation through the SCS CN method has been performed using IRS LISS IV data for the Venna Basin situated in Central India. Field data were available for the Venna Basin. The land use/land cover and soil layers have been generated for the entire watershed using the satellite data and a Geographic Information System (GIS). The Venna Basin has been divided into an intercepted catchment and a free catchment. Runoff values have been estimated using field data through regression analysis. The runoff values estimated using the SCS CN method have been compared with yield values generated using data collected from the tank gauge stations and data from the discharge stations. The correlation helps in validation of the results obtained from the SCS CN method and its applicability in Indian conditions. Key Words: SCS CN Method, Regression Analysis, Land Use / Land Cover, Runoff, Remote Sensing, GIS.
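
    For reference, the SCS CN runoff computation itself is a one-liner once the retention parameter is known; the sketch below uses the standard metric form of the method (the 0.2 initial-abstraction ratio and the example curve number are conventional assumptions, not values from this study).

```python
def scs_cn_runoff(P_mm, CN, ia_ratio=0.2):
    """SCS Curve Number runoff depth (metric form):
        S  = 25400/CN - 254              potential maximum retention (mm)
        Ia = ia_ratio * S                initial abstraction (mm)
        Q  = (P - Ia)^2 / (P - Ia + S)   for P > Ia, else 0
    """
    S = 25400.0 / CN - 254.0
    Ia = ia_ratio * S
    if P_mm <= Ia:
        return 0.0
    return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

# Example: 75 mm storm with CN = 82 (hypothetical curve number)
print(f"runoff depth: {scs_cn_runoff(75.0, 82):.1f} mm")   # ~34 mm
```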

  11. Results of the evaluation and preliminary validation of a primary LNG mass flow standard

    Science.gov (United States)

    van der Beek, Mijndert; Lucas, Peter; Kerkhof, Oswin; Mirzaei, Maria; Blom, Gerard

    2014-10-01

    LNG custody transfer measurements at large terminals have been based on ship tank level gauging for more than 50 years. Flow meter application has mainly been limited to process control in spite of the promise of simplified operations, potentially smaller uncertainties and better control over the measurements for buyers. The reason for this has been the lack of LNG flow calibration standards as well as written standards. In the framework of the EMRP 'Metrology for LNG' project, Van Swinden Laboratory (VSL) has developed a primary LNG mass flow standard. This standard is so far the only one in the world except for a liquid nitrogen flow standard at the National Institute of Standards and Technology (NIST). The VSL standard is based on weighing and holds a Calibration and Measurement Capability (CMC) of 0.12% to 0.15%. This paper discusses the measurement principle, results of the uncertainty validation with LNG, and the differences between water and LNG calibration results for four Coriolis mass flow meters. Most of the calibrated meters do not comply with their respective accuracy claims. Recommendations for further improvement of the measurement uncertainty will also be discussed.

  12. Assessing Leader Cognitive Skills with Situational Judgment Tests: Construct Validity Results

    Science.gov (United States)

    2010-09-01

    Only fragments of this abstract survive extraction: certain items were not included in subsequent analyses; in the exploratory factor analyses (EFAs), the factor solution initially could not be rotated, and when these items were omitted from the analyses, the EFA results suggested a 2-factor model.

  13. Validation of conducting wall models using magnetic measurements

    Science.gov (United States)

    Hanson, J. M.; Bialek, J.; Turco, F.; King, J.; Navratil, G. A.; Strait, E. J.; Turnbull, A.

    2016-10-01

    The impact of conducting wall eddy currents on perturbed magnetic field measurements is a key issue for understanding the measurement and control of long-wavelength MHD stability in tokamak devices. As plasma response models have grown in sophistication, the need to understand and resolve small changes in these measurements has become more important, motivating increased fidelity in simulations of externally applied fields and the wall eddy current response. In this manuscript, we describe thorough validation studies of the wall models in the mars-f and valen stability codes, using coil-sensor vacuum coupling measurements from the DIII-D tokamak (Luxon et al 2005 Fusion Sci. Technol. 48 807). The valen formulation treats conducting structures with arbitrary three-dimensional geometries, while mars-f uses an axisymmetric wall model and a spectral decomposition of the problem geometry with a fixed toroidal harmonic n. The vacuum coupling measurements have a strong sensitivity to wall eddy currents induced by time-changing coil currents, owing to the close proximity of both the sensors and coils to the wall. Measurements from individual coil and sensor channels are directly compared with valen predictions. It is found that straightforward improvements to the valen model, such as refining the wall mesh and simulating the vertical extent of the DIII-D poloidal field sensors, lead to good agreement with the experimental measurements. In addition, couplings to multi-coil, n = 1 toroidal mode perturbations are calculated from the measurements and compared with predictions from both codes. The toroidal mode comparisons favor the fully three-dimensional simulation approach, likely because this approach naturally treats n > 1 sidebands generated by the coils and wall eddy currents, as well as the n = 1 fundamental.
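
    A lumped-parameter sketch of the physics that codes like valen discretize in full 3D: the wall is reduced to a single R-L loop inductively coupled to a driven coil, and a pickup sensor links flux from both the coil and the induced wall current. The circuit values are illustrative, not DIII-D parameters.

```python
import numpy as np

# One-loop eddy-current model: L_w dI_w/dt + R_w I_w = -M_cw dI_c/dt.
# A sensor sees flux from both the coil and the wall current.
L_w, R_w = 5e-6, 1e-3          # wall self-inductance (H), resistance (ohm)
M_cw = 1e-6                    # coil-wall mutual inductance (H)
M_cs, M_ws = 2e-6, 1.5e-6      # coil-sensor, wall-sensor mutuals (H)

dt = 1e-5
t = np.arange(0.0, 0.05, dt)
i_coil = 100.0 * (1 - np.exp(-t / 5e-3))   # ramped coil current (A)

i_wall = np.zeros_like(t)
for k in range(1, len(t)):
    di_c = (i_coil[k] - i_coil[k - 1]) / dt
    # explicit Euler step of the induced wall eddy current
    i_wall[k] = i_wall[k - 1] + dt * (-M_cw * di_c - R_w * i_wall[k - 1]) / L_w

flux_sensor = M_cs * i_coil + M_ws * i_wall   # flux linked by the pickup
print(f"peak wall current: {i_wall.min():.2f} A")
```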

  14. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for the transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable to continuous-time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.
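
    A toy version of the envelope idea, assuming an AR(1) observable for concreteness: estimate the one-step transition density with a Nadaraya-Watson kernel smoother and check whether a parametric candidate stays inside a (here crudely fixed-width) band around it. The paper's simultaneous confidence envelope is constructed more carefully.

```python
import numpy as np

# Simulate an AR(1) series, estimate p(y | x) nonparametrically, and
# compare a parametric candidate density against a crude band.
rng = np.random.default_rng(0)
n, phi = 5000, 0.6
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

def kern(u):
    """Gaussian kernel."""
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def trans_density(x0, ygrid, h=0.3):
    """Nadaraya-Watson estimate of p(y | x = x0) from consecutive pairs."""
    wx = kern((x[:-1] - x0) / h) / h
    ky = kern((x[1:, None] - ygrid[None, :]) / h) / h
    return (wx[:, None] * ky).sum(axis=0) / wx.sum()

ygrid = np.linspace(-3.0, 3.0, 61)
p_hat = trans_density(1.0, ygrid)
p_par = kern(ygrid - phi * 1.0)   # parametric candidate: N(phi * x0, 1)
print("inside band:", bool(np.all(np.abs(p_par - p_hat) < 0.05)))
```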

  15. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for detection of genotoxic carcinogens: II. Summary of definitive validation study results.

    Science.gov (United States)

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Beevers, Carol; De Boeck, Marlies; Burlinson, Brian; Hobbs, Cheryl A; Kitamoto, Sachiko; Kraynak, Andrew R; McNamee, James; Nakagawa, Yuzuki; Pant, Kamala; Plappert-Helbig, Ulla; Priestley, Catherine; Takasawa, Hironao; Wada, Kunio; Wirnitzer, Uta; Asano, Norihide; Escobar, Patricia A; Lovell, David; Morita, Takeshi; Nakajima, Madoka; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this exercise was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The study protocol was optimized in the pre-validation studies, and then the definitive (4th phase) validation study was conducted in two steps. In the 1st step, assay reproducibility was confirmed among laboratories using four coded reference chemicals and the positive control ethyl methanesulfonate. In the 2nd step, the predictive capability was investigated using 40 coded chemicals with known genotoxic and carcinogenic activity (i.e., genotoxic carcinogens, genotoxic non-carcinogens, non-genotoxic carcinogens, and non-genotoxic non-carcinogens). Based on the results obtained, the in vivo comet assay is concluded to be highly capable of identifying genotoxic chemicals and therefore can serve as a reliable predictor of rodent carcinogenicity.

  16. Atmospheric Dispersion Model Validation in Low Wind Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Sawyer, Patrick

    2007-11-01

    Atmospheric plume dispersion models are used for a variety of purposes, including emergency planning and response to hazardous material releases, determining force protection actions in the event of a Weapons of Mass Destruction (WMD) attack, and locating sources of pollution. This study provides a review of previous studies that examine the accuracy of atmospheric plume dispersion models for chemical releases. It considers the principles used to derive air dispersion plume models and looks at three specific models currently in use: Aerial Location of Hazardous Atmospheres (ALOHA), Emergency Prediction Information Code (EPIcode) and Second Order Closure Integrated Puff (SCIPUFF). Results from this study indicate over-prediction bias by the EPIcode and SCIPUFF models and under-prediction bias by the ALOHA model. The experiment parameters were for near field dispersion (less than 100 meters) in low wind speed conditions (less than 2 meters per second).
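
    For orientation, a minimal Gaussian plume sketch of the kind that underlies screening models such as ALOHA and EPIcode: ground-level centerline concentration from a continuous elevated point source with ground reflection. The power-law dispersion coefficients are illustrative stand-ins, not the coefficients of any of the three codes.

```python
import math

# Ground-level centerline Gaussian plume with reflection:
# C(x,0,0) = Q / (pi * u * sig_y * sig_z) * exp(-h^2 / (2 sig_z^2)).
def plume_centerline(q_gs: float, u_ms: float, h_m: float, x_m: float) -> float:
    """C in g/m^3 for emission q (g/s), wind u (m/s), release height h (m)."""
    sig_y = 0.08 * x_m ** 0.9   # lateral spread (m), illustrative fit
    sig_z = 0.06 * x_m ** 0.8   # vertical spread (m), illustrative fit
    return (q_gs / (math.pi * u_ms * sig_y * sig_z)
            * math.exp(-h_m**2 / (2 * sig_z**2)))

# Near-field, low-wind case like the study above: 50 m downwind, u = 1.5 m/s.
print(f"C = {plume_centerline(10.0, 1.5, 2.0, 50.0):.3e} g/m^3")
```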

  18. Model validation lessons learned: A case study at Oak Ridge National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Ketelle, R.H.; Lee, R.R.; Bownds, J.M. [Oak Ridge National Lab., TN (United States); Rizk, T.A. [North Carolina State Univ., Raleigh, NC (United States)

    1989-11-01

    A groundwater flow and contaminant transport model validation study was performed to determine the applicability of typical groundwater flow models for performance assessment of proposed waste disposal facilities at Oak Ridge, Tennessee. Standard-practice site interpretation and groundwater modeling resulted in inaccurate predictions of contaminant transport at a proposed waste disposal site. The site's complex and heterogeneous geology, the presence of flow dominated by fractured and weathered zones, and the strongly transient character of shallow aquifer recharge and discharge combined to render assumptions of steady-state, homogeneous groundwater flow invalid. The study involved iterative phases of site field investigation and modeling. Subsequent modeling activities focused on generating a model grid that incorporated the observed site geologic heterogeneity, and on establishing and using model boundary conditions based on site data. Time-dependent water table configurations and fixed-head boundary conditions were used as input to the refined model in simulating groundwater flow at the site.
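
    A minimal sketch of the kind of refinement the study describes: a steady 1-D finite-difference groundwater flow model with fixed-head boundaries and harmonic-mean interface conductivities, so that a heterogeneous conductivity field is honored. The conductivity contrast is illustrative, not ORNL site data.

```python
import numpy as np

# Steady 1-D groundwater flow, d/dx(K dh/dx) = 0, with fixed heads at
# both ends and a low-conductivity lens mimicking site heterogeneity.
n, dx = 51, 10.0                 # nodes, spacing (m)
K = np.full(n, 1e-5)             # hydraulic conductivity (m/s)
K[20:30] = 1e-7                  # low-K lens (illustrative)

Kf = 2 * K[:-1] * K[1:] / (K[:-1] + K[1:])   # harmonic mean at interfaces
A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = A[-1, -1] = 1.0
b[0], b[-1] = 105.0, 100.0       # fixed heads (m)
for i in range(1, n - 1):
    A[i, i - 1], A[i, i + 1] = Kf[i - 1], Kf[i]
    A[i, i] = -(Kf[i - 1] + Kf[i])

h = np.linalg.solve(A, b)
print(f"head midway: {h[n // 2]:.2f} m")   # gradient concentrates in the lens
```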

  19. Soil process modelling in CZO research: gains in data harmonisation and model validation

    Science.gov (United States)

    van Gaans, Pauline; Andrianaki, Maria; Kobierska, Florian; Kram, Pavel; Lamacova, Anna; Lair, Georg; Nikolaidis, Nikos; Duffy, Chris; Regelink, Inge; van Leeuwen, Jeroen P.; de Ruiter, Peter

    2014-05-01

    Various soil process models were applied to four European Critical Zone Observatories (CZOs), the core research sites of the FP7 project SoilTrEC: the Damma glacier forefield (CH), a set of three forested catchments on geochemically contrasting bedrocks in the Slavkov Forest (CZ), a chronosequence of soils in the former floodplain of the Danube at Fuchsenbigl/Marchfeld (AT), and the Koiliaris catchments in the north-western part of Crete (GR). The aims of the modelling exercises were to apply and test soil process models with data from the CZOs for calibration/validation, to identify potential limits to the application scope of the models, to interpret soil state and soil functions at key stages of the soil life cycle, represented by the four SoilTrEC CZOs, and to contribute towards harmonisation of data and data acquisition. The models identified as specifically relevant were: the Penn State Integrated Hydrologic Model (PIHM), a fully coupled, multiprocess, multi-scale hydrologic model, used to gain a better understanding of water flow and pathways; the Soil and Water Assessment Tool (SWAT), a deterministic, continuous-time (daily time step) basin-scale model, used to evaluate the impact of soil management practices; the Rothamsted Carbon model (Roth-C), used to simulate organic carbon turnover, together with the Carbon, Aggregation, and Structure Turnover (CAST) model, which includes the role of soil aggregates in carbon dynamics; the Ligand Charge Distribution (LCD) model, used to understand the interaction between organic matter and oxide surfaces in soil aggregate formation; and the Terrestrial Ecology Model (TEM), used to obtain insight into the link between foodweb structure and carbon and nutrient turnover. With some exceptions, all models were applied to all four CZOs. The need for specific model input contributed largely to data harmonisation. The comparisons between the CZOs turned out to be of great value for understanding the strengths and limitations of the models, as well as the differences in soil conditions
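
    As a pointer to what the carbon models above do, a minimal sketch of Roth-C-style first-order pool turnover with a climate rate modifier. The pool rates, stocks, and inputs are generic illustrative values, not calibrated CZO parameters, and decayed carbon is simply respired here, whereas Roth-C partitions it among pools.

```python
import math

# First-order decay of carbon pools: dC/dt = I - k * m * C per pool,
# stepped monthly. All parameter values are illustrative.
k = {"DPM": 10.0, "RPM": 0.3, "BIO": 0.66, "HUM": 0.02}     # decay rates (1/yr)
stocks = {"DPM": 0.5, "RPM": 5.0, "BIO": 1.0, "HUM": 30.0}  # t C/ha
inputs = {"DPM": 1.2, "RPM": 1.8, "BIO": 0.0, "HUM": 0.0}   # t C/ha/yr
rate_mod = 0.8          # combined temperature/moisture rate modifier
dt = 1.0 / 12.0         # monthly time step (yr)

for month in range(12 * 10):                     # simulate 10 years
    for pool in stocks:
        decayed = stocks[pool] * (1.0 - math.exp(-k[pool] * rate_mod * dt))
        stocks[pool] += inputs[pool] * dt - decayed

print({p: round(c, 2) for p, c in stocks.items()})
```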

  20. Quantitative magnetospheric models: results and perspectives.

    Science.gov (United States)

    Kuznetsova, M.; Hesse, M.; Gombosi, T.; Csem Team

    Global magnetospheric models are an indispensable tool that allows multi-point measurements to be put into global context. Significant progress has been achieved in global MHD modeling of magnetosphere structure and dynamics. Medium-resolution simulations confirm the general topological picture suggested by Dungey. State-of-the-art global models with adaptive grids allow simulations with a highly resolved magnetopause and magnetotail current sheet. Advanced high-resolution models are capable of reproducing transient phenomena, such as FTEs associated with the formation of flux ropes or plasma bubbles embedded in the magnetopause, and demonstrate the generation of vortices at the magnetospheric flanks. On the other hand, there is still controversy about the global state of the magnetosphere predicted by MHD models, to the point of questioning the length of the magnetotail and the location of the reconnection sites within it. For example, for steady southward IMF driving conditions, resistive MHD simulations produce a steady configuration with an almost stationary near-Earth neutral line, while there is plenty of observational evidence of a periodic loading-unloading cycle during long periods of southward IMF. Successes and challenges in global modeling of magnetospheric dynamics will be addressed. One of the major challenges is to quantify the interaction between large-scale global magnetospheric dynamics and microphysical processes in diffusion regions near reconnection sites. Possible solutions to these controversies will be discussed.

  1. Evaluating the statistical conclusion validity of weighted mean results in meta-analysis by analysing funnel graph diagrams.

    Science.gov (United States)

    Elvik, R

    1998-03-01

    The validity of weighted mean results estimated in meta-analysis has been criticized. This paper presents a set of simple statistical and graphical techniques that can be used in meta-analysis to evaluate common points of criticism. The graphical techniques are based on funnel graph diagrams. Problems and techniques for dealing with them that are discussed include: (1) the so-called 'apples and oranges' problem, stating that mean results in meta-analysis tend to gloss over important differences that should be highlighted. A test of the homogeneity of results is described for testing the presence of this problem. If results are highly heterogeneous, a random effects model of meta-analysis is more appropriate than the fixed effects model of analysis. (2) The possible presence of skewness in a sample of results. This can be tested by comparing the mode, median and mean of the results in the sample. (3) The possible presence of more than one mode in a sample of results. This can be tested by forming a frequency distribution of the results and examining the shape of this distribution. (4) The sensitivity of the mean to the possible presence of atypical results (outliers) can be tested by comparing the overall mean to the mean of all results except the one suspected of being atypical. (5) The possible presence of publication bias can be tested by visual inspection of funnel graph diagrams in which data points have been sorted according to statistical significance and direction of effect. (6) The possibility of underestimating the standard error of the mean in meta-analyses by using multiple, correlated results from the same study as the unit of analysis can be addressed by using the jack-knife technique for estimating the uncertainty of the mean. Brief examples, taken from road safety research, are given of all these techniques.
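
    A short sketch of points (1) and (6) above, assuming illustrative effect sizes: the inverse-variance (fixed-effect) weighted mean, Cochran's Q homogeneity statistic, and a jackknife standard error of the mean.

```python
import numpy as np

# Fixed-effect meta-analysis pieces on made-up study results.
y = np.array([0.12, -0.05, 0.30, 0.22, 0.08])   # study effect estimates
v = np.array([0.01, 0.02, 0.015, 0.03, 0.012])  # their sampling variances

w = 1.0 / v                                      # inverse-variance weights
mean_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - mean_fixed) ** 2)            # ~ chi2(k-1) if homogeneous

# Jackknife: recompute the weighted mean leaving out one study at a time.
loo = np.array([np.sum(np.delete(w, i) * np.delete(y, i)) /
                np.sum(np.delete(w, i)) for i in range(len(y))])
se_jack = np.sqrt((len(y) - 1) / len(y) * np.sum((loo - loo.mean()) ** 2))

print(f"fixed-effect mean = {mean_fixed:.3f}, Q = {Q:.2f}, "
      f"jackknife SE = {se_jack:.3f}")
```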

  2. Prognostic models for locally advanced cervical cancer: external validation of the published models.

    Science.gov (United States)

    Lora, David; Gómez de la Cámara, Agustín; Fernández, Sara Pedraza; Enríquez de Salamanca, Rafael; Gómez, José Fermín Pérez Regadera

    2017-09-01

    To externally validate the prognostic models for predicting time-dependent outcomes in patients with locally advanced cervical cancer (LACC) treated with concurrent chemoradiotherapy in an independent cohort. A historical cohort of 297 women with LACC who were treated with radical concurrent chemoradiotherapy from 1999 to 2014 at the 12 de Octubre University Hospital (H12O), Madrid, Spain. The external validity of the prognostic models was quantified in terms of discrimination, calibration, measures of overall performance, and decision curve analyses. The review identified 8 studies containing 13 prognostic models. The derivation cohorts differed from the validation cohort (5-year overall survival [OS] = 70%; 5-year disease-free survival [DFS] = 64%; average age of 50; over 79% squamous cell) in characteristics such as International Federation of Gynecology and Obstetrics (FIGO) stage, parametrium involvement, hydronephrosis, location of positive nodes, and race, but remained related. The following models exhibited good external validity in terms of discrimination and calibration but limited clinical utility: the OS model at 3 years from Kidd et al.'s study (area under the receiver operating characteristic curve [AUROC] = 0.69; threshold of clinical utility [TCU] between 36% and 50%), the models of DFS at 1 year from Kidd et al.'s study (AUROC = 0.64; TCU between 24% and 32%) and at 2 years from Rose et al.'s study (AUROC = 0.70; TCU between 19% and 58%), and the distant recurrence model at 5 years from Kang et al.'s study (AUROC = 0.67; TCU between 12% and 36%). The external validation revealed the statistical and clinical usefulness of 4 prognostic models published in the literature.
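
    For readers unfamiliar with the discrimination measure reported above, the sketch below computes an AUROC from ranks (the Mann-Whitney formulation); the predicted risks and outcomes are made up for illustration only.

```python
import numpy as np

# Rank-based AUROC: probability a random event case is ranked above a
# random non-event case (ties ignored for brevity).
risk = np.array([0.9, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2, 0.1])  # predicted risks
event = np.array([1,   1,   0,   1,    0,   0,   1,   0])   # observed outcome

ranks = np.argsort(np.argsort(risk)) + 1       # ranks of the risk scores
n1, n0 = event.sum(), (1 - event).sum()
auroc = (ranks[event == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)
print(f"AUROC = {auroc:.2f}")                  # 0.75 for this toy data
```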

  3. Validity of "Hi_Science" as instructional media based-android refer to experiential learning model

    Science.gov (United States)

    Qamariah, Jumadi, Senam, Wilujeng, Insih

    2017-08-01

    Hi_Science is an Android-based instructional medium for learning science on the material of environmental pollution and global warming. This study aims: (a) to show the display of Hi_Science as it will be applied in junior high school, and (b) to describe the validity of Hi_Science. Hi_Science was created by combining an innovative learning model with current technological development. The selected medium is Android-based and is paired with the experiential learning model as an innovative learning model. Hi_Science adapted the student worksheet by Taufiq (2015). That worksheet was rated in the very good category by two expert lecturers and two science teachers (Taufik, 2015). The worksheet was refined and redeveloped in Android as an instructional medium that students can use to learn science not only in the classroom but also at home. Therefore, the worksheet, now an Android-based instructional medium, had to be validated again. Hi_Science has been validated by two experts. The validation is based on an assessment of material aspects and media aspects. The data collection was done with a media assessment instrument. The results showed that the assessment of material aspects obtained an average score of 4.72 with a percentage of agreement of 96.47%, which places Hi_Science in the excellent, or very valid, category on the material aspects. The assessment of media aspects obtained an average score of 4.53 with a percentage of agreement of 98.70%, which places Hi_Science in the excellent, or very valid, category on the media aspects. It was concluded that Hi_Science can be applied as an instructional medium in junior high school.

  4. Planck intermediate results I. Further validation of new Planck clusters with XMM-Newton

    DEFF Research Database (Denmark)

    Aghanim, N.; Collaboration, Planck; Arnaud, M.

    2012-01-01

    ... The sample was selected in order to test internal SZ quality flags, and the pertinence of these flags is discussed in light of the validation results. Ten of the candidates are found to be bona fide clusters lying below the RASS flux limit. Redshift estimates are available for all confirmed systems via X-ray Fe-line spectroscopy. They lie in the redshift range 0.19 < z < ... The X-ray properties of the new clusters appear to be similar to previous new detections by Planck at lower z and higher SZ flux: the majority are X-ray underluminous ... of candidates previously confirmed with XMM-Newton. The X-ray and optical redshifts for a total of 20 clusters are found to be in excellent agreement. We also show that useful lower limits can be put on cluster redshifts using X-ray data only via the use of the Y-X vs. Y-SZ and X-ray flux F-X vs. Y-SZ relations.

  5. Local Validation of Global Estimates of Biosphere Properties: Synthesis of Scaling Methods and Results Across Several Major Biomes

    Science.gov (United States)

    Cohen, Warren B.; Wessman, Carol A.; Aber, John D.; VanderCaslte, John R.; Running, Steven W.

    1998-01-01

    To assist in validating future MODIS land cover, LAI, IPAR, and NPP products, this project conducted a series of prototyping exercises that resulted in enhanced understanding of the issues regarding such validation. As a result, we have several papers to appear as a special issue of Remote Sensing of Environment in 1999. Also, we have been successful at obtaining a follow-on grant to pursue actual validation of these products over the next several years. This document consists of a delivery letter, including a listing of published papers.

  7. Validation of a Parcel-Based Reduced-Complexity Model for River Delta Formation (Invited)

    Science.gov (United States)

    Liang, M.; Geleynse, N.; Passalacqua, P.; Edmonds, D. A.; Kim, W.; Voller, V. R.; Paola, C.

    2013-12-01

    Reduced-Complexity Models (RCMs) take an intuitive yet quantitative approach to represent processes with the goal of getting maximum return in emergent system-scale behavior with minimum investment in computational complexity. This approach is in contrast to reductionist models that aim at rigorously solving the governing equations of fluid flow and sediment transport. RCMs have had encouraging successes in modeling a variety of geomorphic systems, such as braided rivers, alluvial fans, and river deltas. Despite the fact that these models are not intended to resolve detailed flow structures, questions remain on how to interpret and validate the output of RCMs beyond qualitative behavior-based descriptions. Here we present a validation of the newly developed RCM for river delta formation with channel dynamics (Liang, 2013). The model uses a parcel-based 'weighted-random-walk' method that resolves the formation of river deltas at the scale of channel dynamics (e.g., avulsions and bifurcations). The main focus of this validation work is the flow routing model component. A set of synthetic test cases were designed to compare hydrodynamic results from the RCM and Delft3D, including flow in a straight channel, around a bump, and flow partitioning at a single bifurcation. Output results, such as water surface slope and flow field, are also compared to field observations collected at Wax Lake Delta. Additionally, we investigate channel avulsion cycles and flow path selection in an alluvial fan with differential styles of subsidence and compare model results to laboratory experiments, as a preliminary effort in pairing up numerical and experimental models to understand channel organization at process scale. Strengths and weaknesses of the RCM are discussed and potential candidates for model application identified.
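
    A toy version of the parcel-based weighted random walk that such RCMs employ: a water parcel crosses a grid, choosing among seaward neighbors with probability proportional to a depth-based weight. The grid, depth field, and exponent are illustrative; the published model's routing weights are more elaborate, involving flow direction as well as depth.

```python
import numpy as np

# Route one water parcel across a depth field by weighted random walk:
# deeper neighbors are proportionally more likely to receive the parcel.
rng = np.random.default_rng(1)
depth = rng.uniform(0.5, 2.0, size=(20, 20))   # water depth field (m)
theta = 1.0                                     # depth-weighting exponent
steps = [(1, -1), (1, 0), (1, 1)]               # seaward-biased neighbors

def route_parcel(start_col: int) -> list:
    r, c = 0, start_col
    path = [(r, c)]
    while r < depth.shape[0] - 1:
        nbrs = [(r + dr, min(max(c + dc, 0), depth.shape[1] - 1))
                for dr, dc in steps]
        w = np.array([depth[i, j] ** theta for i, j in nbrs])
        r, c = nbrs[rng.choice(len(nbrs), p=w / w.sum())]
        path.append((r, c))
    return path

print(route_parcel(10)[:5])   # first few cells visited by the parcel
```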

  8. Validation of transport models using additive flux minimization technique

    Energy Technology Data Exchange (ETDEWEB)

    Pankin, A. Y.; Kruger, S. E. [Tech-X Corporation, 5621 Arapahoe Ave., Boulder, Colorado 80303 (United States); Groebner, R. J. [General Atomics, San Diego, California 92121 (United States); Hakim, A. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States); Kritz, A. H.; Rafiq, T. [Department of Physics, Lehigh University, Bethlehem, Pennsylvania 18015 (United States)

    2013-10-15

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.
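
    A schematic of the additive-flux idea on a trivially solvable stand-in problem: hold the model diffusivity fixed, add a free additional diffusivity, and optimize it so the predicted density profile matches the "experimental" one. The slab geometry and profiles are illustrative, not FACETS/DAKOTA output.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Steady 1-D slab: d/dx(D dn/dx) = -S, zero flux at x = 0, n(1) = n_edge,
# which has the closed form n(x) = n_edge + S (1 - x^2) / (2 D).
x = np.linspace(0.0, 1.0, 50)
S, n_edge = 1.0, 0.2               # uniform source, edge density

def profile(d_total: float) -> np.ndarray:
    return n_edge + S * (1.0 - x**2) / (2.0 * d_total)

n_exp = profile(1.5)               # synthetic "experimental" profile
D_model = 1.0                      # the physics model underpredicts transport

res = minimize_scalar(
    lambda d_add: np.sum((profile(D_model + d_add) - n_exp) ** 2),
    bounds=(0.0, 5.0), method="bounded")
print(f"required additional diffusivity: {res.x:.2f}")   # ~0.5 here
```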

  9. Active control strategy on a catenary-pantograph validated model

    Science.gov (United States)

    Sanchez-Rebollo, C.; Jimenez-Octavio, J. R.; Carnicero, A.

    2013-04-01

    Dynamic simulation methods have become essential in the design and control of the catenary-pantograph system, especially since high-speed trains and interoperability criteria are becoming increasingly prominent. This paper presents an original hardware-in-the-loop (HIL) strategy aimed at integrating a multicriteria active control within the catenary-pantograph dynamic interaction. The relevance of HIL control systems applied to the pantograph is undoubtedly increasing due to the recent and more demanding requirements for high-speed railway systems. Since loss of contact between the catenary and the pantograph leads to arcing and electrical wear, and excessive contact forces cause mechanical wear of both the catenary wires and the strips of the pantograph, not only regulatory but also economic and performance criteria confirm this relevance. Different configurations of the proportional-integral-derivative (PID) controller are proposed and applied to two different plant systems. Since this paper is mainly focused on the control strategy, both plant systems are simulation models, though the methodology is suitable for a laboratory bench. The control strategy involves a multicriteria optimisation of the contact force and of the energy consumed by the control force; a genetic algorithm has been applied for this purpose. Thus, the PID controller is fitted according to these conflicting objectives and tested within a nonlinear lumped model and a nonlinear finite element model, the latter validated against the European Standard EN 50318. Finally, several tests have been carried out to analyse the robustness of the control strategy. In particular, the relevance of the plant simulation, the running speed and the instrumentation time delay are studied in this paper.
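
    A minimal sketch of the two ingredients named above, with illustrative signals: an incremental PID evaluation on the contact-force error, and a multicriteria cost mixing contact-force variance with control energy, of the kind a genetic algorithm would trade off when fitting the gains.

```python
import numpy as np

# PID on the contact-force error plus a two-term fitness function.
def pid_step(e, e_prev, integ, kp, ki, kd, dt):
    """One PID evaluation on the contact-force error; returns (u, integ)."""
    integ += e * dt
    return kp * e + ki * integ + kd * (e - e_prev) / dt, integ

def multicriteria_cost(f_contact, u, dt, w_force=1.0, w_energy=1e-3):
    """J = w_force * Var(F_contact) + w_energy * integral(u^2) dt."""
    return w_force * np.var(f_contact) + w_energy * np.sum(u**2) * dt

dt = 1e-3
t = np.arange(0.0, 5.0, dt)
f = 120.0 + 15.0 * np.sin(2 * np.pi * 1.5 * t)   # fluctuating contact force (N)
u = 40.0 * np.cos(2 * np.pi * 1.5 * t)           # corresponding control force (N)
u1, integ = pid_step(e=5.0, e_prev=4.0, integ=0.0,
                     kp=40.0, ki=5.0, kd=0.8, dt=dt)
print(f"one PID output: {u1:.1f} N, cost J = {multicriteria_cost(f, u, dt):.2f}")
```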

  10. Numerical modeling and preliminary validation of drag-based vertical axis wind turbine

    Directory of Open Access Journals (Sweden)

    Krysiński Tomasz

    2015-03-01

    The main purpose of this article is to verify and validate the mathematical description of the airflow around a wind turbine with a vertical axis of rotation, which could be considered representative of this type of device. Mathematical m