WorldWideScience

Sample records for model validation project

  1. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI-driven CFD code (RPFM) and existing as well as new data sets to...

  2. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  3. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.

  4. Cross-validation analysis of bias models in Bayesian multi-model projections of climate

    Science.gov (United States)

    Huttunen, J. M. J.; Räisänen, J.; Nissinen, A.; Lipponen, A.; Kolehmainen, V.

    2017-03-01

    Climate change projections are commonly based on multi-model ensembles of climate simulations. In this paper we consider the choice of bias models in Bayesian multi-model predictions. Buser et al. (Clim Res 44(2-3):227-241, 2010a) introduced a hybrid bias model which combines the commonly used constant bias and constant relation bias assumptions. The hybrid model includes a weighting parameter which balances these bias models. In this study, we use a cross-validation approach to study which bias model or bias parameter leads to, in a specific sense, optimal climate change projections. The analysis is carried out for summer and winter season means of 2-m temperatures spatially averaged over the IPCC SREX regions, using 19 model runs from the CMIP5 data set. The cross-validation approach is applied to calculate the optimal bias parameters (in this specific sense) for projecting the temperature change from the control period (1961-2005) to the scenario period (2046-2090). The results are compared to those of the Buser et al. method, which includes the bias parameter as one of the unknown parameters to be estimated from the data.
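    As a rough sketch of the two assumptions the hybrid model balances (illustrative notation, not taken from the paper; the actual Buser et al. formulation is a Bayesian hierarchy that also models variances): with model means M_c and M_s for the control and scenario periods and observed control-period mean O_c,

    ```latex
    % Illustrative bias models; kappa is the weighting parameter of the hybrid.
    \begin{align*}
    \text{constant bias:}     \quad \hat{O}_s &= M_s - (M_c - O_c)\\
    \text{constant relation:} \quad \hat{O}_s &= M_s \cdot O_c / M_c\\
    \text{hybrid:}            \quad \hat{O}_s &= (1-\kappa)\bigl[M_s - (M_c - O_c)\bigr] + \kappa\, M_s \cdot O_c / M_c
    \end{align*}
    ```

    Cross-validation then amounts to holding out model runs and asking which value of κ yields projections that best predict them.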

  5. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess the input information, framework and code, and output results of the IMM; to ensure that end user requests and requirements were considered during all stages of model development and implementation; and to lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and to include 7009 tools in reporting the VV&C status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VV&C updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of
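    The contrast drawn above between deterministic and stochastic forecasting can be illustrated with a minimal Monte Carlo sketch; all rates, conditions and outcome probabilities below are invented, and the actual IMM condition models are far richer:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Invented per-person-year incidence rates for two medical conditions.
    incidence = {"renal_stone": 0.02, "dental_abscess": 0.01}
    # Invented probability that an occurrence forces an evacuation.
    evac_prob = {"renal_stone": 0.10, "dental_abscess": 0.02}

    crew, mission_years, n_trials = 4, 1.0, 100_000

    # Each trial draws Poisson event counts; each event may trigger an evacuation.
    evacuations = np.zeros(n_trials)
    for cond, rate in incidence.items():
        events = rng.poisson(rate * crew * mission_years, size=n_trials)
        evacuations += rng.binomial(events, evac_prob[cond])

    # A deterministic model would report only the mean; the distribution conveys risk.
    print(f"mean evacuations: {evacuations.mean():.4f}")
    print(f"P(at least one evacuation): {(evacuations > 0).mean():.4f}")
    print(f"95th percentile: {np.percentile(evacuations, 95):.0f}")
    ```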

  6. Snow Cover on the Arctic Sea Ice: Model Validation, Sensitivity, and 21st Century Projections

    Science.gov (United States)

    Blazey, Benjamin Andrew

    The role of snow cover in controlling Arctic Ocean sea ice thickness and extent is assessed with a series of models. Investigations with the stand-alone Community Ice CodE (CICE) show, first, that a reduction in snow depth triggers a decrease in ice volume and area, and, second, that the impact of increased snow is heavily dependent on ice and atmospheric conditions. Hindcast snow depths on the Arctic ice, simulated by the fully coupled Community Climate System Model (CCSM), are validated with 20th century in situ snow depth measurements. The snow depths in CCSM are found to be deeper than observed, likely due to excessive precipitation produced by the component atmosphere model. The sensitivity of the ice to the thermal barrier imposed by the biased snow depth is assessed. The removal of the thermodynamic impact of the exaggerated snow depth increases ice area and volume. The initial increases in ice due to enhanced conductive flux trigger feedback mechanisms with the atmosphere and ocean, reinforcing the increase in ice. Finally, the 21st century projections of decreased Arctic Ocean snow depth in CCSM are reported and diagnosed. The changes in snow are dominated by reduced accumulation due to the lack of autumn ice cover. Without this platform, much of the early snowfall is lost directly to the ocean. While this decrease in snow results in enhanced conductive flux through the ice, as in the validation sensitivity experiment, the decreased summer albedo is found to dominate, as in the CICE stand-alone sensitivity experiment. As such, the decrease in snow projected by CCSM in the 21st century presents a mechanism for continued ice loss. These negative (ice growth due to decreased insulation) and positive (ice melt due to decreased albedo) feedback mechanisms highlight the need for an accurate representation of snow cover on the ice in order to accurately simulate the evolution of Arctic Ocean sea ice.

  7. High Performance Computing Application: Solar Dynamo Model Project II, Corona and Heliosphere Component Initialization, Integration and Validation

    Science.gov (United States)

    2015-06-24

    AFRL-RD-PS-TR-2015-0028 — ...allocate solar heating into any location of the corona. Its total contribution depended on the integration of the unsigned magnetic flux at 1 Rs...

  8. Turbulent Scalar Transport Model Validation for High Speed Propulsive Flows Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort entails the validation of a RANS turbulent scalar transport model (SFM) for high speed propulsive flows, using new experimental data sets and...

  9. Validation and Comparison of Carbon Sequestration Project Cost Models with Project Cost Data Obtained from the Southwest Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Robert Lee; Reid Grigg; Brian McPherson

    2011-04-15

    Obtaining formal quotes and engineering conceptual designs for carbon dioxide (CO₂) sequestration sites and facilities is costly and time-consuming. Frequently, when looking at potential locations, managers, engineers and scientists are confronted with multiple options, but do not have the expertise or the information required to quickly obtain a general estimate of what the costs will be without employing an engineering firm. Several models for carbon compression, transport and/or injection have been published that are designed to aid in determining the cost of sequestration projects. A number of these models are used in this study, including models by J. Ogden, MIT's Carbon Capture and Sequestration Technologies Program Model, the Environmental Protection Agency and others. This report uses the information and data available from several projects either completed, in progress, or conceptualized by the Southwest Regional Partnership on Carbon Sequestration (SWP) to determine the best approach to estimating a project's cost. The data presented highlight calculated versus actual costs, comparing the actual cost of each individual project with the results obtained by applying the several models to it. The report also offers methods to systematically apply the models to future projects of a similar scale. Last, the cost risks associated with a project of this scope are discussed, along with ways that have been and could be used to mitigate these risks.

  10. Groundwater Model Validation for the Project Shoal Area, Corrective Action Unit 447

    Energy Technology Data Exchange (ETDEWEB)

    None

    2008-05-19

    Stoller has examined newly collected water level data in multiple wells at the Shoal site. On the basis of these data and information presented in the report, we are currently unable to confirm that the model is successfully validated. Most of our concerns regarding the model stem from two findings: (1) measured water level data do not provide clear evidence of a prevailing lateral flow direction; and (2) the groundwater flow system has been and continues to be in a transient state, which contrasts with assumed steady-state conditions in the model. The results of DRI's model validation efforts and observations made regarding water level behavior are discussed in the following sections. A summary of our conclusions and recommendations for a path forward are also provided in this letter report.

  11. Validation of a probabilistic model for hurricane insurance loss projections in Florida

    Energy Technology Data Exchange (ETDEWEB)

    Pinelli, J.-P. [Florida Tech, Melbourne, Florida (United States)], E-mail: pinelli@fit.edu; Gurley, K.R. [University of Florida, Gainesville, Florida (United States); Subramanian, C.S. [Florida Tech, Melbourne, Florida (United States); Hamid, S.S. [Florida International University, Miami, Florida (United States); Pita, G.L. [Florida Tech, Melbourne, Florida (United States)

    2008-12-15

    The Florida Public Hurricane Loss Model is one of the first public models accessible for scrutiny by the scientific community, incorporating state-of-the-art techniques in hurricane and vulnerability modeling. The model was developed for Florida, and is applicable to other hurricane-prone regions where construction practice is similar. The 2004 hurricane season produced substantial losses in Florida, and provided the means to validate and calibrate this model against actual claim data. This paper presents the predicted losses for several insurance portfolios corresponding to hurricanes Andrew, Charley, and Frances. The predictions are validated against the actual claim data. Physical damage predictions for external building components are also compared to observed damage. The analyses show that the predictive capabilities of the model were substantially improved after the calibration against the 2004 data. The methodology also shows that the predictive capabilities of the model could be enhanced if insurance companies reported more detailed information about the structures they insure and the types of damage they suffer. This model can be a powerful tool for the study of risk reduction strategies.

  12. Climate change in Central America and Mexico: regional climate model validation and climate change projections

    Science.gov (United States)

    Karmalkar, Ambarish V.; Bradley, Raymond S.; Diaz, Henry F.

    2011-08-01

    Central America has high biodiversity and harbors high-value ecosystems, and it is important to provide regional climate change information to assist adaptation and mitigation work in the region. Here we study climate change projections for Central America and Mexico using a regional climate model. The model evaluation shows its success in simulating spatial and temporal variability of temperature and precipitation and also in capturing regional climate features such as the bimodal annual cycle of precipitation and the Caribbean low-level jet. A variety of climate regimes within the model domain are also better identified in the regional model simulation due to improved resolution of topographic features. Although the model suffers from large precipitation biases, it shows improvements over the coarse-resolution driving model in simulating precipitation amounts. The model shows a dry bias in the wet season and a wet bias in the dry season, suggesting that it is unable to capture the full range of precipitation variability. Projected warming under the A2 scenario is higher in the wet season than in the dry season, with the Yucatan Peninsula experiencing the highest warming. A large reduction in precipitation in the wet season is projected for the region, whereas parts of Central America that receive a considerable amount of moisture in the form of orographic precipitation show significant decreases in precipitation in the dry season. Projected climatic changes can have detrimental impacts on biodiversity, as they are spatially similar to, but far greater in magnitude than, those observed during the El Niño events of recent decades that adversely affected species in the region.

  13. Climate change in Central America and Mexico: regional climate model validation and climate change projections

    Energy Technology Data Exchange (ETDEWEB)

    Karmalkar, Ambarish V. [University of Oxford, School of Geography and the Environment, Oxford (United Kingdom); Bradley, Raymond S. [University of Massachusetts, Department of Geosciences, Amherst, MA (United States); Diaz, Henry F. [NOAA/ESRL/CIRES, Boulder, CO (United States)

    2011-08-15

    Central America has high biodiversity and harbors high-value ecosystems, and it is important to provide regional climate change information to assist adaptation and mitigation work in the region. Here we study climate change projections for Central America and Mexico using a regional climate model. The model evaluation shows its success in simulating spatial and temporal variability of temperature and precipitation and also in capturing regional climate features such as the bimodal annual cycle of precipitation and the Caribbean low-level jet. A variety of climate regimes within the model domain are also better identified in the regional model simulation due to improved resolution of topographic features. Although the model suffers from large precipitation biases, it shows improvements over the coarse-resolution driving model in simulating precipitation amounts. The model shows a dry bias in the wet season and a wet bias in the dry season, suggesting that it is unable to capture the full range of precipitation variability. Projected warming under the A2 scenario is higher in the wet season than in the dry season, with the Yucatan Peninsula experiencing the highest warming. A large reduction in precipitation in the wet season is projected for the region, whereas parts of Central America that receive a considerable amount of moisture in the form of orographic precipitation show significant decreases in precipitation in the dry season. Projected climatic changes can have detrimental impacts on biodiversity, as they are spatially similar to, but far greater in magnitude than, those observed during the El Niño events of recent decades that adversely affected species in the region. (orig.)

  14. Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of this deliverable 2.5 is to use fresh experimental data for validation and selection of a flow model to be used for control design in WP3-4. Initially the idea was to investigate the models developed in WP2. However, in the project it was agreed to include and focus on an additive...... model turns out not to be useful for prediction of the flow. Moreover, standard Box-Jenkins model structures and multiple-output autoregressive models prove to be superior, as they can give useful predictions of the flow....
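    A minimal sketch of the kind of autoregressive flow predictor the deliverable favors, fitted to synthetic data (the deliverable's actual flow measurements and model orders are not reproduced here):

    ```python
    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(1)

    # Synthetic stand-in for a measured flow signal: slow oscillation plus noise.
    t = np.arange(500)
    flow = 10 + 2 * np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.3, t.size)

    train, test = flow[:400], flow[400:]
    model = AutoReg(train, lags=10).fit()        # fit an AR(10) predictor
    pred = model.predict(start=400, end=499)     # out-of-sample forecast

    rmse = np.sqrt(np.mean((pred - test) ** 2))
    print(f"forecast RMSE: {rmse:.3f}")
    ```

    The same fit/predict split gives a direct check of whether a candidate structure "can give useful predictions of the flow" before it is adopted for control design.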

  15. Research program to develop and validate conceptual models for flow and transport through unsaturated, fractured rock; Yucca Mountain Site Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    Glass, R.J.; Tidwell, V.C.

    1991-09-01

    As part of the Yucca Mountain Project, our research program to develop and validate conceptual models for flow and transport through unsaturated fractured rock integrates fundamental physical experimentation with conceptual model formulation and mathematical modeling. Our research is directed toward developing and validating macroscopic, continuum-based models and supporting effective property models because of their widespread utility within the context of this project. Success relative to the development and validation of effective property models is predicated on a firm understanding of the basic physics governing flow through fractured media, specifically in the areas of unsaturated flow and transport in a single fracture and fracture-matrix interaction.

  16. Introduction to the Monte Carlo project and the approach to the validation of probabilistic models of dietary exposure to selected food chemicals

    NARCIS (Netherlands)

    Gibney, M.J.; Voet, van der H.

    2003-01-01

    The Monte Carlo project was established to allow an international collaborative effort to define conceptual models for food chemical and nutrient exposure, to define and validate the software code to govern these models, to provide new or reconstructed databases for validation studies, and to use the new software code to complete validation modelling.

  17. The VATO project: Development and validation of a dynamic transfer model of tritium in grassland ecosystem.

    Science.gov (United States)

    Le Dizès, S; Aulagnier, C; Maro, D; Rozet, M; Vermorel, F; Hébert, D; Voiseux, C; Solier, L; Godinot, C; Fievet, B; Laguionie, P; Connan, O; Cazimajou, O; Morillon, M

    2017-05-01

    In this paper, a dynamic compartment model with a high temporal resolution has been investigated to describe tritium transfer in grassland ecosystems exposed to atmospheric ³H releases from nuclear facilities under normal operating or accidental conditions. The TOCATTA-χ model belongs to the larger framework of the SYMBIOSE modelling and simulation platform that aims to assess the fate and transport of a wide range of radionuclides in various environmental systems. In this context, the conceptual and mathematical models of TOCATTA-χ have been designed to be relatively simple, minimizing the number of compartments and input parameters required. At the same time, the model achieves a good compromise between ease of use (as it is to be used in an operational mode), explicative power and predictive accuracy in various experimental conditions. In the framework of the VATO project, the model has been tested against two-year-long in situ measurements of ³H activity concentration monitored by IRSN in air, groundwater and grass, together with meteorological parameters, on a grass field plot located 2 km downwind of the AREVA NC La Hague nuclear reprocessing plant, as was done in the past for the evaluation of transfer of ¹⁴C in grass. By considering fast exchanges at the vegetation-air canopy interface, the model correctly reproduces the observed variability in TFWT activity concentration in grass, which evolves in accordance with spikes in atmospheric HTO activity concentration over the previous 24 h. The average OBT activity concentration in grass is also correctly reproduced. However, the model has to be improved in order to reproduce isolated high OBT activity concentrations, as observed in December 2013. The introduction of another compartment with fast kinetics (like TFWT) - although outside the model scope - improves the predictions by increasing the correlation coefficient from 0.29 up to 0.56 when it includes this particular point. Further experimental
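    A two-compartment caricature of the fast/slow kinetics described above, with invented rate constants (this is not the TOCATTA-χ formulation): TFWT equilibrates with atmospheric HTO within hours, while OBT is formed slowly from TFWT and therefore integrates its history:

    ```python
    # Hypothetical rate constants (1/h); not the TOCATTA-chi parameter values.
    k_exchange = 0.5     # fast air <-> TFWT exchange
    k_fix      = 0.001   # slow TFWT -> OBT incorporation
    k_loss     = 0.0005  # slow OBT turnover

    dt, hours = 0.1, 24.0 * 30   # time step (h) and one simulated month
    tfwt = obt = 0.0

    for step in range(int(hours / dt)):
        t = step * dt
        # Atmospheric HTO (Bq/L): background plus a brief release spike.
        c_air = 1.0 + (50.0 if 200.0 <= t < 212.0 else 0.0)
        tfwt += dt * k_exchange * (c_air - tfwt)    # TFWT tracks air within hours
        obt  += dt * (k_fix * tfwt - k_loss * obt)  # OBT integrates TFWT history

    print(f"after one month: TFWT = {tfwt:.2f} Bq/L, OBT = {obt:.3f} Bq/L")
    ```

    The fast exchange term is what makes the modelled TFWT follow 24-hour HTO spikes, as reported above, while OBT responds only on much longer timescales.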

  18. Assessment of epidemic projections using recent HIV survey data in South Africa: a validation analysis of ten mathematical models of HIV epidemiology in the antiretroviral therapy era

    NARCIS (Netherlands)

    Eaton, J.W.; Bacaer, N.; Bershteyn, A.; Cambiano, V.; Cori, A.; Dorrington, R.E.; Fraser, C.; Gopalappa, C.; Hontelez, J.A.; Johnson, L.F.; Klein, D.J.; Phillips, A.N.; Pretorius, C.; Stover, J.; Rehle, T.M.; Hallett, T.B.

    2015-01-01

    BACKGROUND: Mathematical models are widely used to simulate the effects of interventions to control HIV and to project future epidemiological trends and resource needs. We aimed to validate past model projections against data from a large household survey done in South Africa in 2012. METHODS: We co

  1. Rapid Robot Design Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Energid Technologies will create a comprehensive software infrastructure for rapid validation of robot designs. The software will support push-button validation...

  2. Rapid Robot Design Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Energid Technologies will create a comprehensive software infrastructure for rapid validation of robotic designs. The software will support push-button validation...

  3. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest in computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety of models with regard to their purpose, character, field of application and time dimension inherently calls for a similar diversity in validation approaches. A classification of models in terms of the mentioned elements is presented and used to shed light on possible types of validation. The prevailing notion of validation of models has been somewhat narrow-minded, reducing validation to the establishment of truth. This article puts forward the diversity in applications of simulation models, which demands a corresponding diversity in the notion of validation.

  4. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  5. Introduction to the Monte Carlo project and the approach to the validation of probabilistic models of dietary exposure to selected food chemicals.

    Science.gov (United States)

    Gibney, M J; van der Voet, H

    2003-10-01

    The Monte Carlo project was established to allow an international collaborative effort to define conceptual models for food chemical and nutrient exposure, to define and validate the software code to govern these models, to provide new or reconstructed databases for validation studies, and to use the new software code to complete validation modelling. Models were considered valid when they provided exposure estimates (e(a)) that could be shown not to underestimate the true exposure (e(b)), but at the same time are more realistic than the currently used conservative estimates (e(c)). Thus, validation required e(b) ≤ e(a) ≤ e(c). In the case of pesticides, validation involved the collection of duplicate diets from 500 infants for pesticide analysis. In the case of intense sweeteners, a new consumption dataset was created among prescreened high consumers of intense sweeteners by recording, at brand level, all foods and beverages ingested over 12 days. In the case of nutrients and additives, existing databases were modified to minimize uncertainty over the model parameters. In most instances, it was possible to generate probabilistic models that fulfilled the validation criteria.
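    A toy illustration of the validation criterion e(b) ≤ e(a) ≤ e(c), with an entirely invented exposure model standing in for both the "true" (duplicate-diet) exposure and the probabilistic estimate:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000

    # Invented "true" exposure: servings/day times concentration per serving (mg).
    true_exposure = rng.poisson(2.0, n) * rng.lognormal(0.0, 0.5, n)
    e_b = np.percentile(true_exposure, 97.5)

    # Probabilistic model estimate: same structure, mildly conservative inputs.
    model_exposure = rng.poisson(2.2, n) * rng.lognormal(0.1, 0.5, n)
    e_a = np.percentile(model_exposure, 97.5)

    # Invented worst-case point estimate of the kind the criterion should beat.
    e_c = 2.0 * true_exposure.max()

    print(f"e_b={e_b:.2f} <= e_a={e_a:.2f} <= e_c={e_c:.2f}:",
          e_b <= e_a <= e_c)
    ```

    The probabilistic estimate passes if it bounds the true high-percentile exposure from above while remaining well below the conservative screening value.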

  6. Characterization of materials for a reactive transport model validation experiment: Interim report on the caisson experiment. Yucca Mountain Site Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    Siegel, M.D.; Cheng, W.C. [Sandia National Labs., Albuquerque, NM (United States); Ward, D.B.; Bryan, C.R. [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Earth and Planetary Sciences

    1995-08-01

    Models used in performance assessment and site characterization activities related to nuclear waste disposal rely on simplified representations of solute/rock interactions, the hydrologic flow field, and the material properties of the rock layers surrounding the repository. A crucial element in the design of these models is the validity of these simplifying assumptions. An intermediate-scale experiment is being carried out at the Experimental Engineered Test Facility at Los Alamos National Laboratory by the Los Alamos and Sandia National Laboratories to develop a strategy to validate key geochemical and hydrological assumptions in performance assessment models used by the Yucca Mountain Site Characterization Project.

  7. Models projecting the fate of fish populations under climate change need to be based on valid physiological mechanisms.

    Science.gov (United States)

    Lefevre, Sjannie; McKenzie, David J; Nilsson, Göran E

    2017-09-01

    Some recent modelling papers projecting smaller fish sizes and catches in a warmer future are based on erroneous assumptions regarding (i) the scaling of gills with body mass and (ii) the energetic cost of 'maintenance'. Assumption (i) posits that insurmountable geometric constraints prevent respiratory surface areas from growing as fast as body volume. It is argued that these constraints explain allometric scaling of energy metabolism, whereby larger fishes have relatively lower mass-specific metabolic rates. Assumption (ii) concludes that when fishes reach a certain size, basal oxygen demands will not be met, because of assumption (i). We here demonstrate unequivocally, by applying accepted physiological principles with reference to the existing literature, that these assumptions are not valid. Gills are folded surfaces, where the scaling of surface area to volume is not constrained by spherical geometry. The gill surface area can, in fact, increase linearly in proportion to gill volume and body mass. We cite the large body of evidence demonstrating that respiratory surface areas in fishes reflect metabolic needs, not vice versa, which explains the large interspecific variation in scaling of gill surface areas. Finally, we point out that future studies basing their predictions on models should incorporate factors for scaling of metabolic rate and for temperature effects on metabolism, which agree with measured values, and should account for interspecific variation in scaling and temperature effects. It is possible that some fishes will become smaller in the future, but to make reliable predictions the underlying mechanisms need to be identified and sought elsewhere than in geometric constraints on gill surface area. Furthermore, to ensure that useful information is conveyed to the public and policymakers about the possible effects of climate change, it is necessary to improve communication and congruity between fish physiologists and fisheries scientists.

  8. Validation, Proof-of-Concept, and Postaudit of the Groundwater Flow and Transport Model of the Project Shoal Area

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan

    2004-09-01

    The groundwater flow and radionuclide transport model characterizing the Shoal underground nuclear test has been accepted by the State of Nevada Division of Environmental Protection. According to the Federal Facility Agreement and Consent Order (FFACO) between DOE and the State of Nevada, the next steps in the closure process for the site are then model validation (or postaudit), the proof-of-concept, and the long-term monitoring stage. This report addresses the development of the validation strategy for the Shoal model, needed for preparing the subsurface Corrective Action Decision Document-Corrective Action Plan and the development of the proof-of-concept tools needed during the five-year monitoring/validation period. The approach builds on a previous model, but is adapted and modified to the site-specific conditions and challenges of the Shoal site.

  9. Results of the AVATAR project for the validation of 2D aerodynamic models with experimental data of the DU95W180 airfoil with unsteady flap

    Science.gov (United States)

    Ferreira, C.; Gonzalez, A.; Baldacchino, D.; Aparicio, M.; Gómez, S.; Munduate, X.; Garcia, N. R.; Sørensen, J. N.; Jost, E.; Knecht, S.; Lutz, T.; Chassapogiannis, P.; Diakakis, K.; Papadakis, G.; Voutsinas, S.; Prospathopoulos, J.; Gillebaart, T.; van Zuijlen, A.

    2016-09-01

    The FP7 AdVanced Aerodynamic Tools for lArge Rotors - Avatar project aims to develop and validate advanced aerodynamic models, to be used in integral design codes for the next generation of large-scale wind turbines (10-20 MW). One of the approaches towards reaching rotors of 10-20 MW size is the application of flow control devices, such as flaps. In Task 3.2: Development of aerodynamic codes for modelling of flow devices on aerofoils and rotors of the Avatar project, aerodynamic codes are benchmarked and validated against the experimental data of a DU95W180 airfoil in steady and unsteady flow, for different angles of attack and flap settings, including unsteady oscillatory trailing-edge-flap motion, carried out within the framework of WP3: Models for Flow Devices and Flow Control, Task 3.1: CFD and Experimental Database. The aerodynamics codes are: AdaptFoil2D, Foil2W, FLOWer, MaPFlow, OpenFOAM, Q3UIC, ATEFlap. The codes include unsteady Eulerian CFD simulations with grid deformation, panel models and indicial engineering models. The validation cases correspond to 18 steady flow cases and 42 unsteady flow cases, for varying angle of attack, flap deflection and reduced frequency, with free and forced transition. The validation of the models shows varying degrees of agreement, varying between models and flow cases.

  10. Regional Climate Downscaling Of African Climate Using A High-Resolution Global Atmospheric Model: Validation And Future Projection

    Science.gov (United States)

    Raj, J.; Stenchikov, G. L.; Bangalath, H.

    2013-12-01

    Climate change impact assessment and adaptation planning require region-specific information with high spatial resolution, since climate and weather effects are directly felt at the local scale. While most state-of-the-art general circulation models lack adequate spatial resolution, regional climate models (RCMs) used in a nested domain are generally incapable of incorporating the two-way exchanges between regional and global climate. In this study we use a very high resolution atmospheric general circulation model, HiRAM, developed at NOAA GFDL, to investigate the regional climate changes over the CORDEX African domain. The HiRAM simulations are performed with a horizontal grid spacing of 25 km, which is an ample resolution for regional climate simulation. HiRAM has the advantage of naturally describing the interaction between regional and global climate. Historic (1975-2004) simulations and future (2007-2050) projections, with both RCP 4.5 and RCP 8.5 pathways, are conducted in line with the CORDEX protocol. A coarse-resolution sea surface temperature (SST) from the GFDL Earth System Model runs of IPCC AR5 is prescribed as the bottom boundary condition over the ocean. The GFDL Land Surface Model (LM3) is employed to calculate physical processes at the surface and in the soil. Preliminary analysis of the performance of HiRAM, using the historic runs, shows that it reproduces the regional climate adequately well in comparison with observations. Significant improvement in the simulation of regional climate is evident in comparison with the coarse-resolution driving model. Future projections predict an increase in atmospheric temperature over Africa, with stronger warming in the subtropics than in the tropics. A significant strengthening of the West African Monsoon and a southward shift of the summer rainfall maxima over Africa are predicted in both the RCP 4.5 and RCP 8.5 scenarios.

  11. Results of the AVATAR project for the validation of 2D aerodynamic models with experimental data of the DU95W180 airfoil with unsteady flap

    DEFF Research Database (Denmark)

    Ferreira, C.; Gonzalez, A.; Baldacchino, D.;

    2016-01-01

    The FP7 AdVanced Aerodynamic Tools for lArge Rotors - Avatar project aims to develop and validate advanced aerodynamic models, to be used in integral design codes for the next generation of large scale wind turbines (10-20MW). One of the approaches towards reaching rotors for 10-20MW size...... is the application of flow control devices, such as flaps. In Task 3.2: Development of aerodynamic codes for modelling of flow devices on aerofoils and rotors of the Avatar project, aerodynamic codes are benchmarked and validated against the experimental data of a DU95W180 airfoil in steady and unsteady flow......, for different angles of attack and flap settings, including unsteady oscillatory trailing-edge-flap motion, carried out within the framework of WP3: Models for Flow Devices and Flow Control, Task 3.1: CFD and Experimental Database. The aerodynamics codes are: AdaptFoil2D, Foil2W, FLOWer, MaPFlow, OpenFOAM, Q3UIC...

  12. Design and validation of a specialized training model for tissue bank personnel as a result of the European Quality System for Tissue Banking (EQSTB) project.

    Science.gov (United States)

    Kaminski, A; Uhrynowska-Tyszkiewicz, I; Miranda, B; Navarro, A; Manyalich, M

    2007-11-01

    The main objective of the European Quality System for Tissue Banking (EQSTB) project was to analyze, across different working areas, the factors that may influence the final tissue quality and safety for transplantation, providing greater benefit to recipients. Fifteen national organizations and tissue establishments from 12 European countries took part in this project. The Sanco-EQSTB project was organized in four Working Groups, each with objectives focused on a specific area. The Standards Working Group analyzed the different standards or guides used in various European tissue banks as a quality and safety system. The Registry Working Group created a Tissue Registry through a multinational European network database. The Education Working Group created a specialized training model for tissue bank personnel. The Audit Working Group created a European model of auditing for tissue establishments. The aim of this article is to describe the activities of Working Group 3 in designing and validating a specialized training model for tissue bank personnel that could become the approved education system recommended by European Union members.

  13. Nordic Seas Precipitation Ground Validation Project

    Science.gov (United States)

    Klepp, Christian; Bumke, Karl; Bakan, Stephan; Andersson, Axel

    2010-05-01

    A thorough knowledge of global ocean precipitation is an indispensable prerequisite for the understanding of the water cycle in the global climate system. However, reliable detection of precipitation over the global oceans, especially of solid precipitation, remains a challenging task. This is true both for passive microwave remote sensing and for reanalysis-based model estimates. The satellite-based HOAPS (Hamburg Ocean Atmosphere Parameters and Fluxes from Satellite Data) climatology contains fields of precipitation, evaporation and the resulting freshwater flux, along with 12 additional atmospheric parameters, over the global ice-free ocean between 1987 and 2005. Except for the NOAA Pathfinder SST, all basic state variables are calculated from SSM/I passive microwave radiometer measurements. HOAPS contains three main data subsets that originate from one common pixel-level data source. Gridded 0.5 degree monthly, pentad and twice-daily data products are freely available from www.hoaps.org. The optical disdrometer ODM 470 is a ground validation instrument capable of measuring rain and snowfall on ships even under high wind speeds. It was used for the first time over the Nordic Seas during the LOFZY 2005 campaign. A dichotomous verification for these snowfall events resulted in a perfect score between the disdrometer, a precipitation detector and a shipboard observer's log. The disdrometer data are further point-to-area collocated against precipitation from three satellite-derived climatologies: HOAPS-3, the Global Precipitation Climatology Project (GPCP) one degree daily (1DD) data set, and the Goddard Profiling algorithm, version 2004 (GPROF 2004). Only the HOAPS precipitation turns out to be overall consistent with the disdrometer data, resulting in an accuracy of 0.96. The collocated data comprise light precipitation events below 1 mm/h. Therefore two LOFZY case studies with high precipitation rates are presented that still indicate plausible results. Overall, this
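    For reference, the accuracy quoted above is the standard dichotomous-verification score: the fraction of collocated cases on which the two data sources agree. A minimal sketch with invented contingency-table counts, chosen so the score comes out at the quoted 0.96:

    ```python
    # 2x2 contingency table for precipitation detection (invented counts):
    hits, misses, false_alarms, correct_negatives = 120, 3, 5, 72

    total = hits + misses + false_alarms + correct_negatives
    accuracy = (hits + correct_negatives) / total   # fraction of agreements
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias, for context

    print(f"accuracy = {accuracy:.2f}, frequency bias = {bias:.2f}")
    ```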

  14. Model validation, science and application

    NARCIS (Netherlands)

    Builtjes, P.J.H.; Flossmann, A.

    1998-01-01

    In recent years there has been growing interest in trying to establish a proper validation of atmospheric chemistry-transport (ATC) models. Model validation deals with the comparison of model results with experimental data, and in this way addresses both model uncertainty and uncertainty in, and adequac

  15. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because the multiplicity of experimental protocols employed in different laboratories presumably precludes the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis of experimental neurobiology.

  16. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... Metro became operational in autumn 2002. We observed that forecasts from the demand sub-models agree well with the data from the 2000 national travel survey, with the mode choice forecasts in particular being a good match with the observed modal split. The results of the 2000 car assignment model...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  17. IV&V Project Assessment Process Validation

    Science.gov (United States)

    Driskell, Stephen

    2012-01-01

    The Space Launch System (SLS) will launch NASA's Multi-Purpose Crew Vehicle (MPCV). This launch vehicle will provide American launch capability for human exploration and travel beyond Earth orbit. SLS is designed to be flexible for crew or cargo missions. The first test flight is scheduled for December 2017. The SLS SRR/SDR provided insight into the project development life cycle. NASA IV&V ran the standard Risk Based Assessment and Portfolio Based Risk Assessment to identify analysis tasking for the SLS program. This presentation examines the SLS System Requirements Review/System Definition Review (SRR/SDR) and IV&V findings, validating the IV&V process by correlating findings to and from the selected IV&V tasking and capabilities. It also provides a reusable IEEE 1012 scorecard for programmatic completeness across the software development life cycle.

  18. Enrollment Projection Model.

    Science.gov (United States)

    Gustafson, B. Kerry; Hample, Stephen R.

    General documentation for the Enrollment Projection Model used by the Maryland Council for Higher Education (MCHE) is provided. The manual is directed toward both the potential users of the model as well as others interested in enrollment projections. The first four chapters offer administrators or planners insight into the derivation of the…

  19. Testing and validating environmental models

    Science.gov (United States)

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series
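    The call for explicit performance criteria and explicit benchmarks can be made concrete with a small sketch: score the model with a stated statistic against a stated alternative, rather than claiming "acceptable" agreement. Both the criterion (Nash-Sutcliffe efficiency) and the baseline (the observed mean) are illustrative choices, not the paper's:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic observed series and two candidate predictions.
    obs = 5 + np.sin(np.linspace(0, 8 * np.pi, 200)) + rng.normal(0, 0.2, 200)
    model_pred = 5 + np.sin(np.linspace(0, 8 * np.pi, 200))   # process model
    benchmark = np.full_like(obs, obs.mean())                 # trivial baseline

    def nash_sutcliffe(pred, obs):
        """Explicit criterion: 1 is perfect; 0 means no better than the mean."""
        return 1 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

    print(f"model NSE:     {nash_sutcliffe(model_pred, obs):.3f}")
    print(f"benchmark NSE: {nash_sutcliffe(benchmark, obs):.3f}")
    ```

    Publishing both numbers makes the comparison against the available alternatives explicit, which is exactly the disclosure the abstract argues for.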

  20. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta

    2014-12-01

    The primary objective of the study was to quantitatively test the DART model, which, despite being one of the most popular representations of the co-creation concept, had so far been studied almost solely with qualitative methods. To this end, the researchers developed a multiple-measurement scale and employed it in interviewing managers. The statistical evidence for the adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much conceptual overlap to be an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model where co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  1. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  2. Project Evaluation: Validation of a Scale and Analysis of Its Predictive Capacity

    Science.gov (United States)

    Fernandes Malaquias, Rodrigo; de Oliveira Malaquias, Fernanda Francielle

    2014-01-01

    The objective of this study was to validate a scale for assessment of academic projects. As a complement, we examined its predictive ability by comparing the scores of advised/corrected projects based on the model and the final scores awarded to the work by an examining panel (approximately 10 months after the project design). Results of…

  3. Project on Elite Athlete Commitment (PEAK): III. An examination of the external validity across gender, and the expansion and clarification of the Sport Commitment Model.

    Science.gov (United States)

    Scanlan, Tara K; Russell, David G; Magyar, T Michelle; Scanlan, Larry A

    2009-12-01

    The Sport Commitment Model was further tested using the Scanlan Collaborative Interview Method to examine its generalizability to New Zealand's elite female amateur netball team, the Silver Ferns. Results supported or clarified Sport Commitment Model predictions, revealed avenues for model expansion, and elucidated the functions of perceived competence and enjoyment in the commitment process. A comparison and contrast of the in-depth interview data from the Silver Ferns with previous interview data from a comparable elite team of amateur male athletes allowed assessment of model external validity, tested the generalizability of the underlying mechanisms, and separated gender differences from discrepancies that simply reflected team or idiosyncratic differences.

  4. Experimental validation of a Monte Carlo-based kV x-ray projection model for the Varian linac-mounted cone-beam CT imaging system

    Science.gov (United States)

    Lazos, Dimitrios; Pokhrel, Damodar; Su, Zhong; Lu, Jun; Williamson, Jeffrey F.

    2008-03-01

    Fast and accurate modeling of cone-beam CT (CBCT) x-ray projection data can improve CBCT image quality either directly, by linearizing projection data for each patient prior to image reconstruction (thereby mitigating detector blur/lag, spectral hardening, and scatter artifacts), or indirectly, by supporting rigorous comparative simulation studies of competing image reconstruction and processing algorithms. In this study, we compare Monte Carlo-computed x-ray projections with projections experimentally acquired from our Varian Trilogy CBCT imaging system for phantoms of known design. Our recently developed Monte Carlo photon-transport code, PTRAN, was used to compute primary and scatter projections for a cylindrical phantom of known diameter (NA model 76-410) with and without bow-tie filter and antiscatter grid, for both full- and half-fan geometries. These simulations were based upon measured 120 kVp spectra, beam profiles, and the flat-panel detector (4030CB) point-spread function. Compound Poisson-process noise was simulated based upon measured beam output. Computed projections were compared to flat- and dark-field corrected 4030CB images, where scatter profiles were estimated by subtracting narrow axial-width from full axial-width 4030CB profiles. In agreement with the literature, the difference between simulated and measured projection data is of the order of 6-8%. The measurement of the scatter profiles is affected by the long tails of the detector PSF. Higher accuracy can be achieved mainly by improving the beam modeling and correcting the nonlinearities induced by the detector PSF.

  5. Procedure Verification and Validation Toolset Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research is aimed at investigating a procedure verification and validation toolset, which will allow the engineers who are responsible for developing...

  6. Validation of Autonomous Space Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — System validation addresses the question “Will the system do the right thing?” When system capability includes autonomy, or more specifically, onboard...

  7. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels and for temperature gradients with respect to space in the flow direction, which are investigated by direct infrared imaging, showing that such gradients are present in fuel cell operation even at low current, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited by the high pressure drops they induce in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones, with good qualitative and quantitative agreement. The combination of accuracy and low computational time allows for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)

  8. The Earthquake‐Source Inversion Validation (SIV) Project

    KAUST Repository

    Mai, Paul Martin

    2016-04-27

    Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward-modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source-model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake-source imaging problem.

  9. The Earthquake‐Source Inversion Validation (SIV) Project

    Science.gov (United States)

    Mai, P. Martin; Schorlemmer, Danijel; Page, Morgan T.; Ampuero, Jean-Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby N. T.; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran K. S.; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish C.; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf

    2016-01-01

    Finite‐fault earthquake source inversions infer the (time‐dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake‐source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward‐modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source‐model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake‐source imaging problem.

  10. Validation for a recirculation model.

    Science.gov (United States)

    LaPuma, P T

    2001-04-01

    Recent Clean Air Act regulations designed to reduce volatile organic compound (VOC) emissions have placed new restrictions on painting operations. Treating large volumes of air which contain dilute quantities of VOCs can be expensive. Recirculating some fraction of the air allows an operator to comply with environmental regulations at reduced cost. However, there is a potential impact on employee safety because indoor pollutants will inevitably increase when air is recirculated. A computer model was developed, written in Microsoft Excel 97, to predict compliance costs and indoor air concentration changes with respect to changes in the level of recirculation for a given facility. The model predicts indoor air concentrations based on product usage and mass balance equations. This article validates the recirculation model using data collected from a C-130 aircraft painting facility at Hill Air Force Base, Utah. Air sampling data and air control cost quotes from vendors were collected for the Hill AFB painting facility and compared to the model's predictions. The model's predictions for strontium chromate and isocyanate air concentrations were generally between the maximum and minimum air sampling points with a tendency to predict near the maximum sampling points. The model's capital cost predictions for a thermal VOC control device ranged from a 14 percent underestimate to a 50 percent overestimate of the average cost quotes. A sensitivity analysis of the variables is also included. The model is demonstrated to be a good evaluation tool in understanding the impact of recirculation.
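
    The mass-balance core of such a recirculation model can be illustrated with a single well-mixed zone: if recirculated air returns untreated, only the fresh-air fraction dilutes the contaminant, so the steady-state concentration scales as 1/(1 - R). This is a generic sketch with invented numbers, not the Excel model or Hill AFB data.

        def steady_state_concentration(g_mg_per_min, q_m3_per_min, recirc_fraction):
            """C = G / (Q * (1 - R)) for a well-mixed zone whose recirculated
            air stream returns the contaminant untreated."""
            fresh_air_flow = q_m3_per_min * (1.0 - recirc_fraction)
            return g_mg_per_min / fresh_air_flow      # mg/m^3

        for r in (0.0, 0.5, 0.8):
            c = steady_state_concentration(g_mg_per_min=200.0, q_m3_per_min=1000.0,
                                           recirc_fraction=r)
            print(f"R = {r:.0%}: C = {c:.2f} mg/m^3")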

  11. Validation of Magnetospheric Magnetohydrodynamic Models

    Science.gov (United States)

    Curtis, Brian

    Magnetospheric magnetohydrodynamic (MHD) models are commonly used for both prediction and modeling of Earth's magnetosphere. To date, very little validation has been performed to determine their limits, uncertainties, and differences. In this work, we performed a comprehensive analysis, applying several validation techniques commonly used in the atmospheric sciences to MHD-based models of Earth's magnetosphere for the first time. The validation techniques of parameter variability/sensitivity analysis and comparison to other models were used on the OpenGGCM, BATS-R-US, and SWMF magnetospheric MHD models to answer several questions about how these models compare. The questions include: (1) the difference between the models' predictions prior to and following a reversal of Bz in the upstream interplanetary magnetic field (IMF) from positive to negative, (2) the influence of the preconditioning duration, and (3) the differences between models under extreme solar wind conditions. A differencing visualization tool was developed and used to address these three questions. We find: (1) For a reversal in IMF Bz from positive to negative, the OpenGGCM magnetopause is closest to Earth as it has the weakest magnetic pressure near Earth. The differences in magnetopause positions between BATS-R-US and SWMF are explained by the influence of the ring current, which is included in SWMF. Densities are highest for SWMF and lowest for OpenGGCM. The OpenGGCM tail currents differ significantly from BATS-R-US and SWMF; (2) A longer preconditioning time allowed the magnetosphere to relax more, giving different positions for the magnetopause with all three models before the IMF Bz reversal. There were differences greater than 100% for all three models before the IMF Bz reversal. The differences in the current sheet region for the OpenGGCM were small after the IMF Bz reversal. The BATS-R-US and SWMF differences decreased after the IMF Bz reversal to near zero; (3) For extreme conditions in the solar

  12. Software Validation via Model Animation

    Science.gov (United States)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
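
    The comparison step can be pictured as below: the same randomly generated inputs are fed to the implementation under test and to a reference evaluation of the formal model, and the outputs must agree within a tolerance. Both functions here are trivial stand-ins; in the paper the reference outputs come from PVSio evaluating the PVS models.

        import math
        import random

        def trajectory_software(x, y, vx, vy, t):
            return x + vx * t, y + vy * t       # implementation under test

        def trajectory_reference(x, y, vx, vy, t):
            return x + vx * t, y + vy * t       # stand-in for the formal model's output

        TOL = 1e-9
        rng = random.Random(42)
        for _ in range(1000):                   # randomly generated test suite
            args = [rng.uniform(-1e4, 1e4) for _ in range(5)]
            got, want = trajectory_software(*args), trajectory_reference(*args)
            assert all(math.isclose(g, w, abs_tol=TOL) for g, w in zip(got, want))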

  13. Modeling and Simulation Validation for X-CT Projection Data

    Institute of Scientific and Technical Information of China (English)

    张剑飞; 刘同双

    2012-01-01

    Virtual X-ray CT simulation is an important tool for CT technology research, and the reliability of research results depends on the accuracy and applicability of the virtual simulation system. In order to develop a virtual CT system much closer to a real X-CT system, a new computer simulation model for X-CT projection data is presented in this paper. The model incorporates polychromaticity, scatter, quantum and electronic noise, the detective quantum efficiency, the photoelectric conversion efficiency of the detector, detector element size, and other attributes of the CT system. After modeling the projection data, filtered back projection based on the Hilbert transform is adopted to reconstruct the CT image. Experiments demonstrate that the proposed model effectively handles the beam hardening, scatter, and noise that arise during simulation, and reflects the CT imaging process accurately and realistically.
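
    The polychromatic projection model described above amounts to a spectrum-weighted Beer-Lambert sum, which reproduces beam hardening. The sketch below is a generic illustration with invented spectrum and attenuation values, not the paper's simulation system.

        import numpy as np

        energies = np.array([40.0, 70.0, 100.0])    # keV bins (assumed)
        spectrum = np.array([0.3, 0.5, 0.2])        # relative photon weights
        mu_water = np.array([0.27, 0.19, 0.17])     # 1/cm, decreasing with energy

        def detected_intensity(path_length_cm):
            """I = sum_E S(E) * exp(-mu(E) * L): polychromatic line integral."""
            return np.sum(spectrum * np.exp(-mu_water * path_length_cm))

        L = np.linspace(0.0, 20.0, 50)
        p = np.array([-np.log(detected_intensity(l)) for l in L])  # measured projection
        # Beam hardening: p grows sub-linearly in L, unlike the monochromatic case.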

  14. Flight Research and Validation Formerly Experimental Capabilities Supersonic Project

    Science.gov (United States)

    Banks, Daniel

    2009-01-01

    This slide presentation reviews the work of the Experimental Capabilities Supersonic project, which is being reorganized into Flight Research and Validation. The work of the Experimental Capabilities project in FY '09 is reviewed, and the specific centers assigned to do the work are given. The portfolio of the newly formed Flight Research and Validation (FRV) group is also reviewed. The various FRV projects for FY '10 are detailed. These projects include: Eagle Probe, Channeled Centerbody Inlet Experiment (CCIE), Supersonic Boundary Layer Transition test (SBLT), Aero-elastic Test Wing-2 (ATW-2), G-V External Vision Systems (G5 XVS), Air-to-Air Schlieren (A2A), In-Flight Background Oriented Schlieren (BOS), Dynamic Inertia Measurement technique (DIM), and Advanced In-Flight IR Thermography (AIR-T).

  15. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
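
    As an illustration of the sampling-based propagation of variability mentioned above, the sketch below pushes randomly sampled inputs through a simple forward model (a cantilever tip deflection, chosen purely for illustration) and reports a prediction interval; all distributions and values are assumptions.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 10_000
        E = rng.normal(200e9, 10e9, n)         # Young's modulus (Pa), aleatoric spread
        F = rng.normal(1000.0, 50.0, n)        # applied load (N)
        L, I = 1.0, 8.33e-7                    # fixed geometry (m, m^4)

        deflection = F * L**3 / (3.0 * E * I)  # forward model evaluated per sample
        lo, hi = np.percentile(deflection, [2.5, 97.5])
        print(f"95% prediction interval: [{lo * 1000:.2f}, {hi * 1000:.2f}] mm")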

  16. Obstructive lung disease models: what is valid?

    Science.gov (United States)

    Ferdinands, Jill M; Mannino, David M

    2008-12-01

    Use of disease simulation models has led to scrutiny of model methods and demand for evidence that models credibly simulate health outcomes. We sought to describe recent obstructive lung disease simulation models and their validation. Medline and EMBASE were used to identify obstructive lung disease simulation models published from January 2000 to June 2006. Publications were reviewed to assess model attributes and four types of validation: first-order (verification/debugging), second-order (comparison with studies used in model development), third-order (comparison with studies not used in model development), and predictive validity. Six asthma and seven chronic obstructive pulmonary disease models were identified. Seven (54%) models included second-order validation, typically by comparing observed outcomes to simulations of source study cohorts. Seven (54%) models included third-order validation, in which modeled outcomes were usually compared qualitatively for agreement with studies independent of the model. Validation endpoints included disease prevalence, exacerbation, and all-cause mortality. Validation was typically described as acceptable, despite near-universal absence of criteria for judging adequacy of validation. Although over half of recent obstructive lung disease simulation models report validation, inconsistencies in validation methods and lack of detailed reporting make assessing adequacy of validation difficult. For simulation modeling to be accepted as a tool for evaluating clinical and public health programs, models must be validated to credibly simulate health outcomes of interest. Defining the required level of validation and providing guidance for quantitative assessment and reporting of validation are important future steps in promoting simulation models as practical decision tools.

  17. Final report on LDRD project : elucidating performance of proton-exchange-membrane fuel cells via computational modeling with experimental discovery and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Chao Yang (Pennsylvania State University, University Park, PA); Pasaogullari, Ugur (Pennsylvania State University, University Park, PA); Noble, David R.; Siegel, Nathan P.; Hickner, Michael A.; Chen, Ken Shuang

    2006-11-01

    In this report, we document the accomplishments in our Laboratory Directed Research and Development project in which we employed a technical approach of combining experiments with computational modeling and analyses to elucidate the performance of hydrogen-fed proton exchange membrane fuel cells (PEMFCs). In the first part of this report, we document our focused efforts on understanding water transport in and removal from a hydrogen-fed PEMFC. Using a transparent cell, we directly visualized the evolution and growth of liquid-water droplets at the gas diffusion layer (GDL)/gas flow channel (GFC) interface. We further carried out a detailed experimental study to observe, via direct visualization, the formation, growth, and instability of water droplets at the GDL/GFC interface using a specially-designed apparatus, which simulates the cathode operation of a PEMFC. We developed a simplified model, based on our experimental observation and data, for predicting the onset of water-droplet instability at the GDL/GFC interface. Using a state-of-the-art neutron imaging instrument available at NIST (National Institute of Standard and Technology), we probed liquid-water distribution inside an operating PEMFC under a variety of operating conditions and investigated effects of evaporation due to local heating by waste heat on water removal. Moreover, we developed computational models for analyzing the effects of micro-porous layer on net water transport across the membrane and GDL anisotropy on the temperature and water distributions in the cathode of a PEMFC. We further developed a two-phase model based on the multiphase mixture formulation for predicting the liquid saturation, pressure drop, and flow maldistribution across the PEMFC cathode channels. In the second part of this report, we document our efforts on modeling the electrochemical performance of PEMFCs. We developed a constitutive model for predicting proton conductivity in polymer electrolyte membranes and compared

  18. Unit testing, model validation, and biological simulation

    Science.gov (United States)

    Watts, Mark D.; Ghayoomie, S. Vahid; Larson, Stephen D.; Gerkin, Richard C.

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software that implements scientific models. PMID:27635225
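
    A model validation test in this sense can be as simple as the hypothetical pytest-style check below, which asserts that a simulated quantity falls within experimentally observed bounds; the model function and the bounds are invented for illustration, not OpenWorm code.

        def simulate_membrane_potential_mv():
            return -68.0        # stand-in for a model run

        def test_resting_potential_matches_experiment():
            observed_range = (-75.0, -60.0)   # assumed experimental bounds (mV)
            v = simulate_membrane_potential_mv()
            assert observed_range[0] <= v <= observed_range[1], (
                f"resting potential {v} mV outside observed range {observed_range}")

        test_resting_potential_matches_experiment()   # run directly or via pytest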

  19. Knowledge Model: Project Knowledge Management

    DEFF Research Database (Denmark)

    Durao, Frederico; Dolog, Peter; Grolin, Daniel

    2009-01-01

    The Knowledge model for project management serves several goals:Introducing relevant concepts of project management area for software development (Section 1). Reviewing and understanding the real case requirements from the industrial perspective. (Section 2). Giving some preliminary suggestions f...

  20. Knowledge Model: Project Knowledge Management

    DEFF Research Database (Denmark)

    Durao, Frederico; Dolog, Peter; Grolin, Daniel

    2009-01-01

    The Knowledge model for project management serves several goals:Introducing relevant concepts of project management area for software development (Section 1). Reviewing and understanding the real case requirements from the industrial perspective. (Section 2). Giving some preliminary suggestions...

  1. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a network.

  2. Aeroservoelastic Modeling Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CFDRC proposes to develop, validate and demonstrate a comprehensive aeroservoelastic analysis framework for aerospace vehicles by enabling coupled interactions...

  3. Contact Modelling in Resistance Welding, Part II: Experimental Validation

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Contact algorithms in resistance welding presented in the previous paper are experimentally validated in the present paper. In order to verify the mechanical contact algorithm, two types of experiments, i.e. sandwich upsetting of circular, cylindrical specimens and compression tests of discs with a solid ring projection towards a flat ring, are carried out at room temperature. The complete algorithm, involving not only the mechanical model but also the thermal and electrical models, is validated by projection welding experiments. The experimental results are in satisfactory agreement...

  4. Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project

    Energy Technology Data Exchange (ETDEWEB)

    Stottler, Gary

    2012-02-08

    General Motors, LLC and energy partner Shell Hydrogen, LLC deployed a system of hydrogen fuel cell electric vehicles integrated with a hydrogen fueling station infrastructure to operate under real-world conditions as part of the U.S. Department of Energy's Controlled Hydrogen Fleet and Infrastructure Validation and Demonstration Project. This technical report documents the performance of, and describes the lessons learned from, progressive generations of vehicle fuel cell system technology and multiple approaches to hydrogen generation and delivery for vehicle fueling.

  5. On grey relation projection model based on projection pursuit

    Institute of Scientific and Technical Information of China (English)

    Wang Shuo; Yang Shanlin; Ma Xijun

    2008-01-01

    A multidimensional grey relation projection value can be synthesized into a one-dimensional projection value by using a projection pursuit model. The larger the projection value, the better the model; thus, according to the projection value, the best model can be chosen from the model aggregation. Because projection pursuit modeling based on an accelerating genetic algorithm can simplify the implementation procedure of the projection pursuit technique and overcome its complex calculation, as well as the difficulty in implementing its program, a new method is obtained for choosing the best grey relation projection model based on the projection pursuit technique.

  6. OC5 Project Phase I: Validation of Hydrodynamic Loading on a Fixed Cylinder: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, A. N.; Wendt, F. F.; Jonkman, J. M.; Popko, W.; Vorpahl, F.; Stansberg, C. T.; Bachynski, E. E.; Bayati, I.; Beyer, F.; de Vaal, J. B.; Harries, R.; Yamaguchi, A.; Shin, H.; Kim, B.; van der Zee, T.; Bozonnet, P.; Aguilo, B.; Bergua, R.; Qvist, J.; Qijun, W.; Chen, X.; Guerinel, M.; Tu, Y.; Yutong, H.; Li, R.; Bouy, L.

    2015-04-23

    This paper describes work performed during the first half of Phase I of the Offshore Code Comparison Collaboration Continuation, with Correlation project (OC5). OC5 is a project run under IEA Wind Research Task 30 and is focused on validating the tools used for modeling offshore wind systems. In this first phase, simulated responses from a variety of offshore wind modeling tools were validated against tank test data for a fixed, suspended cylinder (without a wind turbine) that was tested under regular and irregular wave conditions at MARINTEK. The results from this phase include an examination of the different approaches one can use for defining and calibrating hydrodynamic coefficients for a model, and the importance of higher-order wave models in accurately modeling the hydrodynamic loads on offshore substructures.
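
    The hydrodynamic-coefficient calibration discussed above typically concerns a Morison-type load model; the sketch below shows such a model for a fixed cylinder, with assumed drag and inertia coefficients (Cd, Cm) and toy regular-wave kinematics standing in for the tank-test conditions.

        import numpy as np

        rho, D = 1025.0, 6.0                    # water density (kg/m^3), diameter (m)
        Cd, Cm = 1.0, 2.0                       # assumed; the quantities to calibrate

        def morison_force_per_length(u, du_dt):
            """f = 0.5*rho*Cd*D*u|u| + rho*Cm*(pi*D^2/4)*du/dt."""
            drag = 0.5 * rho * Cd * D * u * np.abs(u)
            inertia = rho * Cm * (np.pi * D**2 / 4.0) * du_dt
            return drag + inertia

        t = np.linspace(0, 10, 200)
        omega, u_amp = 0.8, 1.5                 # regular-wave frequency, velocity amplitude
        u = u_amp * np.cos(omega * t)           # horizontal particle velocity
        f = morison_force_per_length(u, -u_amp * omega * np.sin(omega * t))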

  7. Does assessing project work enhance the validity of qualifications? The case of GCSE coursework

    Directory of Open Access Journals (Sweden)

    Victoria Crisp

    2009-03-01

    Full Text Available This paper begins by describing current views on validity and how certain assessment forms, such as school-based project work, may enhance validity. It then touches on debates about the dependability of assessment by teachers. GCSEs and GCSE coursework are then described, along with the reasons for the inclusion of coursework in many GCSEs. Crooks, Kane and Cohen's (1996) chain model of eight linked stages of validity enquiry is then used as a structure within which to consider the validity of project work assessments, and specifically GCSE coursework assessment, drawing on the available literature. Strengths for validity include the ability to assess objectives that are difficult to test in written examinations, promoting additional skills such as critical thinking, creativity and independent thinking, and improving motivation. Possible threats to validity include the potential for internet and other types of plagiarism, tasks becoming overly structured and formulaic thus reducing the positive impact on learning, and the potentially heavy workload for teachers and students. The paper concludes by describing current policy changes in the UK with regard to GCSE coursework and relates this to strong and weak validity links for project work as a mode of assessment.

  8. On validation of multibody musculoskeletal models

    DEFF Research Database (Denmark)

    Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper;

    2012-01-01

    This paper reviews the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need for validation... Opportunities for improvement of the validation of multibody musculoskeletal models are pointed out and directions for future research in the field are proposed. It is our hope that a more structured approach to model validation can help to improve the credibility of musculoskeletal models.

  9. Optimal Data Split Methodology for Model Validation

    CERN Document Server

    Morrison, Rebecca; Terejanu, Gabriel; Miki, Kenji; Prudhomme, Serge

    2011-01-01

    The decision to incorporate cross-validation into validation processes of mathematical models raises an immediate question - how should one partition the data into calibration and validation sets? We answer this question systematically: we present an algorithm to find the optimal partition of the data subject to certain constraints. While doing this, we address two critical issues: 1) that the model be evaluated with respect to predictions of a given quantity of interest and its ability to reproduce the data, and 2) that the model be highly challenged by the validation set, assuming it is properly informed by the calibration set. This framework also relies on the interaction between the experimentalist and/or modeler, who understand the physical system and the limitations of the model; the decision-maker, who understands and can quantify the cost of model failure; and the computational scientists, who strive to determine if the model satisfies both the modeler's and decision maker's requirements. We also note...
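
    As a simplified illustration of the partitioning question, the sketch below scores candidate calibration/validation splits by how far the validation points sit from the calibration set, preferring the most challenging split. The distance-based score is a stand-in for the paper's constraint-based criteria, and the data are synthetic.

        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(3)
        x = rng.uniform(0, 1, size=(8, 2))           # input settings of 8 experiments

        def challenge_score(val_idx):
            """Mean distance from each validation point to its nearest
            calibration point: larger = more challenging validation set."""
            cal_idx = [i for i in range(len(x)) if i not in val_idx]
            d = np.linalg.norm(x[list(val_idx)][:, None, :] - x[cal_idx][None, :, :], axis=-1)
            return d.min(axis=1).mean()

        best = max(combinations(range(len(x)), 3), key=challenge_score)
        print("most challenging validation set:", best)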

  10. Validation of systems biology models

    NARCIS (Netherlands)

    Hasdemir, D.

    2015-01-01

    The paradigm shift from qualitative to quantitative analysis of biological systems brought a substantial number of modeling approaches to the stage of molecular biology research. These include but certainly are not limited to nonlinear kinetic models, static network models and models obtained by the

  11. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  12. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation is derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation is sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  13. Empirical data validation for model building

    Science.gov (United States)

    Kazarian, Aram

    2008-03-01

    Optical Proximity Correction (OPC) has become an integral and critical part of process development for advanced technologies with challenging k1 requirements. OPC solutions in turn require stable, predictive models to be built that can project the behavior of all structures. These structures must comprehend all geometries that can occur in the layout in order to define the optimal corrections by feature, and thus enable a manufacturing process with acceptable margin. The model is built upon two main component blocks. First is knowledge of the process conditions, which includes the optical parameters (e.g. illumination source, wavelength, lens characteristics, etc.) as well as mask definition, resist parameters and process film stack information. Second is the empirical critical dimension (CD) data collected using this process on specific test features, the results of which are used to fit and validate the model and to project resist contours for all allowable feature layouts. The quality of the model therefore is highly dependent on the integrity of the process data collected for this purpose. Since the test pattern suite generally extends to below the resolution limit that the process can support with adequate latitude, the CD measurements collected can often be quite noisy with marginal signal-to-noise ratios. In order for the model to be reliable and a best representation of the process behavior, it is necessary to scrutinize empirical data to ensure that it is not dominated by measurement noise or flyer/outlier points. The primary approach for generating a clean, smooth and dependable empirical data set should be a replicated measurement sampling that can help to statistically reduce measurement noise by averaging. However, it can often be impractical to collect the amount of data needed to ensure a clean data set by this method. An alternate approach is studied in this paper to further smooth the measured data by means of curve fitting to identify remaining
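
    The replicated-sampling approach described here can be sketched simply: repeated CD measurements of the same feature are screened for flyer/outlier points and averaged to suppress metrology noise. The robust threshold and all data values below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)
        cd_reps = 45.0 + rng.normal(0, 0.8, size=10)       # replicated CD reads (nm)
        cd_reps[3] = 52.0                                  # injected flyer point

        med = np.median(cd_reps)
        mad = np.median(np.abs(cd_reps - med))             # robust spread estimate
        keep = np.abs(cd_reps - med) < 3.0 * 1.4826 * mad  # reject flyers/outliers
        cd_clean = cd_reps[keep].mean()                    # averaged, de-noised CD
        print(f"clean CD: {cd_clean:.2f} nm ({keep.sum()} of {cd_reps.size} reads kept)")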

  14. The European Food Consumption Validation Project: conclusions and recommendations

    DEFF Research Database (Denmark)

    de Boer, E. J.; Slimani, N.; van 't Veer, P.

    2011-01-01

    Background/Objectives: To outline and discuss the main results and conclusions of the European Food Consumption Validation (EFCOVAL) Project. Subjects/Methods: The EFCOVAL Project was carried out within the EU Sixth Framework Program by researchers in 11 EU countries. The activities focused on (1)... Results showed that two non-consecutive EPIC-Soft 24-HDRs are suitable for estimating the usual intake distributions of protein and potassium in European adult populations. The two-day non-consecutive 24-HDRs, in combination with a food propensity questionnaire, also appeared to be appropriate for ranking individuals according to their fish, fruit and vegetable intake in a comparable way in five European centers. Dietary intake of (young) children can be assessed by the combination of EPIC-Soft 24-HDRs and food recording booklets. The EPIC-Soft-standardized method of describing foods is useful to estimate dietary...

  15. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even with simple unified modelling language (UML) requirements models, it is not easy for the development team to gain confidence in the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  16. FY2017 Pilot Project Plan for the Nuclear Energy Knowledge and Validation Center Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Weiju [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-30

    To prepare for technical development of computational code validation under the Nuclear Energy Knowledge and Validation Center (NEKVAC) initiative, several meetings were held by a group of experts from the Idaho National Laboratory (INL) and the Oak Ridge National Laboratory (ORNL) to develop requirements for, and formulate a structure for, a transient fuel database by leveraging existing resources. It was concluded in these meetings that a pilot project is needed to address the most fundamental issues that can generate immediate stimulus for near-future validation developments as well as long-lasting benefits for NEKVAC operation. The present project is proposed based on the consensus of these discussions. Analysis of common scenarios in code validation indicates that the inability to acquire satisfactory validation data is often a showstopper that must be tackled before any confident validation developments can be carried out. Validation data are usually found scattered in different places, most likely with interrelationships among the data not well documented, incomplete with information for some parameters missing, nonexistent, or unrealistic to generate experimentally. Furthermore, with very different technical backgrounds, the modeler, the experimentalist, and the knowledgebase developer that must be involved in validation data development often cannot communicate effectively without a data package template that is representative of the data structure for the information domain of interest to the desired code validation. This pilot project is proposed to use the legendary TREAT Experiments Database to provide core elements for creating an ideal validation data package. Data gaps and missing data interrelationships will be identified from these core elements. All the identified missing elements will then be filled in with experimental data if available from other existing sources or with dummy data if nonexistent. The resulting hybrid

  17. Quantitative model validation techniques: new insights

    CERN Document Server

    Ling, You

    2012-01-01

    This paper develops new insights into quantitative methods for the validation of computational model prediction. Four types of methods are investigated, namely classical and Bayesian hypothesis testing, a reliability-based method, and an area metric-based method. Traditional Bayesian hypothesis testing is extended based on interval hypotheses on distribution parameters and equality hypotheses on probability distributions, in order to validate models with deterministic/stochastic output for given inputs. Two types of validation experiments are considered - fully characterized (all the model/experimental inputs are measured and reported as point values) and partially characterized (some of the model/experimental inputs are not measured or are reported as intervals). Bayesian hypothesis testing can minimize the risk in model selection by properly choosing the model acceptance threshold, and its results can be used in model averaging to avoid Type I/II errors. It is shown that Bayesian interval hypothesis testing...
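
    Of the four method types named above, the area metric is the most direct to illustrate: it measures the area between the data's empirical CDF and the model's predicted CDF. The sketch below is a generic implementation on synthetic samples, not the paper's code.

        import numpy as np

        def area_metric(model_samples, data_samples, grid_size=1000):
            """Area between the model-prediction CDF and the empirical data CDF,
            both represented here by samples."""
            lo = min(model_samples.min(), data_samples.min())
            hi = max(model_samples.max(), data_samples.max())
            grid = np.linspace(lo, hi, grid_size)
            cdf_m = np.searchsorted(np.sort(model_samples), grid, side="right") / model_samples.size
            cdf_d = np.searchsorted(np.sort(data_samples), grid, side="right") / data_samples.size
            return np.trapz(np.abs(cdf_m - cdf_d), grid)

        rng = np.random.default_rng(5)
        print(area_metric(rng.normal(0.0, 1.0, 5000), rng.normal(0.2, 1.1, 50)))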

  18. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  19. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    D J Ewins

    2000-06-01

    In this paper, a review is presented of the various methods which are available for the purpose of performing a systematic comparison and correlation between two sets of vibration data. In the present case, the application of interest is in conducting this correlation process as a prelude to model correlation or updating activity.

  20. Development and validation of a building design waste reduction model.

    Science.gov (United States)

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.

  1. Validation of the Hot Strip Mill Model

    Energy Technology Data Exchange (ETDEWEB)

    Richard Shulkosky; David Rosberg; Jerrud Chapman

    2005-03-30

    The Hot Strip Mill Model (HSMM) is an off-line, PC-based software package originally developed by the University of British Columbia (UBC) and the National Institute of Standards and Technology (NIST) under the AISI/DOE Advanced Process Control Program. The HSMM was developed to predict the temperatures, deformations, microstructure evolution and mechanical properties of steel strip or plate rolled in a hot mill. INTEG process group inc. undertook the current task of enhancing and validating the technology. With the support of five North American steel producers, INTEG process group tested and validated the model using actual operating data from the steel plants and enhanced the model to improve prediction results.

  2. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, L.F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  3. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques for model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
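
    As an illustration of the kind of quantitative performance measures such a procedure relies on, the sketch below checks discrimination (hold-out AUC) and decile-wise calibration for a logistic regression on simulated data; the specific measures used in the paper may differ.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(11)
        X = rng.normal(size=(2000, 3))
        y = (rng.uniform(size=2000) < 1 / (1 + np.exp(-(X @ [1.0, -0.5, 0.3])))).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        p = LogisticRegression().fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

        print("hold-out AUC:", roc_auc_score(y_te, p))
        deciles = np.digitize(p, np.quantile(p, np.linspace(0.1, 0.9, 9)))
        for d in range(10):               # calibration: predicted vs. observed by decile
            m = deciles == d
            if m.any():
                print(d, round(p[m].mean(), 3), round(y_te[m].mean(), 3))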

  4. Feature extraction for structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois [Los Alamos National Laboratory; Farrar, Charles [Los Alamos National Laboratory; Park, Gyuhae [Los Alamos National Laboratory; Nishio, Mayuko [UNIV OF TOKYO; Worden, Keith [UNIV OF SHEFFIELD; Takeda, Nobuo [UNIV OF TOKYO

    2010-11-08

    This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing those response features extracted from experimental data and numerical outputs, validation and uncertainty quantification of a numerical model containing uncertain parameters can be realized. In this study, the applicability of some response features to model validation is first discussed using measured data from a simple test-bed structure and the associated numerical simulations of these experiments. Issues that must be considered include sensitivity, dimensionality, type of response, and presence or absence of measurement noise in the response. Furthermore, we illustrate a comparison method of multivariate feature vectors for statistical model validation. Results show that the outlier detection technique using the Mahalanobis distance metric can be used as an effective and quantifiable technique for selecting appropriate model parameters. However, in this process, one must not only consider the sensitivity of the features being used, but also the correlation of the parameters being compared.
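
    The Mahalanobis-distance comparison mentioned above can be sketched as follows: feature vectors from candidate simulations are scored against the mean and covariance of experimentally derived features, and large distances flag inappropriate parameter sets. All feature values here are invented.

        import numpy as np

        exp_features = np.random.default_rng(9).normal([1.0, 0.5], [0.05, 0.02], size=(30, 2))
        mu = exp_features.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(exp_features, rowvar=False))

        def mahalanobis(feature_vec):
            """Distance of a candidate feature vector from the experimental cloud."""
            d = feature_vec - mu
            return float(np.sqrt(d @ cov_inv @ d))

        candidates = {"params_A": np.array([1.01, 0.51]),   # consistent with experiments
                      "params_B": np.array([1.30, 0.40])}   # likely an outlier
        for name, f in candidates.items():
            print(name, round(mahalanobis(f), 2))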

  5. A Procedural Model for Process Improvement Projects

    OpenAIRE

    Kreimeyer, Matthias; Daniilidis, Charampos; Lindemann, Udo

    2017-01-01

    Process improvement projects are of a complex nature. It is therefore necessary to use experience and knowledge gained in previous projects when executing a new project. Yet, there are few pragmatic planning aids, and transferring the institutional knowledge from one project to the next is difficult. This paper proposes a procedural model that extends common models for project planning to enable staff on a process improvement project to adequately plan their projects, enabling them to documen...

  6. Gear Windage Modeling Progress - Experimental Validation Status

    Science.gov (United States)

    Kunz, Rob; Handschuh, Robert F.

    2008-01-01

    In the Subsonics Rotary Wing (SRW) Project being funded for propulsion work at NASA Glenn Research Center, performance of the propulsion system is of high importance. In current rotorcraft drive systems, many gearing components operate at high rotational speed (pitch line velocity > 24000 ft/min). In our testing of high-speed helical gear trains at NASA Glenn, we have found that the work done on the air-oil mist within the gearbox can become a significant part of the power loss of the system. This loss mechanism is referred to as windage. The effort described in this presentation is to understand the variables that affect windage and to develop a good experimental database to validate the analytical project being conducted at Penn State University by Dr. Rob Kunz under a NASA SRW NRA. The presentation provides an update on the status of these efforts.

  7. Regimes of validity for balanced models

    Science.gov (United States)

    Gent, Peter R.; McWilliams, James C.

    1983-07-01

    Scaling analyses are presented which delineate the atmospheric and oceanic regimes of validity for the family of balanced models described in Gent and McWilliams (1983a). The analyses follow and extend the classical work of Charney (1948) and others. The analyses use three non-dimensional parameters which represent the flow scale relative to the Earth's radius, the dominance of turbulent or wave-like processes, and the dominant component of the potential vorticity. For each regime, the models that are accurate both at leading order and through at least one higher order of accuracy in the appropriate small parameter are then identified. In particular, it is found that members of the balanced family are the appropriate models of higher-order accuracy over a broad range of parameter regimes. Examples are also given of particular atmospheric and oceanic phenomena which are in the regimes of validity for the different balanced models.

  8. Validation of Hadronic Models in GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Koi, Tatsumi; Wright, Dennis H.; /SLAC; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; /CERN; Heikkinen, Aatos; /Helsinki Inst. of Phys.; Truscott, Pete; Lei, Fan; /QinetiQ; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  9. Validation of hadronic models in GEANT4

    CERN Document Server

    Koi, Tatsumi; Folger, Günter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; Heikkinen, Aatos; Truscott, Pete; Lei, Fan; Wellisch, Hans-Peter

    2007-01-01

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  10. A proposed model for construction project management ...

    African Journals Online (AJOL)

    The lack of a proper communication skills model for project management may ... done to identify the most important project management communication skills and applications of communication that effective project managers should possess.

  11. The Validation of Climate Models: The Development of Essential Practice

    Science.gov (United States)

    Rood, R. B.

    2011-12-01

    It is possible from both a scientific and philosophical perspective to state that climate models cannot be validated. However, with the realization that the scientific investigation of climate change is as much a subject of politics as of science, maintaining this formal notion of "validation" has significant consequences. For example, it relegates the bulk of work of many climate scientists to an exercise of model evaluation that can be construed as ill-posed. Even within the science community this motivates criticism of climate modeling as an exercise of weak scientific practice. Stepping outside of the science community, statements that validation is impossible are used in political arguments to discredit the scientific investigation of climate, to maintain doubt about projections of climate change, and hence, to prohibit the development of public policy to regulate the emissions of greenhouse gases. With the acceptance of the impossibility of validation, scientists often state that the credibility of models can be established through an evaluation process. A robust evaluation process leads to the quantitative description of the modeling system against a standard set of measures. If this process is standardized as institutional practice, then this provides a measure of model performance from one modeling release to the next. It is argued, here, that such a robust and standardized evaluation of climate models can be structured and quantified as "validation." Arguments about the nuanced meaning of validation and evaluation are a subject about which the climate modeling community needs to develop a standard. It does injustice to a body of science-based knowledge to maintain that validation is "impossible." Rather than following such a premise, which immediately devalues the knowledge base, it is more useful to develop a systematic, standardized approach to robust, appropriate validation. This stands to represent the complexity of the Earth's climate and its

  12. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois;

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  13. A Review of Models and Procedures for Synthetic Validation for Entry-Level Army Jobs

    Science.gov (United States)

    1988-12-01

    ARI Research Note 88-107: A Review of Models and Procedures for Synthetic Validation for Entry-Level Army Jobs. Personal author(s): Crafts, Jennifer L.; Szenas, Philip L.; Chia... well as ability. Project A validity results: Campbell (1986) and McHenry, Hough, Toquam, Hanson, and Ashworth (1987) have conducted extensive

  14. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
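
    The hydrostatic core of such a model can be pictured as the cavern pressure minus the weight of the stacked fluid columns in the well; the sketch below uses illustrative densities and column heights, not SPR well data.

        G = 9.81                                   # gravitational acceleration (m/s^2)

        def wellhead_pressure(p_cavern_pa, columns):
            """columns: list of (density kg/m^3, height m), top of well downward;
            wellhead pressure = cavern pressure minus the hydrostatic head."""
            return p_cavern_pa - sum(rho * G * h for rho, h in columns)

        columns = [(180.0, 300.0),    # nitrogen column
                   (850.0, 200.0),    # crude oil column
                   (1200.0, 100.0)]   # brine column down to the cavern
        print(f"{wellhead_pressure(12.0e6, columns) / 1e6:.2f} MPa at wellhead")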

  15. Evolution of the JPSS Ground Project Calibration and Validation System

    Science.gov (United States)

    Chander, G.; Jain, P.

    2014-12-01

    The Joint Polar Satellite System (JPSS) is the National Oceanic and Atmospheric Administration's (NOAA) next-generation operational Earth observation program that acquires and distributes global environmental data from multiple polar-orbiting satellites. The JPSS Program plays a critical role in NOAA's mission to understand and predict changes in weather, climate, ocean, and coastal environments, which supports the nation's economy and protects lives and property. The National Aeronautics and Space Administration (NASA) is acquiring and implementing the JPSS, comprising flight and ground systems, on behalf of NOAA. The JPSS satellites are planned to fly in an afternoon orbit and will provide operational continuity of satellite-based observations and products for NOAA's Polar-orbiting Operational Environmental Satellites (POES) and the Suomi National Polar-orbiting Partnership (SNPP) satellite. The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) system is a NOAA system developed and deployed by the JPSS Ground Project to support calibration and validation (Cal/Val); algorithm integration, investigation, and tuning; and data quality monitoring. It is a mature, deployed system that supports the SNPP mission and has been in operations since the SNPP launch. This paper discusses the major re-architecture for Block 2.0, which incorporates SNPP lessons learned, describes the architecture of the system, and demonstrates how GRAVITE has evolved as a system with increased performance. It is a robust, reliable, maintainable, scalable, and secure system that supports development, test, and production strings, replaces proprietary and custom software, uses open source software, and is compliant with NASA and NOAA standards.

  16. Model validation in soft systems practice

    Energy Technology Data Exchange (ETDEWEB)

    Checkland, P. [Univ. of Lancaster (United Kingdom)

    1995-03-01

    The concept of 'a model' usually evokes the connotation 'model of part of the real world'. That is an almost automatic response. It makes sense especially in relation to the way the concept has been developed and used in natural science. Classical operational research (OR), with its scientific aspirations, and systems engineering use the concept in the same way and, in addition, use models as surrogates for the real world, on which experimentation is cheap. In these fields the key feature of a model is representativeness. In soft systems methodology (SSM), models are not of part of the world; they are only relevant to debate about the real world and are used in a cyclic learning process. The paper shows how the different concepts of validation in classical OR and SSM lead to a way of sharply defining the nature of 'soft OR'. 21 refs.

  17. Wake models developed during the Wind Shadow project

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, S.; Ott, S.; Pena, A.; Berg, J.; Nielsen, M.; Rathmann, O.; Joergensen, H.

    2011-11-15

    The Wind Shadow project has developed and validated improved models for determining wake losses, and thereby the array efficiency, of very large, closely packed wind farms. The rationale behind the project has been that existing software covered these types of wind farms poorly, both with respect to the densely packed turbines and with respect to the large fetches needed to describe the collective shadow effect of one farm on the next. Furthermore, the project has developed the necessary software for using the models. Guidelines with recommendations for the use of the models are included in the model deliverables. The project has been carried out as a collaboration between Risoe DTU, DONG, Vattenfall, DNV and VESTAS, and it has been financed by energinet.dk grant no. 10086. (Author)
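
    For orientation only: the classic Jensen (Park) single-wake model, a common baseline in this field though not necessarily the formulation developed in Wind Shadow, treats the wake as a linearly expanding top-hat velocity deficit:

```python
import math

def jensen_deficit(ct, rotor_d, x, k=0.05):
    """Fractional velocity deficit a distance x downstream of a turbine
    under the Jensen (Park) top-hat wake model.

    ct: thrust coefficient, rotor_d: rotor diameter (m),
    k: wake decay constant (offshore values around 0.04-0.05 are typical).
    """
    if x <= 0:
        return 0.0
    dw = rotor_d + 2 * k * x                      # linearly expanding wake diameter
    return (1 - math.sqrt(1 - ct)) * (rotor_d / dw) ** 2

u_inf = 10.0                                      # free-stream wind speed, m/s
deficit = jensen_deficit(ct=0.8, rotor_d=120.0, x=840.0)   # 7 D spacing
print(f"Waked wind speed at 7 D: {u_inf * (1 - deficit):.2f} m/s")
```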

  18. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project.
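
    A minimal sketch of the general approach, combining an optimism-bias uplift on investment costs with Monte Carlo sampling of uncertain benefits; the distributions, parameters, and 40% uplift are invented for illustration and are not taken from CBA-DK:

```python
import math
import random

def simulate_bcr(n=100_000, uplift=1.40, seed=1):
    """Monte Carlo benefit-cost ratios with an optimism-bias cost uplift.

    Distributions and parameters are illustrative, not from CBA-DK.
    """
    random.seed(seed)
    ratios = []
    for _ in range(n):
        benefits = random.lognormvariate(math.log(900.0), 0.25)  # uncertain benefits
        base_cost = random.triangular(600.0, 1000.0, 750.0)      # low, high, mode
        ratios.append(benefits / (base_cost * uplift))           # uplifted cost
    return ratios

bcr = sorted(simulate_bcr())
print(f"Median BCR: {bcr[len(bcr) // 2]:.2f}")
print(f"P(BCR < 1): {sum(r < 1.0 for r in bcr) / len(bcr):.1%}")
```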

  19. Space market model development project

    Science.gov (United States)

    Bishop, Peter C.

    1987-01-01

    The objectives of the research program, the Space Market Model Development Project (Phase 1), were: (1) to study the need for business information in the commercial development of space; and (2) to propose a design for an information system to meet the identified needs. Three simultaneous research strategies were used in proceeding toward this goal: (1) to describe the space business information which currently exists; (2) to survey government and business representatives on the information they would like to have; and (3) to investigate the feasibility of generating new economical information about the space industry.

  20. Verification and Validation of Flight Critical Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Verification and Validation is a multi-disciplinary activity that encompasses elements of systems engineering, safety, software engineering and test. The elements...

  1. Validation of Air Traffic Controller Workload Models

    Science.gov (United States)

    1979-09-01

    SAR) tapes during the data reduction phase of the project. Kentron International Limited provided the software support for the project. This included... ETABS) or to revised traffic control procedures. The models also can be used to verify productivity benefits after new configurations have been... collected and processed manually. A preliminary comparison has been made between standard NAS Stage A and ETABS operations at Miami.

  2. Information systems validation using formal models

    Directory of Open Access Journals (Sweden)

    Azadeh Sarram

    2014-03-01

    Full Text Available During the past few years, there has been growing interest in using the Unified Modeling Language (UML) to capture functional requirements. However, the lack of a tool to check the accuracy and logic of the diagrams in this language makes a formal model indispensable. In this study, the primary UML model of a system is converted to a colored Petri net in order to examine the precision of the model. For this purpose, first, definitions of priority and implementation tags for the UML activity diagram are provided; the diagram is then turned into a colored Petri net. Second, the proposed model translates the tags in terms of net transitions, and monitors are used to control the system characteristics. Finally, an executable model of the UML activity diagram is provided so that the designer can simulate the model, using the simulation results to detect and refine the problems of the model. In addition, checking the results shows that the proposed method enhances the authenticity and accuracy of early models, and that the ratio of system validation increases compared with previous methods.
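
    A toy sketch of the executable idea: places, transitions, and token firing. A real colored Petri net also types its tokens and would carry the priority and implementation tags described above; everything here is illustrative:

```python
# Toy Petri net executor: a transition fires when all input places hold tokens.
# A real colored Petri net additionally types the tokens; this sketch is untyped.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = []                 # (name, inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions.append((name, inputs, outputs))

    def step(self):
        for name, ins, outs in self.transitions:
            if all(self.marking.get(p, 0) > 0 for p in ins):
                for p in ins:
                    self.marking[p] -= 1
                for p in outs:
                    self.marking[p] = self.marking.get(p, 0) + 1
                return name                   # fire one enabled transition
        return None                           # terminal marking / deadlock

# Hypothetical activity-diagram fragment 'validate then approve' as a net.
net = PetriNet({"submitted": 1})
net.add_transition("validate", ["submitted"], ["validated"])
net.add_transition("approve", ["validated"], ["approved"])
while (fired := net.step()):
    print(f"fired {fired}: {net.marking}")
```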

  3. Validation methodology focussing on fuel efficiency as applied in the eCoMove project

    NARCIS (Netherlands)

    Themann, P.; Iasi, L.; Larburu, M.; Trommer, S.

    2012-01-01

    This paper discusses the validation approach applied in the eCoMove project (a large-scale EU 7th Framework Programme project). In this project, applications are developed that on the one hand optimise network-wide traffic management and control, and on the other hand advise drivers on the most

  5. Bayesian structural equation modeling method for hierarchical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Jiang Xiaomo [Department of Civil and Environmental Engineering, Vanderbilt University, Box 1831-B, Nashville, TN 37235 (United States)], E-mail: xiaomo.jiang@vanderbilt.edu; Mahadevan, Sankaran [Department of Civil and Environmental Engineering, Vanderbilt University, Box 1831-B, Nashville, TN 37235 (United States)], E-mail: sankaran.mahadevan@vanderbilt.edu

    2009-04-15

    A building block approach to model validation may proceed through various levels, such as material to component to subsystem to system, comparing model predictions with experimental observations at each level. Usually, experimental data becomes scarce as one proceeds from lower to higher levels. This paper presents a structural equation modeling approach to make use of the lower-level data for higher-level model validation under uncertainty, integrating several components: lower-level data, higher-level data, computational model, and latent variables. The method proposed in this paper uses latent variables to model two sets of relationships, namely, the computational model to system-level data, and lower-level data to system-level data. A Bayesian network with Markov chain Monte Carlo simulation is applied to represent the two relationships and to estimate the influencing factors between them. Bayesian hypothesis testing is employed to quantify the confidence in the predictive model at the system level, and the role of lower-level data in the model validation assessment at the system level. The proposed methodology is implemented for hierarchical assessment of three validation problems, using discrete observations and time-series data.
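
    As a highly simplified stand-in for the paper's MCMC-based procedure, Bayesian hypothesis testing for validation can be sketched as a Bayes factor comparing how well the observed residuals fit a model-is-adequate hypothesis against a diffuse alternative; the distributions and parameters below are assumptions:

```python
import numpy as np

def bayes_factor(residuals, sigma_meas, sigma_alt):
    """Crude Bayes factor for model validation.

    H0: residuals ~ N(0, sigma_meas)  (model prediction is adequate)
    H1: residuals ~ N(0, sigma_alt)   (diffuse alternative, sigma_alt > sigma_meas)
    Returns p(data|H0) / p(data|H1); values > 1 favour the model.
    This is a simplified stand-in for the paper's Bayesian-network procedure.
    """
    def loglik(r, s):
        return -0.5 * np.sum((r / s) ** 2) - r.size * np.log(s * np.sqrt(2 * np.pi))
    return np.exp(loglik(residuals, sigma_meas) - loglik(residuals, sigma_alt))

rng = np.random.default_rng(0)
obs_minus_pred = rng.normal(0.0, 1.0, size=20)       # synthetic residuals
print(f"Bayes factor: {bayes_factor(obs_minus_pred, 1.0, 5.0):.3g}")
```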

  6. GCSS Idealized Cirrus Model Comparison Project

    Science.gov (United States)

    Starr, David OC.; Benedetti, Angela; Boehm, Matt; Brown, Philip R. A.; Gierens, Klaus; Girard, Eric; Giraud, Vincent; Jakob, Christian; Jensen, Eric; Khvorostyanov, Vitaly; hide

    2000-01-01

    related to the shape of the particle size distribution and the habits of the ice crystal population, whether assumed or explicitly calculated. In order to isolate the fall speed effect from that of the associated ice crystal population, simulations were also performed in which the ice water fall speed was set to the same constant value everywhere in each model; values of 20 and 60 cm/s were assumed. Current results of the project are described and implications drawn. In particular, this exercise is found to sharply focus the definition of the issues underlying the observed inter-model differences and to suggest possible strategies for observational validation of the models. The next step in this project is to perform similar comparisons for well-observed case studies with sufficient high-quality data to adequately define model initiation and forcing specifications and to support quantitative validation of the results.

  7. A community diagnostic tool for chemistry climate model validation

    Directory of Open Access Journals (Sweden)

    A. Gettelman

    2012-09-01

    Full Text Available This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The initial construction and application are for coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifying model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and model types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimal coding.

  8. Model validation of channel zapping quality

    OpenAIRE

    Kooij, R.; Nicolai, F.; Ahmed, K.; Brunnström, K.

    2009-01-01

    In an earlier paper we showed that the perceived quality of channel zapping is related to the perceived quality of the download time in web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call a 'lean forward' experiment, and it gave the rule-of-thumb result that the zapping time must be less than 0.43 sec to be good (> 3.5 on the MOS scale). To validate the model we have done new subjective ...

  9. The Digital Astronaut Project Bone Remodeling Model

    Science.gov (United States)

    Pennline, James A.; Mulugeta, Lealem; Lewandowski, Beth E.; Thompson, William K.; Sibonga, Jean D.

    2014-01-01

    Under the conditions of microgravity, astronauts lose bone mass at a rate of 1% to 2% a month, particularly in the lower extremities such as the proximal femur [1]. The most commonly used countermeasure against bone loss has been prescribed exercise [2]. However, current exercise countermeasures do not completely eliminate bone loss in long-duration (4 to 6 month) spaceflight [3,4], leaving the astronaut susceptible to early-onset osteoporosis and a greater risk of fracture later in life. The introduction of the Advanced Resistive Exercise Device, coupled with improved nutrition, has further minimized the 4 to 6 month bone loss, but further work is needed to implement optimal exercise prescriptions [5]. In this light, NASA's Digital Astronaut Project (DAP) is working with NASA physiologists to implement well-validated computational models that can help understand the mechanisms of bone demineralization in microgravity and enhance exercise countermeasure development.

  10. W-320 Project thermal modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sathyanarayana, K., Fluor Daniel Hanford

    1997-03-18

    This report summarizes the results of thermal analysis performed to provide a technical basis in support of Project W-320, which will retrieve by sluicing the sludge in Tank 241-C-106 and transfer it into Tank 241-AY-102. Prior thermal evaluations in support of Project W-320 safety analysis assumed the availability of 2000 to 3000 cfm, as provided by Tank Farm Operations, for the tank floor cooling channels from the secondary ventilation system. As this flow availability had no technical basis, a detailed Tank 241-AY-102 secondary ventilation and floor cooling channel flow model was developed and analyzed. The results of the analysis show that only about 150 cfm flows in the floor cooling channels. A Tank 241-AY-102 thermal evaluation was performed to determine the necessary cooling flow for the floor cooling channels using the W-030 primary ventilation system for different quantities of Tank 241-C-106 sludge transferred into Tank 241-AY-102. These sludge transfers meet different options for the project along with the minimum required modification of the ventilation system. The results of the analysis for the amount of sludge transfer using the current system are also presented. The effects of sludge fluffing factor, heat generation rate, and its distribution between supernatant and sludge in Tank 241-AY-102 on the amount of sludge transfer from Tank 241-C-106 were evaluated and the results are discussed. A transient thermal analysis was also performed to estimate the time to reach steady state. For a 2-foot sludge transfer, about 3 months will be required to reach steady state. Therefore, for the purpose of process control, a detailed transient thermal analysis using the GOTH computer code will be required to determine the transient response of the sludge in Tank 241-AY-102. Process control considerations are also discussed to eliminate the potential for a steam bump during retrieval and storage in Tanks 241-C-106 and 241-AY-102, respectively.

  11. Sharks, Minnows, and Wheelbarrows: Calculus Modeling Projects

    Science.gov (United States)

    Smith, Michael D.

    2011-01-01

    The purpose of this article is to present two very active applied modeling projects that were successfully implemented in a first semester calculus course at Hollins University. The first project uses a logistic equation to model the spread of a new disease such as swine flu. The second project is a human take on the popular article "Do Dogs Know…
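
    A sketch of the first project's core ingredient, a logistic growth equation integrated with simple Euler steps; the rate, capacity, and initial values are invented classroom numbers, not taken from the article:

```python
# Logistic spread dI/dt = r * I * (1 - I / K), integrated with Euler steps.
# r, K, I0, and dt are hypothetical classroom numbers, not from the article.
r, K = 0.4, 1000.0        # growth rate per day, carrying capacity (people)
I, dt = 5.0, 0.1          # initial infections, time step in days

for day in range(41):
    if day % 10 == 0:
        print(f"day {day:2d}: {I:7.1f} infected")
    for _ in range(int(1 / dt)):          # advance one day
        I += dt * r * I * (1 - I / K)
```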

  12. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the issue of applying security policy models, we are concerned with the implementation of a certain policy, taking into consideration the standards required, through the attribution of scores to every item in the research instrument. In today's globalized economic scenario, the implementation of an information security policy in an information technology environment is a condition sine qua non for the strategic management process of any organization. On this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer, who strives to enhance the updating of technologies in order to meet all-inclusive business continuity planning policies. For such a policy to be effective, it has to be entirely embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo. The sampling was based on the representativeness of these industries, particularly with regard to each one's implementation of information technology in the region. The study concludes by presenting evidence of the discriminant validity of four key dimensions of security policy: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. Analysis of the Cronbach's alpha structure of these security items attests not only that the capacity of those industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.

  13. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk.

  14. Validation of the filament winding process model

    Science.gov (United States)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diameter and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at ±45 deg on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  15. Assessment model validity document FARF31

    Energy Technology Data Exchange (ETDEWEB)

    Elert, Mark; Gylling, Bjoern; Lindgren, Maria [Kemakta Konsult AB, Stockholm (Sweden)]

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately enough to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport are made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-founded simplifications. For this reason, the far field code FARF31 is kept relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that the rock can be divided into two distinct domains with different types of porosity.
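
    For orientation, a far simpler relative of FARF31's transport equation is the classical 1-D advection-dispersion solution for a constant-concentration inlet (Ogata-Banks), which omits the matrix diffusion, sorption, and chain decay that FARF31 handles; the parameter values below are hypothetical:

```python
import math

def ogata_banks(x, t, v, D):
    """Relative concentration C/C0 for 1-D advection-dispersion with a
    constant-concentration inlet boundary (Ogata-Banks solution).
    x: distance (m), t: time (yr), v: velocity (m/yr), D: dispersion (m2/yr).
    """
    a = (x - v * t) / (2.0 * math.sqrt(D * t))
    b = (x + v * t) / (2.0 * math.sqrt(D * t))
    return 0.5 * (math.erfc(a) + math.exp(v * x / D) * math.erfc(b))

# Hypothetical stream-tube numbers: 100 m path, 10 m/yr velocity, 50 m2/yr D.
for years in (2, 5, 10, 20):
    print(f"t = {years:2d} yr: C/C0 = {ogata_banks(100.0, years, 10.0, 50.0):.3f}")
```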

  16. [Catalonia's primary healthcare accreditation model: a valid model].

    Science.gov (United States)

    Davins, Josep; Gens, Montserrat; Pareja, Clara; Guzmán, Ramón; Marquet, Roser; Vallès, Roser

    2014-07-01

    There are few experiences of accreditation models validated by primary care teams (EAP). The aim of this study was to detail the process of design, development, and subsequent validation of the consensus EAP accreditation model of Catalonia. An operating committee of the Health Department of Catalonia revised the models proposed by the European Foundation for Quality Management, the Joint Commission International, and the Institut Català de la Salut, and proposed 628 essential standards to the technical group (25 experts in primary care and quality of care) in order to establish consensus standards. The consensus document was piloted in 30 EAP for the purpose of validating the contents, testing the standards, and identifying evidence. Finally, a survey was carried out to assess acceptance and validation of the document. The technical group agreed on a total of 414 essential standards, of which the pilot selected a total of 379. Mean compliance with the standards of the final document in the 30 EAP was 70.4%; the results-related standards had the worst fulfilment percentage. The survey showed that 83% of the EAP found the document useful and 78% found the content of the accreditation manual suitable as a tool to assess the quality of the EAP and identify opportunities for improvement. On the downside, they highlighted its complexity and laboriousness. We have a model that fits the reality of the EAP and covers all relevant issues for the functioning of an excellent EAP. The model developed in Catalonia is a model that is easy to understand.

  17. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods have weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.
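
    A toy sketch of the positive-inference step: propagating the sign of a disturbance along signed edges yields the qualitative trends against which simulation outputs can be compared. The graph below is invented, not the paper's reactor model:

```python
# Toy positive inference on a signed directed graph: propagate the sign of a
# disturbance along signed edges to get expected qualitative trends (+1/-1).
edges = {                      # node -> [(successor, edge sign)]
    "feed_rate": [("level", +1), ("temperature", -1)],
    "level": [("outflow", +1)],
    "temperature": [("pressure", +1)],
}

def propagate(root, sign):
    trends, frontier = {root: sign}, [root]
    while frontier:
        node = frontier.pop()
        for nxt, esign in edges.get(node, []):
            if nxt not in trends:              # first derived sign wins (toy rule)
                trends[nxt] = trends[node] * esign
                frontier.append(nxt)
    return trends

# A step increase in feed rate should raise level/outflow and drop temperature.
print(propagate("feed_rate", +1))
```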

  18. The Copenhagen Traffic Model and its Application in the Metro City Ring Project

    DEFF Research Database (Denmark)

    Vuk, Goran; Overgård, Christian Hansen; Fox, J.

    2009-01-01

    In June 2007, the Danish Parliament passed an act to finance the construction of the Metro City Ring in Copenhagen. The assessment project is based on the passenger patronage forecasts for 2015 from the Copenhagen traffic model. In this paper we show how the model forecasts for this particular...... infrastructure project can be explained through detailed knowledge of model structure and model validation....

  19. On-orbit validation system for space structure composite actuators Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR project delivers an On-orbit Validation System (OVS) that provides performance and durability data for Macro Fiber Composite (MFC) active piezocomposite...

  20. Forecasting project schedule performance using probabilistic and deterministic models

    Directory of Open Access Journals (Sweden)

    S.A. Abdel Azeem

    2014-04-01

    Full Text Available Earned value management (EVM) was originally developed for cost management and has not been widely used for forecasting project duration. In addition, EVM-based formulas for cost or schedule forecasting are still deterministic and do not provide any information about the range of possible outcomes and the probability of meeting the project objectives. The objective of this paper is to develop three models to forecast the estimated duration at completion. Two of these models are deterministic: the earned value (EV) and earned schedule (ES) models. The third model is probabilistic and is developed based on the Kalman filter algorithm and earned schedule management. The accuracies of the EV, ES, and Kalman Filter Forecasting Model (KFFM) through the different project periods are assessed and compared with other forecasting methods such as the Critical Path Method (CPM), which makes the time forecast at activity level by revising the actual reporting data for each activity at a certain data date. A case study project is used to validate the results of the three models, and the best model is selected based on the lowest average percentage of error. The results showed that the KFFM developed in this study provides probabilistic prediction bounds of project duration at completion and can be applied through the different project periods with smaller errors than those observed in the EV and ES forecasting models.
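
    A minimal sketch of the earned-schedule arithmetic underlying the deterministic ES forecast; the planned-value curve and progress figures are invented, and the Kalman filter layer is omitted:

```python
import bisect

# Earned-schedule duration forecast. The PV curve and progress are invented.
pv = [0, 10, 25, 45, 70, 90, 100]    # cumulative planned value by month 0..6
pd_months = 6                         # planned duration

def earned_schedule(ev):
    """Time (months) at which the plan first reached the current earned value,
    with linear interpolation between monthly PV points."""
    i = bisect.bisect_left(pv, ev)
    if i == 0:
        return 0.0
    return (i - 1) + (ev - pv[i - 1]) / (pv[i] - pv[i - 1])

at, ev = 4.0, 55.0                    # actual time elapsed, earned value to date
es = earned_schedule(ev)              # months of schedule actually earned
spi_t = es / at                       # time-based schedule performance index
eac_t = pd_months / spi_t             # forecast duration at completion
print(f"ES = {es:.2f} mo, SPI(t) = {spi_t:.2f}, forecast duration = {eac_t:.1f} mo")
```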

  1. Proposal of New PRORISK Model for GSD Projects

    Directory of Open Access Journals (Sweden)

    M. Rizwan Jameel Qureshi

    2015-05-01

    Full Text Available The level of complexity and risk associated with software is increasing exponentially because of the competitive environment, especially in geographically distributed projects. Global software development (GSD) faces challenges such as distance, communication, and coordination; the coordination and communication challenges are the main causes of failure in GSD. Project Oriented Risk Management (PRORISK) is one of the models that address the importance of risk management and project management processes in standard software projects. However, the existing model is not designed to handle GSD-associated risks. This warrants the proposal of a new PRORISK model to manage the risks of GSD. A survey is used as the research design to validate the proposed solution. We anticipate that the proposed solution will help software companies to cater for the risks associated with GSD.

  2. Full-Scale Cookoff Model Validation Experiments

    Energy Technology Data Exchange (ETDEWEB)

    McClelland, M A; Rattanapote, M K; Heimdahl, E R; Erikson, W E; Curran, P O; Atwood, A I

    2003-11-25

    This paper presents the experimental results of the third and final phase of a cookoff model validation effort. In this phase of the work, two generic Heavy Wall Penetrators (HWP) were tested in two heating orientations. Temperature and strain gage data were collected over the entire test period. Predictions for time and temperature of reaction were made prior to release of the live data. Predictions were comparable to the measured values and were highly dependent on the established boundary conditions. Both HWP tests failed at a weld located near the aft closure of the device. More than 90 percent of the unreacted explosive was recovered in the end-heated experiment, and less than 30 percent was recovered in the side-heated test.

  3. Modeling Research Project Risks with Fuzzy Maps

    Science.gov (United States)

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…
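
    A minimal sketch of fuzzy risk inference with triangular memberships and two Mamdani-style rules; the membership ranges and rules are invented and far simpler than a knowledge base built from a cognitive risk map:

```python
# Minimal fuzzy risk evaluation: triangular memberships plus two rules.
# Membership ranges and rules are invented for illustration only.
def tri(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_level(probability, impact):
    """Mamdani-style min inference over two rules, resolved to a label."""
    p_high = tri(probability, 0.4, 1.0, 1.6)   # 'high' membership on [0, 1]
    i_high = tri(impact, 0.4, 1.0, 1.6)
    p_low, i_low = 1.0 - p_high, 1.0 - i_high
    high = min(p_high, i_high)                 # rule: high prob AND high impact
    low = min(p_low, i_low)                    # rule: low prob AND low impact
    return ("high" if high >= low else "low"), high, low

print(risk_level(0.8, 0.7))   # -> ('high', 0.5, 0.33...)
```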

  4. The Canvas model in project formulation

    OpenAIRE

    Ferreira-Herrera, Diana Carolina

    2016-01-01

    Purpose: The aim of this article is to determine the relevance of the Canvas methodology in project formulation through model characterization, thus answering the question: Is the Canvas methodology a relevant model for project management in an entrepreneurial context? Description: The Canvas model seeks to manage projects as business units. It is a model intended to emphasize the entrepreneurial potential in project management. For this, texts and articles that have provided the basis for...

  5. OC5 Project Phase Ib: Validation of Hydrodynamic Loading on a Fixed, Flexible Cylinder for Offshore Wind Applications

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; Popko, Wojciech; Borg, Michael; Bredmose, Henrik; Schlutter, Flemming; Qvist, Jacob; Bergua, Roger; Harries, Rob; Yde, Anders; Nygaard, Tor Anders; de Vaal, Jacobus Bernardus; Oggiano, Luca; Bozonnet, Pauline; Bouy, Ludovic; Sanchez, C. B.; Garcia, R. G.; Bachynski, E. E.; Tu, Y.; Bayati, I.; Borisade, F.; Shin, H.; van der Zee, T.; Guerinel, M.

    2016-09-01

    This paper summarizes the findings from Phase Ib of the Offshore Code Comparison, Collaboration, Continued with Correlation (OC5) project. OC5 is a project run under the International Energy Agency (IEA) Wind Research Task 30, and is focused on validating the tools used for modelling offshore wind systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin at the Danish Hydraulic Institute (DHI) with support from the Technical University of Denmark (DTU). The first phase of OC5 examined two simple cylinder structures (Phase Ia and Ib) to focus on validation of hydrodynamic models used in the various tools before moving on to more complex offshore wind systems and the associated coupled physics. Verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs.

  6. The Validity of Project Assessment in an Advanced Level Biology Examination.

    Science.gov (United States)

    Brown, C. R.; And Others

    1995-01-01

    Assessment of practical objectives by means of a project which occurred in an operational Advanced level examination in the United Kingdom is analyzed for construct validity. As in previous research, low correlations were found between the scores of candidates (n=218) on the project and on the other components of the examination. (18 references)…

  7. Validation and intercomparison of Persistent Scatterers Interferometry: PSIC4 project results

    NARCIS (Netherlands)

    Raucoules, D.; Bourgine, B.; Michele, M. de; Le Cozannet, G.; Closset, L.; Bremmer, C.; Veldkamp, H.; Tragheim, D.; Bateson, L.; Crosetto, M.; Agudo, M.; Engdahl, M.

    2009-01-01

    This article presents the main results of the Persistent Scatterer Interferometry Codes Cross Comparison and Certification for long term differential interferometry (PSIC4) project. The project was based on the validation of PSI (Persistent Scatterer Interferometry) data with respect to levelling

  8. Entry Systems Modeling (ESM) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Cutting edge customer driven research in two areas:Aerosciences, including the completion and delivery of two new aerothermal CFD codes, a first ever validated shock...

  9. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage
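
    A sketch of the permutation-test idea in this setting: rescore the model against randomly permuted outcomes and ask how often the permuted fit looks as good as the real one. The data and the correlation-based score below are synthetic stand-ins for an actual NTCP fit:

```python
import numpy as np

def permutation_pvalue(y, score, n_perm=5000, seed=0):
    """Permutation test for an NTCP-style model: how often does the model
    score as well against permuted outcomes as against the real ones?
    'Fit quality' here is simply |correlation|, a deliberate simplification.
    """
    rng = np.random.default_rng(seed)
    real = abs(np.corrcoef(y, score)[0, 1])
    hits = sum(
        abs(np.corrcoef(rng.permutation(y), score)[0, 1]) >= real
        for _ in range(n_perm)
    )
    return (hits + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
dose = rng.uniform(0, 70, 80)                        # synthetic dose metric
tox = (dose / 70 + rng.normal(0, 0.3, 80)) > 0.6     # synthetic complications
print(f"p = {permutation_pvalue(tox.astype(float), dose):.4f}")
```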

  10. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  11. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  12. A simplified model of software project dynamics

    OpenAIRE

    Ruiz Carreira, Mercedes; Ramos Román, Isabel; Toro Bonilla, Miguel

    2001-01-01

    The simulation of a dynamic model for software development projects (hereinafter SDPs) helps to investigate the impact of a technological change, of different management policies, and of maturity level of organisations over the whole project. In the beginning of the 1990s, with the appearance of the dynamic model for SDPs by Abdel-Hamid and Madnick [Software Project Dynamics: An Integrated Approach, Prentice-Hall, Englewood Cliffs, NJ, 1991], a significant advance took place in the field of p...

  13. Validation of Biomarker-based risk prediction models

    OpenAIRE

    Taylor, Jeremy M.G.; Ankerst, Donna P.; Andridge, Rebecca R.

    2008-01-01

    The increasing availability and use of predictive models to facilitate informed decision making highlights the need for careful assessment of the validity of these models. In particular, models involving biomarkers require careful validation for two reasons: issues with overfitting when complex models involve a large number of biomarkers, and inter-laboratory variation in assays used to measure biomarkers. In this paper we distinguish between internal and external statistical validation. Inte...

  14. Development of the Digital Astronaut Project for the analysis of the mechanisms of physiologic adaptation to microgravity: Validation of the cardiovascular system module

    Science.gov (United States)

    Summers, Richard; Coleman, Thomas; Meck, Janice

    The physiologic adaptation of humans to the microgravity environment is complex and requires an integrative perspective to fully understand the mechanisms involved. A large computer model of human systems physiology provides the framework for the development of the Digital Astronaut, to be used by NASA in the analysis of adaptive mechanisms. While project expansion is ongoing to include all relevant systems, we describe the validation results of the cardiovascular phase of model development. The cardiovascular aspects of the model were validated by benchmark comparisons to published literature findings of changes in left ventricular mass, right atrial pressure, and plasma volume. Computer simulations using the model predicted microgravity-induced changes in the target endpoints within the statistical validity of experimental findings. Therefore, the current cardiovascular portion of the Digital Astronaut Project computer model appears to accurately predict the observed microgravity-induced physiologic adaptations. The ongoing process of model development to include all spaceflight-relevant systems will require similar validations.

  15. Model validation of channel zapping quality

    Science.gov (United States)

    Kooij, Robert; Nicolai, Floris; Ahmed, Kamal; Brunnström, Kjell

    2009-02-01

    In an earlier paper we showed that the perceived quality of channel zapping is related to the perceived quality of the download time in web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call a 'lean forward' experiment, and it gave the rule-of-thumb result that the zapping time must be less than 0.43 sec to be good (> 3.5 on the MOS scale). To validate the model we have done new subjective experiments. These experiments included 'lean backward' zapping, i.e. sitting on a sofa with a remote control. The subjects are more forgiving in this case, and the requirement could be relaxed to 0.67 sec. We also conducted subjective experiments where the zapping times vary. We found that the MOS rating decreases if zapping delay times vary. In our experiments we assumed uniformly distributed delays, where the variance cannot be larger than the mean delay. We found that, in order to obtain a MOS rating of at least 3.5, the maximum allowed variance, and thus also the maximum allowed mean zapping delay, is 0.46 sec.
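
    A sketch of the kind of logarithmic delay-to-MOS mapping suggested by ITU-T Rec. G.1030; the upper anchor point is an assumption chosen so that the curve reproduces the paper's lean-forward threshold of 0.43 s at MOS 3.5:

```python
import math

# Logarithmic MOS model in the spirit of ITU-T Rec. G.1030: perceived quality
# decays with the log of waiting time. The 0.1 s / 4.5 anchor is an assumption;
# the 0.43 s / 3.5 anchor comes from the paper's lean-forward rule of thumb.
t1, mos1 = 0.10, 4.5
t2, mos2 = 0.43, 3.5

b = (mos1 - mos2) / (math.log(t2) - math.log(t1))
a = mos2 + b * math.log(t2)

def mos(t_seconds):
    """Predicted lean-forward MOS for a given zapping delay, clipped to 1-5."""
    return max(1.0, min(5.0, a - b * math.log(t_seconds)))

for t in (0.2, 0.43, 0.67, 1.5):
    print(f"zap {t:4.2f} s -> MOS {mos(t):.2f}")
```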

  16. OC5 Project Phase Ib: Validation of Hydrodynamic Loading on a Fixed, Flexible Cylinder for Offshore Wind Applications

    DEFF Research Database (Denmark)

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.;

    2016-01-01

    systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin...... and the associated coupled physics. Verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs. (C) 2016 The Authors. Published by Elsevier Ltd....

  17. INFORMATIONAL-ANALYTIC MODEL OF REGIONAL PROJECT PORTFOLIO FORMING

    Directory of Open Access Journals (Sweden)

    I. A. Osaulenko

    2016-01-01

    Full Text Available The article is devoted to the problem of regional project portfolio management in the context of the interaction of the driving forces of regional development. The features of innovation development at the regional level and their influence on the portfolio forming process are considered. Existing approaches to portfolio modelling and formal criteria for project selection are analyzed, and the organization of the interaction between the key subjects of regional development is described. The aim of the article is to investigate the informational aspects of project selection in the process of interaction of the main driving forces of development, and to validate an analytic model of portfolio filling. The stakeholders' inclination to reach a consensus is taken into account. The Triple Helix conception is used to define the functions of the driving forces of regional development. It is asserted that any component of the innovation triad «science–business–government» can initiate a regional project, but it needs the support of the other two components. Non-power interaction theory is proposed for investigating the subjects' interrelations in the process of joint activity. One of the key concepts of this theory is information distance, which characterizes the parties' inclination to reach a consensus based on statistics. Projections of the information distance onto the development axes are proposed for a more accurate definition of mutual positions along all lines of development. Another important model parameter influencing project support is the stakeholders' awareness of the project. A formalized description of the project as a set of parameters is proposed for determining this awareness; weighting coefficients for each parameter are assigned by expert judgment. Simultaneously, the precision with which each parameter is specified is determined for all presented projects. On the basis of the assigned values of information distances and

  18. World energy projection system: Model documentation

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    The World Energy Projection System (WEPS) is an accounting framework that incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption divided by gross domestic product) and about the rate of incremental energy requirements met by hydropower, geothermal, coal, and natural gas, to produce the projections of world energy consumption published annually by the Energy Information Administration (EIA) in the International Energy Outlook (IEO) (Figure 1). Two independently documented models presented in Figure 1, the Oil Market Simulation (OMS) model and the World Integrated Nuclear Evaluation System (WINES), provide projections of oil and nuclear power consumption published in the IEO. Output from a third independently documented model, the International Coal Trade Model (ICTM), is not published in the IEO but is used in WEPS as a supply check on projections of world coal consumption produced by WEPS and published in the IEO. A WEPS model of natural gas production documented in this report provides the same type of implicit supply check on the WEPS projections of world natural gas consumption published in the IEO. Two additional models are included in Figure 1: the OPEC Capacity model and the Non-OPEC Oil Production model. These WEPS models provide inputs to the OMS model and are documented in this report.

  19. A Hybrid Authorization Model For Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiaoguang(张晓光); Cao Jian; Zhang Shensheng

    2003-01-01

    In the context of workflow systems, the security-relevant aspect is related to the assignment of activities to (human or automated) agents. This paper intends to cast light on the management of project-oriented workflow. A comprehensive authorization model is proposed from the perspective of project management. In this model, the concepts of activity decomposition and teams are introduced, which improves the security of conventional role-based access control. Furthermore, policies are provided to define static and dynamic constraints such as Separation of Duty (SoD). Constraint validity checking is proposed to provide fine-grained assignment, which improves the performance of policy management. The model is applicable not only to project-oriented workflow applications but also to other teamwork environments such as the virtual enterprise.
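
    A toy sketch of an assignment check combining role permissions with a separation-of-duty constraint; the roles, activities, and constraint set are invented, and a real implementation of the model would also cover teams and activity decomposition:

```python
# Sketch of a project-oriented assignment check with a separation-of-duty
# (SoD) constraint. Roles, activities, and constraints are invented.
ROLE_OF = {"alice": "engineer", "bob": "engineer", "carol": "qa"}
CAN_DO = {"engineer": {"implement", "review"}, "qa": {"review", "release"}}
SOD = {("implement", "review")}        # the same agent may not do both

def may_assign(agent, activity, history):
    """history: set of (agent, activity) assignments already made."""
    if activity not in CAN_DO[ROLE_OF[agent]]:
        return False                              # role lacks the permission
    done = {act for who, act in history if who == agent}
    return all({activity, a} != set(pair)         # would this violate SoD?
               for a in done for pair in SOD)

hist = {("alice", "implement")}
print(may_assign("alice", "review", hist))   # False: SoD blocks self-review
print(may_assign("bob", "review", hist))     # True
```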

  20. Validation technique using mean and variance of kriging model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ho Sung; Jung, Jae Jun; Lee, Tae Hee [Hanyang Univ., Seoul (Korea, Republic of)

    2007-07-01

    Rigorously validating the accuracy of a metamodel is an important research area in metamodel techniques. A leave-k-out cross-validation technique not only requires considerable computational cost but also cannot quantitatively measure the fidelity of the metamodel. Recently, the average validation technique has been proposed. However, the average validation criterion may stop a sampling process prematurely even while the kriging model is still inaccurate. In this research, we propose a new validation technique using the mean and variance of the response during a sequential sampling method, such as maximum entropy sampling. The proposed validation technique is more efficient and accurate than cross-validation, because it explicitly integrates the kriging model to achieve an accurate mean and variance rather than relying on numerical integration. The proposed validation technique shows a trend similar to the root mean squared error, so it can be used as a stop criterion for sequential sampling.
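
    A generic sketch (not the paper's exact formulation) of how a kriging metamodel yields both a predictive mean and a predictive variance, the two quantities such a validation criterion tracks during sequential sampling:

```python
import numpy as np

def krige(x_train, y_train, x_new, theta=10.0, nugget=1e-10):
    """Simple-kriging-style predictive mean and variance with a Gaussian
    correlation model; a generic sketch, not the paper's formulation.
    """
    def corr(a, b):
        return np.exp(-theta * (a[:, None] - b[None, :]) ** 2)

    R = corr(x_train, x_train) + nugget * np.eye(x_train.size)
    r = corr(x_train, x_new)
    w = np.linalg.solve(R, r)                 # kriging weights
    mean = w.T @ y_train
    var = 1.0 - np.einsum("ij,ij->j", r, w)   # process variance normalised to 1
    return mean, np.maximum(var, 0.0)

x = np.array([0.0, 0.3, 0.6, 1.0])
y = np.sin(2 * np.pi * x)
xq = np.linspace(0, 1, 5)
m, v = krige(x, y, xq)
# Large predictive variance flags regions where the metamodel is still
# untrustworthy, which is what a mean/variance-based stop criterion monitors.
print(np.round(m, 3), np.round(v, 3))
```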

  1. Validity of covariance models for the analysis of geographical variation

    DEFF Research Database (Denmark)

    Guillot, Gilles; Schilling, Rene L.; Porcu, Emilio

    2014-01-01

    attention lately and show that the conditions under which they are valid mathematical models have been overlooked so far. 3. We provide rigorous results for the construction of valid covariance models in this family. 4. We also outline how to construct alternative covariance models for the analysis...

  2. Drilling forces model for lunar regolith exploration and experimental validation

    Science.gov (United States)

    Zhang, Tao; Ding, Xilun

    2017-02-01

    China's Chang'e lunar exploration project aims to sample and return lunar regolith from a minimum penetration depth of 2 m in 2017. Unlike such tasks on Earth, automated drilling and sampling missions on the Moon are more complicated. Therefore, a carefully designed drill tool is required to minimize operational cost and enhance reliability. Penetration force and rotational torque are two critical parameters in designing the drill tool. In this paper, a novel numerical model for predicting penetration force and rotational torque in the drilling of lunar regolith is proposed. The model is based on quasi-static Mohr-Coulomb soil mechanics and explicitly describes the interaction between the drill tool and the lunar regolith. Geometric features of the drill tool, mechanical properties of the lunar regolith, and drilling parameters are taken into consideration in the model. A drilling test bed was then developed, and experimental penetration forces and rotational torques were obtained in penetrating a lunar regolith simulant with different drilling parameters. Finally, theoretical and experimental results were compared to validate the proposed model. Experimental results indicated that the numerical model has good accuracy and is effective in predicting the penetration force and rotational torque in drilling the lunar regolith simulant.
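
    A back-of-envelope illustration of the Mohr-Coulomb ingredient only: shear strength from cohesion and friction, integrated over the bit face to give a torque scale. The paper's model is far more detailed, and every number below is invented:

```python
import math

# Rough cutting-torque scale on a drill bit from Mohr-Coulomb shear strength,
# integrated over the bit radius. Generic illustration; all values assumed.
cohesion = 800.0          # Pa, regolith simulant cohesion (assumed)
phi = math.radians(35.0)  # internal friction angle (assumed)
sigma_n = 5.0e3           # Pa, normal stress under the cutting face (assumed)
r_bit = 0.02              # m, bit radius (assumed)

tau = cohesion + sigma_n * math.tan(phi)   # Mohr-Coulomb: tau = c + sigma*tan(phi)

# Torque = integral over the circular face of tau * r dA = tau * 2*pi*r^2 dr,
# evaluated here by simple midpoint quadrature.
n, torque = 200, 0.0
for i in range(n):
    r = (i + 0.5) * r_bit / n
    torque += tau * 2 * math.pi * r * r * (r_bit / n)
print(f"shear strength {tau:.0f} Pa, face torque ~{torque * 1000:.1f} mN*m")
```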

  3. The sequential propensity household projection model

    Directory of Open Access Journals (Sweden)

    Tom Wilson

    2013-04-01

    Full Text Available BACKGROUND The standard method of projecting living arrangements and households in Australia and New Zealand is the 'propensity model', a type of extended headship rate model. Unfortunately it possesses a number of serious shortcomings, including internal inconsistencies, difficulties in setting living arrangement assumptions, and very limited scenario creation capabilities. Data allowing the application of more sophisticated dynamic household projection models are unavailable in Australia. OBJECTIVE The aim was to create a projection model that overcomes these shortcomings whilst minimising input data requirements and costs, and retaining the projection outputs users are familiar with. METHODS The sequential propensity household projection model is proposed. Living arrangement projections take place in a sequence of calculations, with progressively more detailed living arrangement categories calculated in each step. In doing so the model largely overcomes the three serious deficiencies of the standard propensity model noted above. RESULTS The model is illustrated by three scenarios produced for one case study State, Queensland: a baseline scenario in which all propensities are held constant, to demonstrate the effects of population growth and ageing; a housing crisis scenario where housing affordability declines; and a prosperity scenario where families and individuals enjoy greater real incomes. A sensitivity analysis in which assumptions are varied one by one is also presented. CONCLUSIONS The sequential propensity model offers a more effective method of producing household and living arrangement projections than the standard propensity model, and is a practical alternative to dynamic projection models for countries and regions where the data and resources to apply such models are unavailable.
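
    A minimal sketch of the sequential idea: propensities are applied in stages so that each living-arrangement split conditions on the previous one. The age groups, stages, and propensity values are invented:

```python
# Sequential propensity sketch: apply living-arrangement propensities in
# stages, each split conditioning on the previous one. Numbers are invented.
population = {"25-34": 100_000, "65-74": 60_000}      # projected persons

p_private = {"25-34": 0.97, "65-74": 0.94}            # stage 1: private dwelling
p_family = {"25-34": 0.70, "65-74": 0.60}             # stage 2: within private
p_couple = {"25-34": 0.80, "65-74": 0.90}             # stage 3: within family

results = {}
for age, n in population.items():
    private = n * p_private[age]                      # stage 1
    in_family = private * p_family[age]               # stage 2
    couples = in_family * p_couple[age]               # stage 3
    results[age] = {
        "non_private": n - private,
        "lone_or_group": private - in_family,
        "couple_family": couples,
        "other_family": in_family - couples,
    }
print(results["65-74"])
```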

  4. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the open issues in the field of verification/validation of model transformations.

  5. Development and validation of mathematical modelling for pressurised combustion

    Energy Technology Data Exchange (ETDEWEB)

    Richter, S.; Knaus, H.; Risio, B.; Schnell, U.; Hein, K.R.G. [University of Stuttgart, Stuttgart (Germany). Inst. fuer Verfahrenstechnik und Dampfkesselwesen

    1998-12-31

    The advanced 3D coal combustion code AIOLOS for quasi-stationary turbulent reacting flows is based on a conservative finite-volume procedure. Equations for the conservation of mass, momentum and scalar quantities are solved. In order to deal with pressurized combustion chambers, which are usually of cylindrical shape, a first task in the frame of the project consisted in the extension of the code towards cylindrical co-ordinates, since the basic version of AIOLOS was only suitable for Cartesian grids. Furthermore, the domain decomposition method was extended to the new co-ordinate system. Its advantage is the possibility of introducing refined sub-grids, providing a better resolution of regions where high gradients occur (e.g. high velocity and temperature gradients near the burners). The accuracy of the code was proven by means of a small-scale test case: the results obtained with AIOLOS were compared with the predictions of the commercial CFD code FLOW3D and validated against the velocity and temperature distributions measured at the test facility. The work during the second period focused mainly on the extension of the reaction model, as well as on the modelling of the optical properties of the flue gas. A modified submodel for char burnout was developed, considering the influence of pressure on diffusion mechanisms and on the chemical reaction at the char particle. The goal during the third project period was to improve the numerical description of turbulence effects and of radiative heat transfer, in order to obtain an adequate modelling of the complex processes in pressurized coal combustion furnaces. Therefore, a differential Reynolds stress turbulence model (RSM) and a Discrete Ordinates radiation model were implemented, respectively. 13 refs., 13 figs., 1 tab.

  6. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Whitmore, J.; Kaffine, L.; Blair, N.; Dobos, A. P.

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
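
    For concreteness, an annualized prediction error of the kind reported here can be computed directly from modeled and measured energy series; the monthly figures below are invented:

```python
# Annualized prediction error from monthly modeled vs. measured energy (MWh).
# The twelve monthly values below are invented for illustration.
modeled  = [11.2, 12.8, 15.9, 17.4, 19.1, 19.8, 20.3, 19.0, 16.2, 13.9, 11.0, 10.4]
measured = [10.9, 12.5, 16.3, 17.8, 18.6, 20.1, 20.0, 18.7, 16.8, 13.5, 11.2, 10.2]

annual_error = (sum(modeled) - sum(measured)) / sum(measured)
print(f"annualized prediction error: {annual_error:+.2%}")
```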

  7. SDG-based Model Validation in Chemical Process Simulation

    Institute of Scientific and Technical Information of China (English)

    张贝克; 许欣; 马昕; 吴重光

    2013-01-01

    Signed direct graph (SDG) theory provides algorithms and methods that can be applied directly to chemical process modeling and analysis to validate simulation models, and is a basis for the development of a software environment that can automate the validation activity. This paper concentrates on the pretreatment stage of model validation. We use the validation scenarios and standard sequences generated by a well-established SDG model to validate the trends fitted from the simulation model. The results are helpful for finding potential problems, assessing possible bugs in the simulation model, and solving them effectively. A case study on a simulation model of a boiler is presented to demonstrate the effectiveness of this method.

  8. A Model of Project and Organisational Dynamics

    Directory of Open Access Journals (Sweden)

    Jenny Leonard

    2012-04-01

    Full Text Available The strategic, transformational nature of many information systems projects is now widely understood. Large-scale implementations of systems are known to require significant management of organisational change in order to be successful. Moreover, projects are rarely executed in isolation – most organisations have a large programme of projects being implemented at any one time. However, project and value management methodologies provide ad hoc definitions of the relationship between a project and its environment. This limits the ability of an organisation to manage the larger dynamics between projects and organisations, over time, and between projects. The contribution of this paper, therefore, is to use literature on organisational theory to provide a more systematic understanding of this area. The organisational facilitators required to obtain value from a project are categorised, and the processes required to develop those facilitators are defined. This formalisation facilitates generalisation between projects and highlights any time and path dependencies required in developing organisational facilitators. The model therefore has the potential to contribute to the development of IS project management theory within dynamic organisational contexts. Six cases illustrate how this model could be used.

  9. [Development and validation of indicators for best patient safety practices: the ISEP-Brazil Project].

    Science.gov (United States)

    Gama, Zenewton André da Silva; Saturno-Hernández, Pedro Jesus; Ribeiro, Denise Nieuwenhoff Cardoso; Freitas, Marise Reis de; Medeiros, Paulo José de; Batista, Almária Mariz; Barreto, Analúcia Filgueira Gouveia; Lira, Benize Fernandes; Medeiros, Carlos Alexandre de Souza; Vasconcelos, Cilane Cristina Costa da Silva; Silva, Edna Marta Mendes da; Faria, Eduardo Dantas Baptista de; Dantas, Jane Francinete; Neto, José Gomes; Medeiros, Luana Cristina Lins de; Sicolo, Miguel Angel; Fonseca, Patrícia de Cássia Bezerra; Costa, Rosângela Maria Morais da; Monte, Francisca Sueli; Melo, Veríssimo de

    2016-09-19

    Efficacious patient safety monitoring should focus on the implementation of evidence-based practices that avoid unnecessary harm related to healthcare. The ISEP-Brazil project aimed to develop and validate indicators for best patient safety practices in Brazil. The basis was the translation and adaptation of the indicators validated in the ISEP-Spain project and of the document Safe Practices for Better Healthcare (U.S. National Quality Forum), which recommends 34 best practices. A 25-member expert panel validated the indicators. Reliability and feasibility were assessed in a pilot study in three hospitals with different management formats (state, federal, and private). Seventy-five best-practice indicators were approved (39 structure; 36 process), covering 31 of the 34 recommendations. The indicators were considered valid, reliable, and useful for monitoring patient safety in Brazilian hospitals.

  10. K3 projective models in scrolls

    CERN Document Server

    Johnsen, Trygve

    2004-01-01

    The exposition studies projective models of K3 surfaces whose hyperplane sections are non-Clifford general curves. These models are contained in rational normal scrolls. The exposition supplements standard descriptions of models of general K3 surfaces in projective spaces of low dimension, and leads to a classification of K3 surfaces in projective spaces of dimension at most 10. The authors carry further the ideas of Saint-Donat's classical article from 1974, lifting results from canonical curves to K3 surfaces and incorporating much of the Brill-Noether theory of curves and the theory of syzygies developed in the meantime.

  11. ADOPT: A Historically Validated Light Duty Vehicle Consumer Choice Model

    Energy Technology Data Exchange (ETDEWEB)

    Brooker, A.; Gonder, J.; Lopp, S.; Ward, J.

    2015-05-04

    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy’s Vehicle Technologies Office. It estimates the impacts of technology improvements on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit and mixed logit methods to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range, and usable volume. The average importance of several attributes changes nonlinearly across the attribute's range and varies with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market, and the Corporate Average Fuel Economy regulations are enforced. The sales estimates feed into the ADOPT stock model, which captures the key factors needed to sum petroleum use and greenhouse gas emissions: the change in vehicle miles traveled with vehicle age, the creation of new model options based on the success of existing vehicles, limits on the rate at which new vehicle options are introduced, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data; it matches key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use, managing the inputs, simulation, and results.
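
    For readers unfamiliar with the multinomial logit mechanics mentioned above, a minimal sketch follows: utility is a weighted sum of vehicle attributes, and predicted market shares follow a softmax. All attribute values and weights here are invented, not ADOPT's calibrated parameters.

    ```python
    import numpy as np

    def logit_shares(attributes: np.ndarray, weights: np.ndarray) -> np.ndarray:
        """attributes: (n_vehicles, n_attrs); weights: (n_attrs,)."""
        utility = attributes @ weights
        expu = np.exp(utility - utility.max())   # numerically stable softmax
        return expu / expu.sum()

    # columns: -price ($10k), -fuel cost ($k/yr), acceleration score, range score
    attrs = np.array([[-2.5, -1.2, 0.7, 0.9],
                      [-3.0, -0.6, 0.9, 0.6],
                      [-2.0, -1.5, 0.5, 1.0]])
    weights = np.array([1.0, 0.8, 0.5, 0.3])
    print(logit_shares(attrs, weights))          # predicted market shares
    ```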

  12. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
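
    A hedged sketch of this validation recipe, substituting scikit-learn for the authors' implementation: nested (double) cross-validation of an L1-penalized (LASSO-type) logistic NTCP model, followed by a permutation test of the AUC. The data are synthetic stand-ins for dose-volume features and xerostomia outcomes.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegressionCV
    from sklearn.model_selection import KFold, cross_val_score, permutation_test_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(150, 20))                  # synthetic dose/volume features
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=150) > 0).astype(int)

    # inner CV chooses the L1 penalty strength; outer CV assesses performance
    model = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10, cv=5)
    outer = KFold(n_splits=5, shuffle=True, random_state=1)
    aucs = cross_val_score(model, X, y, cv=outer, scoring="roc_auc")
    print("nested-CV AUC: %.2f +/- %.2f" % (aucs.mean(), aucs.std()))

    # permutation test: is the observed AUC better than chance?
    score, perm_scores, pvalue = permutation_test_score(
        model, X, y, cv=outer, scoring="roc_auc",
        n_permutations=100, random_state=2)
    print("permutation p-value: %.3f" % pvalue)
    ```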

  13. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van 't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.

  14. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    Many statistical models in ecology follow the state space paradigm. For such models, the important step of model validation rarely receives as much attention as estimation or hypothesis testing, perhaps due to lack of available algorithms and software. Model validation is often based on a naive...... for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and whether observations...... are continuous or discrete. With both simulated data, and a real data set related to geolocation of seals, we demonstrate both the potential and the limitations of the techniques. Our results fill a need for convenient methods for validating a state space model, or alternatively, rejecting it while indicating...

  15. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    Science.gov (United States)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a significant pathway for human exposure to potentially carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies that aim to better understand the process and guide field investigation. While multiple analytical and numerical models have been developed to simulate the vapor intrusion process, detailed validation of these models against well-controlled experiments is still lacking, owing to the complexity and uncertainties associated with site characterization and with measurements of soil gas flux and indoor air concentration. In this work, we present an effort to validate a three-dimensional vapor intrusion model against a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration from the most uncertain input parameters.
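
    An illustrative Monte Carlo step like the one described above: sample the most uncertain inputs and build a distribution of indoor air concentration. The lognormal parameters and the simple attenuation-factor model are assumptions standing in for the full 3D model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000
    c_source = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=n)  # ug/m3 at source
    alpha = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=n)      # attenuation factor
    c_indoor = alpha * c_source                                      # indoor air conc.

    print("median indoor concentration: %.3g ug/m3" % np.median(c_indoor))
    print("95th percentile:             %.3g ug/m3" % np.percentile(c_indoor, 95))
    ```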

  16. Prospects and problems for standardizing model validation in systems biology.

    Science.gov (United States)

    Gross, Fridolin; MacLeod, Miles

    2017-10-01

    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary collaboration, and model exchange, and would be especially relevant for applications close to medical practice. However, even though the production of predictively valid models is considered a central goal, in practice modeling in systems biology employs a variety of model structures and model-building practices. These serve a variety of purposes, many of which are heuristic and do not seem to require strict validation criteria and may even be restricted by them. Moreover, given the current situation in systems biology, implementing a validation standard would face serious technical obstacles, mostly due to the quality of available empirical data. We advocate a cautious approach to standardization. Even though rigorous standardization seems premature at this point, raising the issue helps us develop better insights into the practices of systems biology and the technical problems modelers face in validating models. Further, it allows us to identify certain technical validation issues which hold regardless of modeling context and purpose. Informal guidelines could in fact play a role in the field by helping modelers handle these. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Teaching mathematical modelling through project work

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Kjeldsen, Tinne Hoff

    2006-01-01

    are reported in manners suitable for internet publication for colleagues. The reports and the related discussions reveal interesting dilemmas concerning the teaching of mathematical modelling and how to cope with these through “setting the scene” for the students modelling projects and through dialogues...... in their own classes, evaluate and report a project based problem oriented course in mathematical modelling. The in-service course runs over one semester and includes three seminars of 3, 1 and 2 days. Experiences show that the course objectives in general are fulfilled and that the course projects......The paper presents and analyses experiences from developing and running an in-service course in project work and mathematical modelling for mathematics teachers in the Danish gymnasium, e.g. upper secondary level, grade 10-12. The course objective is to support the teachers to develop, try out...

  18. Causal Models for Safety Assurance Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fulfillment of NASA's System-Wide Safety and Assurance Technology (SSAT) project at NASA requires leveraging vast amounts of data into actionable knowledge. Models...

  19. Teaching mathematical modelling through project work

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Kjeldsen, Tinne Hoff

    2006-01-01

    The paper presents and analyses experiences from developing and running an in-service course in project work and mathematical modelling for mathematics teachers in the Danish gymnasium, e.g. upper secondary level, grade 10-12. The course objective is to support the teachers to develop, try out...... in their own classes, evaluate and report a project based problem oriented course in mathematical modelling. The in-service course runs over one semester and includes three seminars of 3, 1 and 2 days. Experiences show that the course objectives in general are fulfilled and that the course projects...... are reported in manners suitable for internet publication for colleagues. The reports and the related discussions reveal interesting dilemmas concerning the teaching of mathematical modelling and how to cope with these through “setting the scene” for the students modelling projects and through dialogues...

  20. Developing Project Duration Models in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Pierre Bourque; Serge Oligny; Alain Abran; Bertrand Fournier

    2007-01-01

    Based on the empirical analysis of data contained in the International Software Benchmarking Standards Group (ISBSG) repository, this paper presents software engineering project duration models based on project effort. Duration models are built for the entire dataset and for subsets of projects developed for personal computer, mid-range, and mainframe platforms. Duration models are also constructed for projects requiring fewer than 400 person-hours of effort and for projects requiring more than 400 person-hours. The usefulness of adding the maximum number of assigned resources as a second independent variable to explain duration is also analyzed, and the opportunity to build duration models directly from project functional size in function points is investigated as well.
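
    A sketch of a duration-versus-effort model of the kind described: a log-log linear regression, optionally with maximum team size as a second regressor. The sample data below are invented; ISBSG data are licensed and not reproduced here.

    ```python
    import numpy as np

    effort = np.array([120, 400, 950, 2400, 7100])    # person-hours (invented)
    team = np.array([2, 3, 4, 8, 15])                  # max assigned resources
    duration = np.array([1.5, 3.0, 5.2, 8.9, 14.0])    # elapsed months

    # ln(duration) = b0 + b1*ln(effort) + b2*ln(team), fitted by least squares
    X = np.column_stack([np.ones(len(effort)), np.log(effort), np.log(team)])
    beta, *_ = np.linalg.lstsq(X, np.log(duration), rcond=None)
    print("ln(duration) = %.2f + %.2f ln(effort) + %.2f ln(team)" % tuple(beta))
    ```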

  1. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow...

  2. Prospects and problems for standardizing model validation in systems biology

    NARCIS (Netherlands)

    Gross, Fridolin; MacLeod, Miles Alexander James

    2017-01-01

    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary collaboration, model exchange, and be especially relevant for applications close to medical practice.

  3. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes the validation of OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight into the scalability and performance of future deployed networks. Because validated models of key Cisco equipment we

  4. Foundational Issues in Statistical Modeling: Statistical Model Specification and Validation

    Directory of Open Access Journals (Sweden)

    Aris Spanos

    2011-01-01

    Statistical model specification and validation raise crucial foundational problems whose pertinent resolution holds the key to learning from data by securing the reliability of frequentist inference. The paper questions the judiciousness of several current practices, including the theory-driven approach and Akaike-type model selection procedures, arguing that they often lead to unreliable inferences. This is primarily because goodness-of-fit/prediction measures and other substantive and pragmatic criteria are of questionable value when the estimated model is statistically misspecified. Foisting one's favorite model on the data often yields estimated models which are both statistically and substantively misspecified, with no way to delineate between the two sources of error and apportion blame. The paper argues that the error-statistical approach can address this Duhemian ambiguity by distinguishing between statistical and substantive premises and viewing empirical modeling in a piecemeal way, so as to delineate the various issues more effectively. It is also argued that Hendry's general-to-specific procedure does a much better job in model selection than the theory-driven and Akaike-type procedures, primarily because of its error-statistical underpinnings.

  5. Technical Report Series on Global Modeling and Data Assimilation. Volume 42; Soil Moisture Active Passive (SMAP) Project Calibration and Validation for the L4_C Beta-Release Data Product

    Science.gov (United States)

    Koster, Randal D. (Editor); Kimball, John S.; Jones, Lucas A.; Glassy, Joseph; Stavros, E. Natasha; Madani, Nima (Editor); Reichle, Rolf H.; Jackson, Thomas; Colliander, Andreas

    2015-01-01

    During the post-launch Cal/Val Phase of SMAP there are two objectives for each science product team: 1) calibrate, verify, and improve the performance of the science algorithms, and 2) validate the accuracies of the science data products as specified in the L1 science requirements according to the Cal/Val timeline. This report provides analysis and assessment of the SMAP Level 4 Carbon (L4_C) product specifically for the beta release. The beta-release version of the SMAP L4_C algorithms utilizes a terrestrial carbon flux model informed by SMAP soil moisture inputs along with optical remote sensing (e.g., MODIS) vegetation indices and other ancillary biophysical data to estimate global daily net ecosystem exchange (NEE) and component carbon fluxes, particularly vegetation gross primary production (GPP) and ecosystem respiration (Reco). Other L4_C product elements include surface (<10 cm depth) soil organic carbon (SOC) stocks and associated environmental constraints to these processes, including soil moisture and landscape freeze/thaw (FT) controls on GPP and Reco (Kimball et al. 2012). The L4_C product encapsulates SMAP carbon cycle science objectives by: 1) providing a direct link between terrestrial carbon fluxes and underlying freeze/thaw and soil moisture constraints to these processes, 2) documenting primary connections between terrestrial water, energy, and carbon cycles, and 3) improving understanding of terrestrial carbon sink activity in northern ecosystems.

  6. Custom map projections for regional groundwater models

    Science.gov (United States)

    Kuniansky, Eve L.

    2017-01-01

    For regional groundwater flow models (areas greater than 100,000 km2), improper choice of map projection parameters can introduce model error in boundary conditions that depend on area (recharge or evapotranspiration simulated by applying a rate to the cell area from the model discretization) and on length (rivers simulated with a head-dependent flux boundary). Smaller model areas can use local map coordinates, such as State Plane (United States) or Universal Transverse Mercator (correct zone), without introducing large errors. Map projections are designed to preserve one or more of the following properties: area, shape, distance (length), or direction. Numerous map projections have been developed for different purposes, because all four properties cannot be preserved simultaneously. Preservation of area and length is most critical for groundwater models. The Albers equal-area conic projection with custom standard parallels, selected by dividing the north-south length by 6 and placing the standard parallels one-sixth of that length above the southern extent and one-sixth below the northern extent, preserves both area and length for continental areas in mid latitudes oriented east-west. Custom map projection parameters can also minimize area and length error in non-ideal projections. One must also use consistent vertical and horizontal datums for all geographic data. The generalized polygon for the Floridan aquifer system study area (306,247.59 km2) is used to provide quantitative examples of the effect of map projections on length and area under different projections and parameter choices. Use of an improper map projection is one model construction problem easily avoided.
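
    A sketch of the parallel-selection rule described above, using pyproj to build a custom Albers equal-area CRS. The study-area extent and central meridian are stand-ins, not the actual Floridan aquifer system polygon.

    ```python
    from pyproj import CRS, Transformer

    lat_s, lat_n = 27.0, 33.0                  # hypothetical N-S extent (degrees)
    sixth = (lat_n - lat_s) / 6.0
    lat_1 = lat_s + sixth                      # standard parallel 1/6 above south
    lat_2 = lat_n - sixth                      # standard parallel 1/6 below north

    albers = CRS.from_proj4(
        f"+proj=aea +lat_1={lat_1} +lat_2={lat_2} "
        f"+lat_0={(lat_s + lat_n) / 2} +lon_0=-84 +datum=NAD83 +units=m"
    )
    to_albers = Transformer.from_crs("EPSG:4269", albers, always_xy=True)
    x, y = to_albers.transform(-84.0, 30.0)    # lon, lat -> projected metres
    print(f"lat_1={lat_1:.2f}, lat_2={lat_2:.2f}; sample point: ({x:.0f}, {y:.0f})")
    ```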

  7. Toward Validation of the Diagnostic-Prescriptive Model

    Science.gov (United States)

    Ysseldyke, James E.; Sabatino, David A.

    1973-01-01

    Criticized are recent research efforts to validate the diagnostic-prescriptive model of remediating learning disabilities, and proposed is a six-step psychoeducational model designed to ascertain links between behavioral differences and instructional outcomes. (DB)

  8. Alaska North Slope Tundra Travel Model and Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Harry R. Bader; Jacynthe Guimond

    2006-03-01

    lack of variability in snow depth cover throughout the period of field experimentation. The amount of change in disturbance indicators was greater in the tundra communities of the Foothills than in those of the Coastal Plain; however, the overall level of change in both community types was less than expected. In Coastal Plain communities, ground hardness and snow slab thickness played an important role in the change in active layer depth and soil moisture resulting from treatment. In the Foothills communities, snow cover had the most influence on active layer depth and soil moisture resulting from treatment. Once certain minimum thresholds for ground hardness, snow slab thickness, and snow depth were attained, it appeared that little or no additive effect was realized in terms of increased resistance to disturbance in the tundra communities studied. DNR used the results of this modeling project to set a standard for the maximum permissible disturbance from cross-country tundra travel, with the threshold set below the widely accepted standard of Low Disturbance levels (as determined by the U.S. Fish and Wildlife Service). DNR followed the modeling project with a validation study, which seemed to support the field trial conclusions and indicated that the standard set for maximum permissible disturbance exhibits a conservative bias in favor of environmental protection. Finally, DNR established a quick and efficient tool for visual estimation of disturbance, to determine when investment in field measurements is warranted. This Visual Assessment System (VAS) seemed to support the plot disturbance measurements taken during the modeling and validation phases of this project.

  9. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to grant credits to students who came with experience from working life....

  10. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    Science.gov (United States)

    2015-03-01

    Historical Methods: The three historical methods of validation are rationalism, empiricism, and positive economics. Rationalism requires that... Empiricism requires every assumption and outcome to be empirically validated. Positive economics requires only that the model's outcome(s) be correct... historical methods of rationalism, empiricism, and positive economics into a multistage process of validation. This validation method consists of (1

  11. Tsunami-HySEA model validation for tsunami current predictions

    Science.gov (United States)

    Macías, Jorge; Castro, Manuel J.; González-Vida, José Manuel; Ortega, Sergio

    2016-04-01

    The ability of a model to compute and predict tsunami flow velocities is important for risk assessment and hazard mitigation. Substantial damage can be produced by high-velocity flows, particularly in harbors and bays, even when the wave height is small. Moreover, an accurate simulation of tsunami flow velocities and accelerations is fundamental for advancing the study of tsunami sediment transport. These considerations led the National Tsunami Hazard Mitigation Program (NTHMP) to propose a benchmark exercise focused on modeling and simulating tsunami currents. Until recently, few direct measurements of tsunami velocities were available for comparing and validating model results; after the Tohoku 2011 event, many current-meter measurements were made, mainly in harbors and channels. In this work we present part of the contribution made by the EDANYA group of the University of Malaga to the NTHMP workshop organized at Portland (USA), 9-10 February 2015. We selected three of the five proposed benchmark problems. Two of them consist of real observed data from the Tohoku 2011 event, one at Hilo Harbor (Hawaii) and the other at Tauranga Bay (New Zealand). The third consists of laboratory experimental data for the inundation of Seaside City in Oregon. Acknowledgements: This research has been partially supported by the Junta de Andalucía research project TESELA (P11-RNM7069), the Spanish Government research project DAIFLUID (MTM2012-38383-C02-01) and Universidad de Málaga, Campus de Excelencia Andalucía TECH. The GPU and multi-GPU computations were performed at the Unit of Numerical Methods (UNM) of the Research Support Central Services (SCAI) of the University of Malaga.

  12. EXPENSES FORECASTING MODEL IN UNIVERSITY PROJECTS PLANNING

    Directory of Open Access Journals (Sweden)

    Sergei A. Arustamov

    2016-11-01

    The paper presents a mathematical model of cash flows in project funding. We describe the different types of expenses linked to university project activities. The project budgeting problems that contribute the most uncertainty are identified. As an example of the model's implementation, we consider the calculation of vacation allowance expenses for project participants. We define three approaches to forecasting funds reservation: calculation based on the methodology established by the Ministry of Education and Science, calculation according to the vacation schedule, and prediction of the most probable amount. A stochastic model for vacation allowance expenses has been developed. Methods and solutions that increase the accuracy of funds-reservation forecasting are proposed and tested on 2015 data.
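
    A hedged sketch of a stochastic expense forecast in the spirit of the abstract: simulate uncertain salaries and vacation days to obtain a distribution of allowance expenses, then reserve at an upper quantile. All numbers, including the average-daily-wage divisor, are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_sims, n_staff = 5_000, 12
    monthly_salary = rng.normal(1_000.0, 150.0, size=(n_sims, n_staff)).clip(min=0)
    vacation_days = rng.integers(14, 29, size=(n_sims, n_staff))  # days per person
    allowance = (monthly_salary / 29.3) * vacation_days           # daily-wage rule

    total = allowance.sum(axis=1)                                 # per-simulation total
    print("expected reservation: %.0f" % total.mean())
    print("95%% quantile (safe reserve): %.0f" % np.quantile(total, 0.95))
    ```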

  13. The Validation of Computer-based Models in Engineering: Some Lessons from Computing Science

    Directory of Open Access Journals (Sweden)

    D. J. Murray-Smith

    2001-01-01

    Questions of the quality of computer-based models and the formal processes of model testing, involving internal verification and external validation, are usually given only passing attention in engineering reports and technical publications. However, such models frequently provide a basis for analysis methods, design calculations, or real-time decision-making in complex engineering systems. This paper reviews techniques used for the external validation of computer-based models and contrasts the somewhat casual approach usually adopted in this field with the more formal approaches to software testing and documentation recommended for large software projects. Both activities require intimate knowledge of the intended application, a systematic approach, and considerable expertise and ingenuity in the design of tests. It is concluded that engineering degree courses dealing with modelling techniques and computer simulation should put more emphasis on model limitations, testing, and validation.

  14. A conceptual model of psychological contracts in construction projects

    Directory of Open Access Journals (Sweden)

    Yongjian Ke

    2016-09-01

    The strategic importance of relationship-style contracting is recognised in the construction industry. Both public and private sector clients are stipulating more integrated and collaborative forms of procurement. Despite relational and integrated contractual arrangements having been available for some time, it is clear that construction firms have been slow to adopt them. Hence it is timely to examine how social exchanges, via unwritten agreements and behaviours, are being nurtured in construction projects. This paper adopts the concept of Psychological Contracts (PC) to describe unwritten agreements and behaviours. A conceptual model of the PC is developed and validated using the results of a questionnaire survey administered to construction professionals in Australia. The results uncover the relationships that exist amongst relational conditions and relational benefits, the PC, and the partners' satisfaction. All the hypotheses in the conceptual model of the PC are supported, suggesting that the PC model is important and may affect project performance and relationship quality among contracting parties. A validated model of the PC in construction was then developed based on the correlations among the components. The managerial implications are that past relationships and relationship characteristics should be taken into account in the selection of procurement partners, and that the promise of future resources, support, and tangible relational outcomes is also vital. It is important for contracting parties to pay attention to unwritten agreements (the PC) and behaviours when managing construction projects.

  15. Marshal: Maintaining Evolving Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — SIFT proposes to design and develop the Marshal system, a mixed-initiative tool for maintaining task models over the course of evolving missions. Marshal-enabled...

  16. Advanced Spacecraft Thermal Modeling Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft developers who spend millions to billions of dollars per unit and require 3 to 7 years to deploy, the LoadPath reduced-order (RO) modeling thermal...

  17. Validation Analysis of the Shoal Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    A. Hassan; J. Chapman

    2008-11-01

    Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven-step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of groundwater flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of

  18. The Primi Project: August-September 2009 Validation Cruise On Oil Spill Detection And Fate

    Science.gov (United States)

    Santoleri, R.; Bignami, F.; Bohm, E.; Nichio, F.; De Dominicis, M.; Ruggieri, G.; Marulllo, S.; Trivero, P.; Zambianchi, E.; Archetti, R.; Adamo, M.; Biamino, W.; Borasi, M.; Buongiorno Nardelli, B.; Cavagnero, M.; Colao, F.; Colella, S.; Coppini, G.; Debettio, V.; De Carolis, G.; Forneris, V.; Griffa, A.; Iacono, R.; Lombardi, E.; Manzella, G.; Mercantini, A.; Napolitano, E.; Pinardi, N.; Pandiscia, G.; Pisano, A.; Rupolo, V.; Reseghetti, F.; Sabia, L.; Sorgente, R.; Sprovieri, M.; Terranova, G.; Trani, M.; Volpe, G.

    2010-04-01

    In the framework of the ASI PRIMI Project, CNR-ISAC, in collaboration with the PRIMI partners, organized a validation cruise for the PRIMI oil spill monitoring and forecasting system on board the CNR R/V Urania. The cruise (Aug. 6 to Sept. 7, 2009) took place in the Sicily Strait, an area affected by heavy oil tanker traffic. The cruise plan was organized so as to have the ship within the selected SAR image frames at acquisition time, allowing the ship to move toward detected oil slicks and verify them by visual and instrumental inspection. During the cruise, several oil spills, presumably the result of illegal tank washing, were detected by the PRIMI system and verified in situ. Preliminary results indicate that SAR and optical satellites can detect both heavy and thin-film oil spills, that oil spill forecasting models have reached maturity, and that further work combining satellite, model, and in situ data is necessary to assess spill severity from its signature in satellite imagery.

  19. Future meteorological drought: projections of regional climate models for Europe

    Science.gov (United States)

    Stagge, James; Tallaksen, Lena; Rizzi, Jonathan

    2015-04-01

    In response to the major European drought events of the last decade, projecting future drought frequency and severity in a non-stationary climate is a major concern for Europe. Prior drought studies have identified regional hotspots in the Mediterranean and Eastern European regions, but have otherwise produced conflicting results with regard to future drought severity. Some of this disagreement is likely related to the relatively coarse resolution of Global Climate Models (GCMs) and to regional averaging, which tends to smooth extremes. This study makes use of the most current Regional Climate Models (RCMs) forced with CMIP5 climate projections to quantify the projected change in meteorological drought for Europe during the next century at a fine, gridded scale. Meteorological drought is quantified using the Standardized Precipitation Index (SPI) and the Standardized Precipitation-Evapotranspiration Index (SPEI), which normalize the accumulated precipitation and the climatic water balance anomaly, respectively, for a specific location and time of year. By comparing projections for these two indices, the importance of precipitation deficits can be contrasted with the importance of evapotranspiration increases related to temperature changes. Climate projections are based on output from CORDEX (the Coordinated Regional Climate Downscaling Experiment), which provides high-resolution regionally downscaled climate scenarios that have been extensively tested for numerous regions around the globe, including Europe. SPI and SPEI are then calculated on a gridded scale at a spatial resolution of either 0.44 degrees (~50 km) or 0.11 degrees (~12.5 km) for the three projected emission pathways (rcp26, rcp45, rcp85). Analysis is divided into two major sections: first validating the models with respect to observed historical trends in meteorological drought from 1970-2005, and then comparing drought severity and frequency during three future time periods (2011-2040, 2041-2070, 2071-2100) to the
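
    A minimal sketch of the SPI calculation described above: fit a gamma distribution to accumulated precipitation for one grid cell and calendar month, then map the cumulative probability to a standard normal deviate. The precipitation series is synthetic; zero-rainfall handling and the SPEI water-balance variant are omitted for brevity.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    precip_3mo = rng.gamma(shape=2.0, scale=30.0, size=40)   # 40 years, one month

    shape, loc, scale = stats.gamma.fit(precip_3mo, floc=0)  # location fixed at 0
    cdf = stats.gamma.cdf(precip_3mo, shape, loc=loc, scale=scale)
    spi = stats.norm.ppf(cdf)                                # standardized index

    print("driest year SPI: %.2f" % spi.min())               # <= -2: extreme drought
    ```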

  20. Model-Based Verification and Validation of the SMAP Uplink Processes

    Science.gov (United States)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow, by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  1. Validation of Numerical Shallow Water Models for Tidal Lagoons

    Energy Technology Data Exchange (ETDEWEB)

    Eliason, D.; Bourgeois, A.

    1999-11-01

    An analytical solution is presented for the case of a stratified, tidally forced lagoon. This solution, especially its energetics, is useful for the validation of numerical shallow water models under stratified, tidally forced conditions. The utility of the analytical solution for validation is demonstrated for a simple finite difference numerical model. A comparison is presented of the energetics of the numerical and analytical solutions in terms of the convergence of model results to the analytical solution with increasing spatial and temporal resolution.
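
    The convergence comparison described here follows a standard pattern; a generic sketch (with placeholder error values, not results from the paper) estimates the order of accuracy from the log-log slope of error versus grid spacing:

    ```python
    import numpy as np

    dx = np.array([1.0, 0.5, 0.25, 0.125])              # grid spacings
    err = np.array([4.1e-2, 1.1e-2, 2.8e-3, 7.1e-4])    # e.g. RMS energy error

    # slope of log(err) vs log(dx) approximates the order of accuracy
    order, intercept = np.polyfit(np.log(dx), np.log(err), 1)
    print("estimated order of accuracy: %.2f" % order)  # ~2 for a 2nd-order scheme
    ```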

  2. Ford Plug-In Project: Bringing PHEVs to Market Demonstration and Validation Project

    Energy Technology Data Exchange (ETDEWEB)

    D'Annunzio, Julie [Ford Motor Company, Dearborn, MI (United States); Slezak, Lee [U.S. DOE Office of Energy Efficiency & Renewable Energy, Washington, DC (United States); Conley, John Jason [National Energy Technology Lab. (NETL), Albany, OR (United States)

    2014-03-26

    This project supports our national goal of reducing dependence on fossil fuels. By supporting efforts that contribute toward the successful mass production of plug-in hybrid electric vehicles, our nation’s transportation-related fuel consumption can be offset with energy from the grid. Over four and a half years ago, when this project was originally initiated, plug-in electric vehicles were not readily available in the mass marketplace. Through the creation of a 21-unit plug-in hybrid vehicle fleet, this program was designed to demonstrate the feasibility of the technology and to help build cross-industry familiarity with the technology and its interface with the grid. Since then, however, plug-in vehicles have become increasingly commonplace in the market. Ford itself now offers an all-electric vehicle and two plug-in hybrid vehicles in North America and has announced a third plug-in vehicle offering for Europe. Lessons learned from this project helped in these production vehicle launches and are mentioned throughout this report. While the technology of plugging in a vehicle to charge a high-voltage battery with energy from the grid is now in production, the ability for vehicle-to-grid or bi-directional energy flow proved farther away than originally expected. Several technical, regulatory, and potential safety issues prevented progress on the vehicle-to-grid energy flow (V2G) demonstration, and, after a review with the DOE, V2G was removed from this demonstration project. Also proving challenging were communications between a plug-in vehicle and the grid or smart meter. While this project successfully demonstrated the vehicle-to-smart-meter interface, cross-industry and regulatory work is still needed to define the vehicle-to-grid communication interface.

  3. Ford Plug-In Project: Bringing PHEVs to Market Demonstration and Validation Project

    Energy Technology Data Exchange (ETDEWEB)

    None

    2013-12-31

    This project supports our national goal of reducing dependence on fossil fuels. By supporting efforts that contribute toward the successful mass production of plug-in hybrid electric vehicles, our nation’s transportation-related fuel consumption can be offset with energy from the grid. Over four and a half years ago, when this project was originally initiated, plug-in electric vehicles were not readily available in the mass marketplace. Through the creation of a 21-unit plug-in hybrid vehicle fleet, this program was designed to demonstrate the feasibility of the technology and to help build cross-industry familiarity with the technology and its interface with the grid. Since then, however, plug-in vehicles have become increasingly commonplace in the market. Ford itself now offers an all-electric vehicle and two plug-in hybrid vehicles in North America and has announced a third plug-in vehicle offering for Europe. Lessons learned from this project helped in these production vehicle launches and are mentioned throughout this report. While the technology of plugging in a vehicle to charge a high-voltage battery with energy from the grid is now in production, the ability for vehicle-to-grid or bi-directional energy flow proved farther away than originally expected. Several technical, regulatory, and potential safety issues prevented progress on the vehicle-to-grid energy flow (V2G) demonstration, and, after a review with the DOE, V2G was removed from this demonstration project. Also proving challenging were communications between a plug-in vehicle and the grid or smart meter. While this project successfully demonstrated the vehicle-to-smart-meter interface, cross-industry and regulatory work is still needed to define the vehicle-to-grid communication interface.

  4. Model county ordinance for wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Bain, D.A. [Oregon Office of Energy, Portland, OR (United States)

    1997-12-31

    Permitting is a crucial step in the development cycle of a wind project: permits affect the timing, cost, location, feasibility, layout, and impacts of wind projects. Counties often have the lead responsibility for permitting, yet few have appropriate siting regulations for wind projects. A model ordinance allows a county to quickly adopt appropriate permitting procedures. The model county wind ordinance developed for use by northwest states is generally applicable across the country, and counties seeking to adopt siting or zoning regulations for wind will find it a good starting place. The model includes permitting procedures for wind measurement devices and two types of wind systems. Both discretionary and nondiscretionary standards apply to wind systems, and a conditional use permit would be issued. The standards, criteria, conditions for approval, and process procedures are defined for each. Adaptation examples for the four northwest states are provided along with a model Wind Resource Overlay Zone.

  5. Using virtual reality to validate system models

    Energy Technology Data Exchange (ETDEWEB)

    Winter, V.L.; Caudell, T.P.

    1999-12-09

    To date most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks) or informal (in the case of code inspections). The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

  6. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation and to make use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario...

  7. Extending Model Checking to Object Process Validation

    NARCIS (Netherlands)

    Rein, van H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent models

  8. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their credi

  9. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their credi

  10. Cross-validation criteria for SETAR model selection

    NARCIS (Netherlands)

    de Gooijer, J.G.

    2001-01-01

    Three cross-validation criteria, denoted C, C_c, and C_u, are proposed for selecting the orders of a self-exciting threshold autoregressive (SETAR) model when both the delay and the threshold value are unknown. The derivation of C is within a natural cross-validation framework. The criterion C_c is si

  11. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which...... ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano...... in these models remains to be established....

  12. Development and validation of a realistic head model for EEG

    Science.gov (United States)

    Bangera, Nitin Bhalchandra

    The utility of extracranial electrical or magnetic field recordings (EEG or MEG) is greatly enhanced if the generators of the bioelectromagnetic fields can be determined accurately from the measured fields. This procedure, known as the 'inverse method,' depends critically on calculations of the projection from generators in the brain to the EEG and MEG sensors. Improving and validating this calculation, known as the 'forward solution,' is the focus of this dissertation. The improvements involve more accurate modeling of the structures of the brain, and thus of how current flows within the brain as structures are added to the forward model. Validation compares calculations using different forward models to the experimental results obtained by stimulating with implanted dipole electrodes. Human brain tissue is inhomogeneous in electrical conductivity and also anisotropic, notably in the skull and brain white matter. In this dissertation, a realistic head model has been implemented using the finite element method to calculate the effects of inhomogeneity and anisotropy in the human brain. Accurate segmentation of brain tissue types is implemented using a semi-automatic method to segment multimodal imaging data from multi-spectral MRI scans (different flip angles) in conjunction with the regular T1-weighted scans and x-ray computed tomography images. The electrical conductivity in the anisotropic white matter tissue is quantified from diffusion tensor MRI. The finite element model is constructed using AMIRA, a commercial segmentation and visualization tool, and solved using ABAQUS, a commercial finite element solver. The model is validated using experimental data collected from intracranial stimulation in medically intractable epileptic patients. Depth electrodes are implanted in medically intractable epileptic patients in order to direct surgical therapy when the foci cannot be localized with the scalp EEG. These patients
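
    As a pedagogical stand-in for the forward solution discussed above (the dissertation's FEM replaces this closed form with a numerical solution over segmented, anisotropic tissue), here is the scalp-potential formula for a current dipole in an infinite homogeneous conductor; all numbers are illustrative.

    ```python
    import numpy as np

    def dipole_potential(r, r0, p, sigma=0.33):
        """V(r) = p . (r - r0) / (4 pi sigma |r - r0|^3); sigma in S/m."""
        d = np.asarray(r, float) - np.asarray(r0, float)
        return np.dot(p, d) / (4.0 * np.pi * sigma * np.linalg.norm(d) ** 3)

    # 10 nA*m dipole at the origin pointing in +z; electrode 8 cm away on the z axis
    print("%.2e V" % dipole_potential(r=[0, 0, 0.08], r0=[0, 0, 0], p=[0, 0, 1e-8]))
    ```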

  13. Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.

    Science.gov (United States)

    Woodward, S J

    2001-09-01

    The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.

  14. Uncertainty Quantification in Climate Modeling and Projection

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed to provide participants, many of them from developing countries, with information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for

  15. Efficient Integration, Validation and Troubleshooting in Multimodal Distributed Diagnostic Schemes Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In general, development and validation of diagnostic models for complex safety critical systems are time and cost intensive jobs. The proposed Phase-II effort will...

  16. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    Science.gov (United States)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity. The Digital Astronaut Project (DAP) has developed a multi-body dynamics biomechanics model of the ARED for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation, and credibility (VV&C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  17. Amendment to Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of WP2 is to establish flow models relating the wind speeds at turbines in a farm. Until now, active control of the power reference has not been included in these models, as only data from standard operation have been available. In this report the first data series with power reference...... excitations from the Thanet farm are used to try to update some of the models discussed in D2.5. Because of the very limited amount of data, only simple dynamic transfer-function models can be obtained. The three obtained data series are somewhat different; only the first data set seems to have the front...... turbine in undisturbed flow. For this data set, both the multiplicative model and in particular the simple first-order transfer-function model can predict the downwind wind speed from the upwind wind speed and loading.

  18. POMP - Pervasive Object Model Project

    DEFF Research Database (Denmark)

    Schougaard, Kari Rye; Schultz, Ulrik Pagh

    applications, we consider it essential that a standard object-oriented style of programming can be used for those parts of the application that do not concern its mobility. This position paper describes an ongoing effort to implement a language and a virtual machine for applications that execute in a pervasive...... mobility. Mobile agent platforms are often based on such virtual machines, but typically do not provide strong mobility (the ability to migrate at any program point), and have limited support for multi-threaded applications, although there are exceptions. For a virtual machine to support mobile...... computing environment. This system, named POM (Pervasive Object Model), supports applications split into coarse-grained, strongly mobile units that communicate using method invocations through proxies. We are currently investigating efficient execution of mobile applications, scalability to suit...

  19. World Energy Projection System model documentation

    Energy Technology Data Exchange (ETDEWEB)

    Hutzler, M.J.; Anderson, A.T.

    1997-09-01

    The World Energy Projection System (WEPS) was developed by the Office of Integrated Analysis and Forecasting within the Energy Information Administration (EIA), the independent statistical and analytical agency of the US Department of Energy. WEPS is an integrated set of personal-computer-based spreadsheets containing data compilations, assumption specifications, descriptive analysis procedures, and projection models. The WEPS accounting framework incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption to gross domestic product, GDP), and about the rate of incremental energy requirements met by natural gas, coal, and renewable energy sources (hydroelectricity, geothermal, solar, wind, biomass, and other renewable resources). Projections produced by WEPS are published in the annual report, International Energy Outlook. This report documents the structure and procedures incorporated in the 1998 version of the WEPS model. It has been written to provide an overview of the structure of the system and technical details about the operation of each component of the model for persons who wish to know how WEPS projections are produced by EIA.

  20. Assessment of Co-benefits of Clean Development Projects Based on the Project Design Documents of India’s Power Sector Currently under Registration and Validation

    Directory of Open Access Journals (Sweden)

    Ryo Eto

    2013-12-01

    Full Text Available Energy-related Clean Development Mechanism (CDM) projects contribute to sustainable development by reducing air pollutants in addition to CO2 emissions. This paper evaluates the co-benefits of ten coal-fired power generation CDM projects currently under registration and validation, using a power generation mix linear programming model of India’s power sector from 2006 to 2031. Two scenarios are developed to identify the impacts of the CDM projects. The results show that co-benefits are indeed produced by the CDM projects in India’s power sector: CO2 emissions decrease by 79 Mt CO2, and SOx and NOx emissions decrease by 0.8 Mt SOx and 0.6 Mt NOx, from the baseline in 2031. Including the benefits from the reduction of air pollutants strengthens the sustainable development benefit and contributes to enhancing the prices of the generated CERs. Thus, we argue that addressing co-benefits encourages both host countries and investors to participate in CDM projects.

  1. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  2. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  3. EMMD-Prony approach for dynamic validation of simulation models

    Institute of Scientific and Technical Information of China (English)

    Ruiyang Bai

    2015-01-01

    Model validation and updating is critical to model credibility growth. In order to assess model credibility quantitatively and locate model error precisely, a new dynamic validation method based on extremum field mean mode decomposition (EMMD) and the Prony method is proposed in this paper. Firstly, complex dynamic responses from models and real systems are processed into stationary components by EMMD. These components always have definite physical meanings, which can serve as evidence for rough model error location. Secondly, the Prony method is applied to identify the features of each EMMD component. Amplitude similarity, frequency similarity, damping similarity and phase similarity are defined to describe the similarity of dynamic responses. Then quantitative validation metrics are obtained based on the improved entropy weight and energy proportion. Precise model error location is realized based on the physical meanings of these features. The application of this method in aircraft controller design provides evidence of its feasibility and usability.
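    The record does not include the authors' implementation, but the Prony step it describes has a standard textbook form: fit p damped complex exponentials to a sampled response, recovering exactly the amplitude, phase, damping and frequency features named above. A minimal NumPy sketch follows, with a made-up test signal; the EMMD preprocessing is not shown.

    import numpy as np

    def prony(x, p, dt):
        """Classical Prony fit of p damped complex exponentials to samples x."""
        x = np.asarray(x, float)
        N = len(x)
        # Linear-prediction step: coefficients of the characteristic polynomial.
        A = np.column_stack([x[p - 1 - k : N - 1 - k] for k in range(p)])
        a, *_ = np.linalg.lstsq(A, -x[p:], rcond=None)
        poles = np.roots(np.concatenate(([1.0], a)))
        damping = np.log(np.abs(poles)) / dt          # negative = decaying mode
        freq = np.angle(poles) / (2 * np.pi * dt)     # Hz
        # Amplitude/phase step: least-squares fit of the mode weights.
        V = np.vander(poles, N, increasing=True).T
        h, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
        return np.abs(h), np.angle(h), damping, freq

    # Made-up test signal: one decaying 5 Hz mode plus a little noise.
    dt = 0.01
    t = np.arange(0.0, 2.0, dt)
    x = 1.5 * np.exp(-0.4 * t) * np.cos(2 * np.pi * 5.0 * t)
    x += 0.01 * np.random.default_rng(0).normal(size=t.size)
    amp, phase, damping, freq = prony(x, p=2, dt=dt)
    print("frequencies (Hz):", freq, " damping (1/s):", damping)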

  4. Uncertainty Quantification and Validation for RANS Turbulence Models

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k-ε, k-ω, and v²-f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  5. Validation of a national hydrological model

    Science.gov (United States)

    McMillan, H. K.; Booker, D. J.; Cattoën, C.

    2016-10-01

    Nationwide predictions of flow time-series are valuable for development of policies relating to environmental flows, calculating reliability of supply to water users, or assessing risk of floods or droughts. This breadth of model utility is possible because various hydrological signatures can be derived from simulated flow time-series. However, producing national hydrological simulations can be challenging due to strong environmental diversity across catchments and a lack of data available to aid model parameterisation. A comprehensive and consistent suite of test procedures to quantify spatial and temporal patterns in performance across various parts of the hydrograph is described and applied to quantify the performance of an uncalibrated national rainfall-runoff model of New Zealand. Flow time-series observed at 485 gauging stations were used to calculate Nash-Sutcliffe efficiency and percent bias when simulating between-site differences in daily series, between-year differences in annual series, and between-site differences in hydrological signatures. The procedures were used to assess the benefit of applying a correction to the modelled flow duration curve based on an independent statistical analysis. They were used to aid understanding of climatological, hydrological and model-based causes of differences in predictive performance by assessing multiple hypotheses that describe where and when the model was expected to perform best. As the procedures produce quantitative measures of performance, they provide an objective basis for model assessment that could be applied when comparing observed daily flow series with competing simulated flow series from any region-wide or nationwide hydrological model. Model performance varied in space and time with better scores in larger and medium-wet catchments, and in catchments with smaller seasonal variations. Surprisingly, model performance was not sensitive to aquifer fraction or rain gauge density.
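    Nash-Sutcliffe efficiency and percent bias, the two scores used in this study, have standard definitions that are easy to state in code. The sketch below (Python/NumPy, with hypothetical flow arrays) is a generic illustration of the metrics, not the authors' implementation; note that the sign convention for percent bias varies across the literature.

    import numpy as np

    def nash_sutcliffe(obs, sim):
        """NSE = 1 - sum((sim-obs)^2) / sum((obs-mean(obs))^2); 1 is perfect."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def percent_bias(obs, sim):
        """With this convention, positive values mean total flow is over-predicted."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 100.0 * np.sum(sim - obs) / np.sum(obs)

    # Hypothetical daily flows (m^3/s) at one gauging station.
    observed = [12.0, 15.5, 30.2, 22.1, 18.4, 16.0, 14.2]
    simulated = [13.1, 14.9, 26.8, 24.0, 19.2, 15.1, 13.5]
    print(f"NSE = {nash_sutcliffe(observed, simulated):.3f}, "
          f"PBIAS = {percent_bias(observed, simulated):.1f}%")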

  6. Toward a validation process for model based safety analysis

    OpenAIRE

    Adeline, Romain; Cardoso, Janette; Darfeuil, Pierre; Humbert, Sophie; Seguin, Christel

    2010-01-01

    Today, model-based processes are becoming more and more widespread for the analysis of a system. However, there is no formal testing approach to ensure that the formal model is compliant with the real system. In this paper, we choose to study the AltaRica model. We present a general process to soundly construct and validate an AltaRica formal model. The focus is on the validation phase, i.e. verifying the compliance between the model and the real system. For this, the proposed process recommends...

  7. Climate model validation and selection for hydrological applications in representative Mediterranean catchments

    Directory of Open Access Journals (Sweden)

    R. Deidda

    2013-07-01

    Full Text Available This paper discusses the relative performance of several climate models in providing reliable forcing for hydrological modeling in six representative catchments in the Mediterranean region. We consider 14 Regional Climate Models (RCMs) from the EU-FP6 ENSEMBLES project, run for the A1B emission scenario on a common 0.22-degree (about 24 km) rotated grid over Europe and the Mediterranean. In the validation period (1951 to 2010) we consider daily precipitation and surface temperatures from the E-OBS dataset, available from the ENSEMBLES project and the data providers in the ECA&D project. Our primary objective is to rank the 14 RCMs for each catchment and select the four best performing ones to use as common forcing for hydrological models in the six Mediterranean basins considered in the EU-FP7 CLIMB project. Using a common suite of 4 RCMs for all studied catchments reduces the (epistemic) uncertainty when evaluating trends and climate change impacts in the 21st century. We present and discuss the validation setting, as well as the obtained results and, in some detail, the difficulties we experienced when processing the data. In doing so we also provide useful information and hints for an audience of researchers not directly involved in climate modeling, but interested in the use of climate model outputs for hydrological modeling and, more generally, climate change impact studies in the Mediterranean.

  8. Clinical audit project in undergraduate medical education curriculum: an assessment validation study.

    Science.gov (United States)

    Tor, Elina; Steketee, Carole; Mak, Donna

    2016-09-24

    To evaluate the merit of the Clinical Audit Project (CAP) in an assessment program for undergraduate medical education using a systematic assessment validation framework. A cross-sectional assessment validation study at one medical school in Western Australia, with retrospective qualitative analysis of the design, development, implementation and outcomes of the CAP, and quantitative analysis of assessment data from four cohorts of medical students (2011-2014). The CAP is fit for purpose with clear external and internal alignment to expected medical graduate outcomes. Substantive validity in students' and examiners' response processes is ensured through relevant methodological and cognitive processes. Multiple validity features are built into the design, planning and implementation process of the CAP. There is evidence of high internal consistency reliability of CAP scores (Cronbach's alpha > 0.8) and inter-examiner consistency reliability (intra-class correlation > 0.7). Aggregation of CAP scores is psychometrically sound, with high internal consistency indicating one common underlying construct. Significant but moderate correlations between CAP scores and scores from other assessment modalities indicate validity of extrapolation and alignment between the CAP and the overall target outcomes of medical graduates. Standard setting, score equating and fair decision rules justify consequential validity of CAP score interpretation and use. This study provides evidence demonstrating that the CAP is a meaningful and valid component in the assessment program. This systematic framework of validation can be adopted for all levels of assessment in medical education, from an individual assessment modality to the validation of an assessment program as a whole.
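    The reliability statistics cited in this record are standard and straightforward to compute. The Python/NumPy sketch below computes Cronbach's alpha on a hypothetical student-by-item score matrix; it illustrates the statistic only, not the study's analysis.

    import numpy as np

    def cronbach_alpha(scores):
        """scores: (n_students, n_items); alpha = k/(k-1)*(1 - sum item var / total var)."""
        scores = np.asarray(scores, float)
        k = scores.shape[1]
        item_var = scores.var(axis=0, ddof=1).sum()      # sum of per-item variances
        total_var = scores.sum(axis=1).var(ddof=1)       # variance of total scores
        return k / (k - 1) * (1.0 - item_var / total_var)

    # Hypothetical CAP marking-criterion scores for six students, four items.
    scores = [[4, 5, 4, 5], [3, 3, 4, 3], [5, 5, 5, 4],
              [2, 3, 2, 3], [4, 4, 5, 4], [3, 4, 3, 3]]
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")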

  9. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    [Abstract not recoverable; only list-of-figures residue survives from the source document: Figure 9, 'Tools and instrumentation, bracket attached to rail'; Figure 10, 'Tools and instrumentation, direction vernier'; Figure 11, 'Plan A lock approach, upstream approach'. The text breaks off at 'Numerical model'.]

  10. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
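    Sensitivity and specificity, the two evaluation measures used throughout this study, reduce to counts from a presence/absence confusion matrix. A short Python/NumPy sketch with made-up survey data follows, purely as an illustration of the definitions.

    import numpy as np

    def sensitivity_specificity(observed, predicted):
        """Boolean arrays: True = species present (observed) / predicted present."""
        observed = np.asarray(observed, bool)
        predicted = np.asarray(predicted, bool)
        tp = np.sum(observed & predicted)    # presences correctly predicted
        fn = np.sum(observed & ~predicted)   # presences missed
        tn = np.sum(~observed & ~predicted)  # absences correctly predicted
        fp = np.sum(~observed & predicted)   # absences wrongly predicted present
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical t2 survey outcomes vs. model predictions for ten sites.
    obs = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]
    pred = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
    sens, spec = sensitivity_specificity(obs, pred)
    print(f"sensitivity {sens:.2f}, specificity {spec:.2f}")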

  11. PROJECT ACTIVITY ANALYSIS WITHOUT THE NETWORK MODEL

    Directory of Open Access Journals (Sweden)

    S. Munapo

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: This paper presents a new procedure for analysing and managing activity sequences in projects. The new procedure determines critical activities, the critical path, start times, free floats, crash limits, and other useful information without the use of the network model. Even though network models have been used successfully in project management so far, there are weaknesses associated with their use. A network is not easy to generate, and the dummies that are usually associated with it make the network diagram complex; moreover, dummy activities have no meaning in the original project management problem. The network model for projects can be avoided while still obtaining all the useful information that is required for project management. All that is required are the activities, their accurate durations, and their predecessors (a minimal illustration follows this record).

    AFRIKAANSE OPSOMMING (translated): The research describes a novel method for the analysis and management of the sequential activities of projects. The proposed method determines critical activities, the critical path, start times, float, crash limits, and other quantities without the use of a network model. The method performs satisfactorily in practice, and avoids the administrative overhead of the traditional network models.
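    The record does not reproduce the authors' tabular procedure, but the quantities it lists can be computed directly from an activity table, without drawing a network diagram. The Python sketch below (hypothetical activities) runs a forward and backward pass over the table; zero total float marks the critical activities.

    # Hypothetical activity table: name -> (duration, list of predecessors).
    acts = {"A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]),
            "D": (1, ["B", "C"]), "E": (2, ["C"]), "F": (3, ["D", "E"])}

    # Order activities so every predecessor comes first (depth-first visit).
    order, seen = [], set()
    def visit(name):
        if name in seen:
            return
        for pred in acts[name][1]:
            visit(pred)
        seen.add(name)
        order.append(name)
    for name in acts:
        visit(name)

    # Forward pass: earliest start (es) and earliest finish (ef).
    es, ef = {}, {}
    for name in order:
        es[name] = max((ef[p] for p in acts[name][1]), default=0)
        ef[name] = es[name] + acts[name][0]

    # Backward pass: latest finish (lf), latest start (ls), total float.
    horizon = max(ef.values())
    lf, ls = {}, {}
    for name in reversed(order):
        succs = [m for m in acts if name in acts[m][1]]
        lf[name] = min((ls[m] for m in succs), default=horizon)
        ls[name] = lf[name] - acts[name][0]

    for name in order:
        slack = ls[name] - es[name]
        print(name, es[name], ef[name], slack, "critical" if slack == 0 else "")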

  12. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    Science.gov (United States)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged it is often not among the most popular tasks in ocean modelling. In order to ease the validation work a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time-series at stations, vertical profiles, surface fields or along track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, features all parts of the validation process - ranging from read-in procedures of datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. read-in procedures, forms a module in which all available functions of this particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks, new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned for each validation task in user specific settings, which are externally stored in so-called namelists and gather all information of the used datasets as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre. Hereby the performance of any new product

  13. Unsaturated Zone Flow Model Expert Elicitation Project

    Energy Technology Data Exchange (ETDEWEB)

    Coppersmith, K. J.

    1997-05-30

    This report presents results of the Unsaturated Zone Flow Model Expert Elicitation (UZFMEE) project at Yucca Mountain, Nevada. This project was sponsored by the US Department of Energy (DOE) and managed by Geomatrix Consultants, Inc. (Geomatrix), for TRW Environmental Safety Systems, Inc. The objective of this project was to identify and assess the uncertainties associated with certain key components of the unsaturated zone flow system at Yucca Mountain. This assessment reviewed the data inputs, modeling approaches, and results of the unsaturated zone flow model (termed the "UZ site-scale model") being developed by Lawrence Berkeley National Laboratory (LBNL) and the US Geological Survey (USGS). In addition to data input and modeling issues, the assessment focused on percolation flux (volumetric flow rate per unit cross-sectional area) at the potential repository horizon. An understanding of unsaturated zone processes is critical to evaluating the performance of the potential high-level nuclear waste repository at Yucca Mountain. A major goal of the project was to capture the uncertainties involved in assessing the unsaturated flow processes, including uncertainty in both the models used to represent physical controls on unsaturated zone flow and the parameter values used in the models. To ensure that the analysis included a wide range of perspectives, multiple individual judgments were elicited from members of an expert panel. The panel members, who were experts from within and outside the Yucca Mountain project, represented a range of experience and expertise. A deliberate process was followed in facilitating interactions among the experts, in training them to express their uncertainties, and in eliciting their interpretations. The resulting assessments and probability distributions, therefore, provide a reasonable aggregate representation of the knowledge and uncertainties about key issues regarding the unsaturated zone at the Yucca

  14. Validation of Model Forecasts of the Ambient Solar Wind

    Science.gov (United States)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  15. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  16. Validated Models for Radiation Response and Signal Generation in Scintillators: Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Kerisit, Sebastien N.; Gao, Fei; Xie, YuLong; Campbell, Luke W.; Van Ginhoven, Renee M.; Wang, Zhiguo; Prange, Micah P.; Wu, Dangxin

    2014-12-01

    This Final Report presents work carried out at Pacific Northwest National Laboratory (PNNL) under the project entitled “Validated Models for Radiation Response and Signal Generation in Scintillators” (Project number: PL10-Scin-theor-PD2Jf) and led by Drs. Fei Gao and Sebastien N. Kerisit. This project was divided into four tasks: 1) electronic response functions (ab initio data model); 2) electron-hole yield, variance, and spatial distribution; 3) ab initio calculations of information carrier properties; and 4) transport of electron-hole pairs and scintillation efficiency. Detailed information on the results obtained in each of the four tasks is provided in this Final Report. Furthermore, published peer-reviewed articles based on the work carried out under this project are included in the Appendix. This work was supported by the National Nuclear Security Administration, Office of Nuclear Nonproliferation Research and Development (DNN R&D/NA-22), of the U.S. Department of Energy (DOE).

  17. Stabilizing a Bicycle: A Modeling Project

    Science.gov (United States)

    Pennings, Timothy J.; Williams, Blair R.

    2010-01-01

    This article is a project that takes students through the process of forming a mathematical model of bicycle dynamics. Beginning with basic ideas from Newtonian mechanics (forces and torques), students use techniques from calculus and differential equations to develop the equations of rotational motion for a bicycle-rider system as it tips from…

  18. Validation of limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring - is a separate validation group required?

    NARCIS (Netherlands)

    Proost, J. H.

    2007-01-01

    Objective: Limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring are usually validated in a separate group of patients, according to published guidelines. The aim of this study is to evaluate the validation of LSM by comparing independent validation with cross-validation us
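    Cross-validation reuses every patient for both fitting and testing, which is the alternative to reserving a separate validation group that this study evaluates. The Python/NumPy sketch below applies leave-one-out cross-validation to a toy limited sampling model (a linear regression predicting AUC from two hypothetical concentration time points); all data and the model form are illustrative assumptions, not the study's.

    import numpy as np

    # Hypothetical drug concentrations at 1 h and 3 h post-dose (columns) and
    # the "true" AUC from full sampling (target), one row per patient.
    C = np.array([[4.1, 2.0], [5.0, 2.6], [3.2, 1.5], [6.1, 3.3],
                  [4.6, 2.2], [5.5, 2.9], [3.8, 1.9], [4.9, 2.4]])
    auc = np.array([21.0, 26.5, 16.2, 32.4, 23.1, 28.9, 19.0, 25.2])

    X = np.column_stack([np.ones(len(C)), C])  # intercept + two sampling times

    # Leave-one-out cross-validation: each patient is predicted by a model
    # fitted to all the others, so no separate validation group is needed.
    preds = []
    for i in range(len(auc)):
        mask = np.arange(len(auc)) != i
        beta, *_ = np.linalg.lstsq(X[mask], auc[mask], rcond=None)
        preds.append(X[i] @ beta)
    preds = np.array(preds)

    mpe = 100 * np.mean((preds - auc) / auc)   # mean prediction error, %
    rmse = np.sqrt(np.mean((preds - auc) ** 2))
    print(f"LOO bias {mpe:.1f}%  RMSE {rmse:.2f}")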

  19. Analytical Performance Modeling and Validation of Intel’s Xeon Phi Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali; Vishwanath, Venkatram; Kumaran, Kalyan

    2017-01-01

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel’s second-generation Xeon Phi architecture code-named Knights Landing (KNL) for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project the performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.

  20. Validation of a Hot Water Distribution Model Using Laboratory and Field Data

    Energy Technology Data Exchange (ETDEWEB)

    Backman, C.; Hoeschele, M.

    2013-07-01

    Characterizing the performance of hot water distribution systems is a critical step in developing best practice guidelines for the design and installation of high performance hot water systems. Developing and validating simulation models is critical to this effort, as well as collecting accurate input data to drive the models. In this project, the ARBI team validated the newly developed TRNSYS Type 604 pipe model against both detailed laboratory and field distribution system performance data. Validation efforts indicate that the model performs very well in handling different pipe materials, insulation cases, and varying hot water load conditions. Limitations of the model include the complexity of setting up the input file and long simulation run times. In addition to completing validation activities, this project looked at recent field hot water studies to better understand use patterns and potential behavioral changes as homeowners convert from conventional storage water heaters to gas tankless units. Based on these datasets, we conclude that the current Energy Factor test procedure overestimates typical use and underestimates the number of hot water draws. This has implications for both equipment and distribution system performance. Gas tankless water heaters were found to impact how people use hot water, but the data does not necessarily suggest an increase in usage. Further study in hot water usage and patterns is needed to better define these characteristics in different climates and home vintages.

  1. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which...... ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano...... & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation...

  2. Project management system model development and experimental research

    OpenAIRE

    Golubeva, Viktorija

    2006-01-01

    Project management is the application of knowledge, skills, tools and techniques to project activities to meet project requirements. A project management information system is tightly connected with the organizational structure and the particularities of the projects being executed. The main objective of this research was to identify a project management model that would be universal, helpful and easily used for small and medium projects. In the analysis phase we reviewed different methodologies, project ...

  3. Experiments for foam model development and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

    2008-09-01

    A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

  4. Development and validation of a cisplatin dose-ototoxicity model.

    Science.gov (United States)

    Dille, Marilyn F; Wilmington, Debra; McMillan, Garnett P; Helt, Wendy; Fausti, Stephen A; Konrad-Martin, Dawn

    2012-01-01

    Cisplatin is effective in the treatment of several cancers but is a known ototoxin resulting in shifts to hearing sensitivity in up to 50-60% of patients. Cisplatin-induced hearing shifts tend to occur first within an octave of a patient's high frequency hearing limit, termed the sensitive range for ototoxicity (SRO), and progress to lower frequencies. While it is currently not possible to know which patients will experience ototoxicity without testing their hearing directly, monitoring the SRO provides an early indication of damage. A tool to help forecast susceptibility to ototoxic-induced changes in the SRO in advance of each chemotherapy treatment visit may prove useful for ototoxicity monitoring efforts, patient counseling, and therapeutic planning. This project was designed to (1) establish pretreatment risk curves that quantify the probability that a new patient will suffer hearing loss within the SRO during treatment with cisplatin and (2) evaluate the accuracy of these predictions in an independent sample of Veterans receiving cisplatin for the treatment of cancer. Two study samples were used. The Developmental sample contained 23 subjects while the Validation sample consisted of 12 subjects. Risk curve predictions for SRO threshold shifts following cisplatin exposure were developed using a Developmental sample comprised of data from a total of 155 treatment visits obtained in 45 ears of 23 Veterans. Pure-tone thresholds were obtained within each subject's SRO at each treatment visit and compared with baseline measures. The risk of incurring an SRO shift was statistically modeled as a function of factors related to chemotherapy treatment (cisplatin dose, radiation treatment, doublet medication) and patient status (age, pre-exposure hearing, cancer location and stage). The model was reduced so that only statistically significant variables were included. Receiver-operating characteristic (ROC) curve analyses were then used to determine the accuracy of the
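    The ROC analysis mentioned at the point where this record is cut off has a compact rank-based form. The sketch below (Python/NumPy, entirely hypothetical visit-level data) computes the area under the ROC curve for a set of predicted SRO-shift risks; it illustrates the metric only, not the study's fitted model.

    import numpy as np

    def roc_auc(labels, scores):
        """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
        labels = np.asarray(labels, bool)
        scores = np.asarray(scores, float)
        order = np.argsort(scores)
        ranks = np.empty(len(scores))
        ranks[order] = np.arange(1, len(scores) + 1)
        n_pos, n_neg = labels.sum(), (~labels).sum()
        return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    # Hypothetical: 1 = an SRO shift occurred at that visit; score = model risk.
    shift = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
    risk = [0.8, 0.3, 0.45, 0.9, 0.55, 0.2, 0.6, 0.5, 0.1, 0.65]
    print(f"ROC AUC = {roc_auc(shift, risk):.2f}")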

  5. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    Science.gov (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  6. Advanced Reactors-Intermediate Heat Exchanger (IHX) Coupling: Theoretical Modeling and Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Utgikar, Vivek [Univ. of Idaho, Moscow, ID (United States); Sun, Xiaodong [The Ohio State Univ., Columbus, OH (United States); Christensen, Richard [The Ohio State Univ., Columbus, OH (United States); Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-12-29

    The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchanger system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the intermediate heat exchanger (IHX); 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four different tasks corresponding to these specific objectives. The first task involved selection of IHX candidates and developing steady-state designs for those. The second task involved modeling of the transient and off-normal operation of the reactor-IHX system. The subsequent task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performances of the two prototype heat exchangers designed and fabricated for the project at steady-state and transient conditions, to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized the two test facilities at The Ohio State University (OSU), including one existing High-Temperature Helium Test Facility (HTHF) and the newly developed high-temperature molten salt facility.

  7. Subglacial Hydrology Model Intercomparison Project (SHMIP)

    Science.gov (United States)

    Werder, Mauro A.; de Fleurian, Basile; Creyts, Timothy T.; Damsgaard, Anders; Delaney, Ian; Dow, Christine F.; Gagliardini, Olivier; Hoffman, Matthew J.; Seguinot, Julien; Sommers, Aleah; Irarrazaval Bustos, Inigo; Downs, Jakob

    2017-04-01

    The SHMIP project is the first intercomparison project of subglacial drainage models (http://shmip.bitbucket.org). Its synthetic test suites and evaluation were designed such that any subglacial hydrology model producing effective pressure can participate. In contrast to ice deformation, the physical processes of subglacial hydrology (which in turn impacts basal sliding of glaciers) are poorly known. A further complication is that different glacial and geological settings can lead to different drainage physics. The aim of the project is therefore to qualitatively compare the outputs of the participating models for a wide range of water forcings and glacier geometries. This will make it possible to put existing studies, which use different drainage models, into context, and will allow new studies to select the most suitable model for the problem at hand. We present the results from the just completed intercomparison exercise. Twelve models participated: eight 2D and four 1D models; nine include both an efficient and an inefficient system, the other three only one of the two systems; all but two models use R-channels as the efficient system and/or a linked-cavity-like inefficient system; one exception uses porous layers with different characteristics for each of the systems, the other is based on canals. The main variable used for the comparison is effective pressure, as it is a direct proxy for basal sliding of glaciers. The models produce large differences in the effective pressure fields, in particular for the higher water input scenarios. This shows that the selection of a subglacial drainage model will likely impact the conclusions of a study significantly.

  8. Functional state modelling approach validation for yeast and bacteria cultivations

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-01-01

    In this paper, the functional state modelling approach is validated for modelling the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely the primary product synthesis state, the mixed oxidative state and the secondary product synthesis state. Parameter identification procedures for the different local models are performed using genetic algorithms. The simulation results show a high degree of adequacy of the models describing these functional states for both S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This fact is a strong structural verification of the functional state modelling theory, not only for a set of yeast cultivations but also for a bacteria cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach. PMID:26740778
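    Parameter identification with genetic algorithms, as used here for the local models, can be illustrated with a tiny real-coded GA fitted to a toy growth curve. Everything below (Python/NumPy) is hypothetical: the logistic model, the synthetic data, and the GA settings stand in for the paper's actual models and tooling.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical target: logistic biomass growth x(t) = K / (1 + 9*exp(-r t)),
    # with "measurements" generated from known parameters plus noise.
    t = np.linspace(0, 10, 40)
    true_K, true_r = 5.0, 0.9
    data = true_K / (1 + 9 * np.exp(-true_r * t)) + 0.05 * rng.normal(size=t.size)

    def fitness(p):
        K, r = p
        model = K / (1 + 9 * np.exp(-r * t))
        return -np.mean((model - data) ** 2)   # higher is better

    # Minimal real-coded GA: tournament selection, blend crossover, mutation.
    pop = rng.uniform([0.1, 0.1], [10.0, 3.0], size=(40, 2))
    for _ in range(100):
        scores = np.array([fitness(p) for p in pop])
        new = []
        for _ in range(len(pop)):
            i, j = rng.integers(len(pop), size=2)
            a = pop[i] if scores[i] > scores[j] else pop[j]   # tournament pick
            k, l = rng.integers(len(pop), size=2)
            b = pop[k] if scores[k] > scores[l] else pop[l]
            w = rng.uniform()
            child = w * a + (1 - w) * b                       # blend crossover
            child += rng.normal(scale=0.05, size=2)           # mutation
            new.append(np.clip(child, 0.1, [10.0, 3.0]))
        pop = np.array(new)

    best = pop[np.argmax([fitness(p) for p in pop])]
    print("identified K, r:", best)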

  9. Functional state modelling approach validation for yeast and bacteria cultivations.

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-09-03

    In this paper, the functional state modelling approach is validated for modelling the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely the primary product synthesis state, the mixed oxidative state and the secondary product synthesis state. Parameter identification procedures for the different local models are performed using genetic algorithms. The simulation results show a high degree of adequacy of the models describing these functional states for both S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This fact is a strong structural verification of the functional state modelling theory, not only for a set of yeast cultivations but also for a bacteria cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach.

  10. Validation of Orthorectified Interferometric Radar Imagery and Digital Elevation Models

    Science.gov (United States)

    Smith Charles M.

    2004-01-01

    This work was performed under NASA's Verification and Validation (V&V) Program as an independent check of data supplied by EarthWatch, Incorporated, through the Earth Science Enterprise Scientific Data Purchase (SDP) Program. This document serves as the basis for reporting results associated with the validation of orthorectified interferometric radar imagery and digital elevation models (DEM). This validation covers all datasets provided under the first campaign (Central America & Virginia Beach) plus three earlier missions (Indonesia, Red River, and Denver), for a total of 13 missions.

  11. Ensuring confidence in predictions: A scheme to assess the scientific validity of in silico models.

    Science.gov (United States)

    Hewitt, Mark; Ellison, Claire M; Cronin, Mark T D; Pastor, Manuel; Steger-Hartmann, Thomas; Munoz-Muriendas, Jordi; Pognan, Francois; Madden, Judith C

    2015-06-23

    The use of in silico tools within the drug development process to predict a wide range of properties, including absorption, distribution, metabolism, elimination and toxicity, has become increasingly important due to changes in legislation and both ethical and economic drivers to reduce animal testing. Whilst in silico tools have been used for decades, there remains reluctance to accept predictions based on these methods, particularly in regulatory settings. This apprehension arises in part due to a lack of confidence in the reliability, robustness and applicability of the models. To address this issue we propose a scheme for the verification of in silico models that enables end users and modellers to assess the scientific validity of models in accordance with the principles of good computer modelling practice. We report here the implementation of the scheme within the Innovative Medicines Initiative project "eTOX" (electronic toxicity) and its application to the in silico models developed within the frame of this project.

  12. Validation of a Model of the Domino Effect?

    CERN Document Server

    Larham, Ron

    2008-01-01

    A recent paper proposing a model of the limiting speed of the domino effect is discussed with reference to its need, and the need of models in general, for validation against experimental data. It is shown that the proposed model diverges significantly from experimentally derived speed estimates over a significant range of domino spacings, using data from the existing literature and this author's own measurements. Hence, had its use had economic importance, applying it outside its range of validity could have led to losses of one sort or another for its users.

  13. Resampling procedures to validate dendro-auxometric regression models

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available Regression analysis is widely used in several sectors of forest research. The validation of a dendro-auxometric model is a basic step in the building of the model itself. The more a model resists attempts to demonstrate its groundlessness, the more its reliability increases. In the last decades many new techniques that exploit the computational speed of modern computers have been formulated. Here we show the results obtained by the application of a bootstrap resampling procedure as a validation tool.
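    Bootstrap validation of a regression model, as applied in this record, can be sketched in a few lines: refit the model on resampled data and examine the stability of the coefficients and errors. The Python/NumPy example below uses made-up diameter-volume data and a log-log model; it is one simple variant of the procedure, not the paper's own.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical dendro-auxometric data: diameter (cm) vs. stem volume (m^3).
    d = np.array([12, 15, 18, 22, 25, 28, 31, 35, 38, 42], float)
    v = np.array([0.06, 0.10, 0.16, 0.26, 0.35, 0.46, 0.58, 0.78, 0.92, 1.20])

    X = np.column_stack([np.ones_like(d), np.log(d)])
    y = np.log(v)   # log-log volume model, fitted by ordinary least squares
    beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Bootstrap: refit on resampled data and score each refit on the full
    # sample; the spread of the coefficients indicates model stability.
    coefs, errs = [], []
    for _ in range(2000):
        idx = rng.integers(len(y), size=len(y))
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        coefs.append(beta)
        errs.append(np.sqrt(np.mean((X @ beta - y) ** 2)))
    coefs = np.array(coefs)
    print("coef means:", coefs.mean(axis=0), " coef sd:", coefs.std(axis=0))
    print("mean RMSE (log scale): %.3f" % np.mean(errs))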

  14. Validation of an Efficient Outdoor Sound Propagation Model Using BEM

    DEFF Research Database (Denmark)

    Quirós-Alpera, S.; Henriquez, Vicente Cutanda; Jacobsen, Finn

    2001-01-01

    An approximate, simple and practical model for prediction of outdoor sound propagation exists based on ray theory, diffraction theory and Fresnel-zone considerations [1]. This model, which can predict sound propagation over non-flat terrain, has been validated for combinations of flat ground, hills...... and barriers, but it still needs to be validated for configurations that involve combinations of valleys and barriers. In order to do this a boundary element model has been implemented in MATLAB to serve as a reliable reference....

  15. Validation of a Model for Ice Formation around Finned Tubes

    Directory of Open Access Journals (Sweden)

    Kamal A. R. Ismai

    2016-09-01

    Full Text Available Although phase change materials are an attractive option for thermal storage applications, their main drawback is a slow thermal response during charging and discharging processes due to their low thermal conductivity. The present study validates a model developed by the authors some years ago that uses radial fins to improve the thermal performance of the PCM in a horizontal storage system. The model for the radial finned tube is based on pure conduction and the enthalpy approach, and was discretized by the finite difference method. Experiments were carried out specifically to validate the model and its numerical predictions.

  16. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  17. Reverse electrodialysis: A validated process model for design and optimization

    NARCIS (Netherlands)

    Veerman, J.; Saakes, M.; Metz, S. J.; Harmsen, G. J.

    2011-01-01

    Reverse electrodialysis (RED) is a technology to generate electricity using the entropy of the mixing of sea and river water. A model is made of the RED process and validated experimentally. The model is used to design and optimize the RED process. It predicts very small differences between counter-

  18. Measurements for validation of high voltage underground cable modelling

    DEFF Research Database (Denmark)

    Bak, Claus Leth; Gudmundsdottir, Unnur Stella; Wiechowski, Wojciech Tomasz

    2009-01-01

    This paper discusses studies concerning cable modelling for long high voltage AC cable lines. In investigating the possibilities of using long cables instead of overhead lines, the simulation results must be trustworthy. Therefore a model validation is of great importance. This paper describes...

  19. Model validation for karst flow using sandbox experiments

    Science.gov (United States)

    Ye, M.; Pacheco Castro, R. B.; Tao, X.; Zhao, J.

    2015-12-01

    The study of flow in karst is complex due to the high heterogeneity of the porous media. Several approaches have been proposed in the literature to overcome the natural complexity of karst. Some of those methods are the single continuum, the double continuum, and the discrete network of conduits coupled with the single continuum. Several mathematical and computational models are available in the literature for each approach. In this study one computer model has been selected for each category to validate its usefulness for modeling flow in karst using a sandbox experiment. The models chosen are: Modflow 2005, Modflow CFPV1 and Modflow CFPV2. A sandbox experiment was implemented in such a way that all the parameters required by each model can be measured. The sandbox experiment was repeated several times under different conditions. The model validation is carried out by comparing the results of the model simulations with the real data. This validation allows us to compare the accuracy of each model and its applicability in karst. We will also be able to evaluate whether the results of the complex models improve much on those of the simple models, especially because some complex models require parameters that are difficult to measure in the real world.

  1. Composing, Analyzing and Validating Software Models

    Science.gov (United States)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system, however they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  2. Validation of a terrestrial food chain model.

    Science.gov (United States)

    Travis, C C; Blaylock, B P

    1992-01-01

    An increasingly important topic in risk assessment is the estimation of human exposure to environmental pollutants through pathways other than inhalation. The Environmental Protection Agency (EPA) has recently developed a computerized methodology (EPA, 1990) to estimate indirect exposure to toxic pollutants from Municipal Waste Combustor emissions. This methodology estimates health risks from exposure to toxic pollutants from the terrestrial food chain (TFC), soil ingestion, drinking water ingestion, fish ingestion, and dermal absorption via soil and water. Of these, one of the most difficult to estimate is exposure through the food chain. This paper estimates the accuracy of the EPA methodology for estimating food chain contamination. To our knowledge, no data exist on measured concentrations of pollutants in food grown around Municipal Waste Incinerators, and few field-scale studies have been performed on the uptake of pollutants in the food chain. Therefore, to evaluate the EPA methodology, we compare actual measurements of background contaminant levels in food with estimates made using EPA's computerized methodology. Background levels of contaminants in air, water, and soil were used as input to the EPA food chain model to predict background levels of contaminants in food. These predicted values were then compared with the measured background contaminant levels. Comparisons were performed for dioxin, pentachlorophenol, polychlorinated biphenyls, benzene, benzo(a)pyrene, mercury, and lead.

  3. On the development and validation of QSAR models.

    Science.gov (United States)

    Gramatica, Paola

    2013-01-01

    The fundamental and more critical steps that are necessary for the development and validation of QSAR models are presented in this chapter as best practices in the field. These procedures are discussed in the context of predictive QSAR modelling that is focused on achieving models of the highest statistical quality and with external predictive power. The most important and most used statistical parameters needed to verify the real performances of QSAR models (of both linear regression and classification) are presented. Special emphasis is placed on the validation of models, both internally and externally, as well as on the need to define model applicability domains, which should be done when models are employed for the prediction of new external compounds.

  4. Integral Reactor Containment Condensation Model and Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Qiao [Oregon State Univ., Corvallis, OR (United States); Corradini, Michael [Univ. of Wisconsin, Madison, WI (United States)

    2016-05-02

    This NEUP-funded project, NEUP 12-3630, is for experimental, numerical and analytical studies on high-pressure steam condensation phenomena in a steel containment vessel connected to a water cooling tank, carried out at Oregon State University (OrSU) and the University of Wisconsin at Madison (UW-Madison). In the three-year investigation period, following the original proposal, the planned tasks have been completed: (1) Performed a scaling study for the full pressure test facility applicable to the reference design for the condensation heat transfer process during design basis accidents (DBAs), modified the existing test facility to route the steady-state secondary steam flow into the high pressure containment for controllable condensation tests, and extended the operations to negative gage pressure conditions (OrSU). (2) Conducted a series of DBA and quasi-steady experiments using the full pressure test facility to provide a reliable high pressure condensation database (OrSU). (3) Analyzed experimental data and evaluated the condensation model for the experimental conditions, and predicted the prototypic containment performance under accident conditions (UW-Madison). A film flow model was developed for the scaling analysis, and the results suggest that the 1/3-scaled test facility covers a large portion of laminar film flow, leading to a lower average heat transfer coefficient compared to the prototypic value. Although this is conservative in reactor safety analysis, the significant reduction of the heat transfer coefficient (50%) could underestimate the prototypic condensation heat transfer rate, resulting in inaccurate prediction of the decay heat removal capability. Further investigation is thus needed to quantify the scaling distortion for safety analysis code validation. Experimental investigations were performed in the existing MASLWR test facility at OrSU with minor modifications. A total of 13 containment condensation tests were conducted for pressure

  5. Assessing the capability of numerical methods to predict earthquake ground motion: the Euroseistest verification and validation project

    Science.gov (United States)

    Chaljub, E. O.; Bard, P.; Tsuno, S.; Kristek, J.; Moczo, P.; Franek, P.; Hollender, F.; Manakou, M.; Raptakis, D.; Pitilakis, K.

    2009-12-01

    During the last decades, an important effort has been dedicated to developing accurate and computationally efficient numerical methods to predict earthquake ground motion in heterogeneous 3D media. The progress in methods and the increasing capability of computers have made it technically feasible to calculate realistic seismograms for frequencies of interest in seismic design applications. In order to foster the use of numerical simulation in practical prediction, it is important to (1) evaluate the accuracy of current numerical methods when applied to realistic 3D applications where no reference solution exists (verification) and (2) quantify the agreement between recorded and numerically simulated earthquake ground motion (validation). Here we report the results of the Euroseistest verification and validation project - an ongoing international collaboration organized jointly by the Aristotle University of Thessaloniki, Greece, the Cashima research project (supported by the French nuclear agency, CEA, and the Laue-Langevin institute, ILL, Grenoble), and the Joseph Fourier University, Grenoble, France. The project involves more than 10 international teams from Europe, Japan and the USA. The teams employ the Finite Difference Method (FDM), the Finite Element Method (FEM), the Global Pseudospectral Method (GPSM), the Spectral Element Method (SEM) and the Discrete Element Method (DEM). The project makes use of a new detailed 3D model of the Mygdonian basin (about 5 km wide, 15 km long, sediments reaching about 400 m depth, surface S-wave velocity of 200 m/s). The prime target is to simulate 8 local earthquakes with magnitudes from 3 to 5. In the verification, numerical predictions for frequencies up to 4 Hz for a series of models with increasing structural and rheological complexity are analyzed and compared using quantitative time-frequency goodness-of-fit criteria. Predictions obtained by one FDM team and the SEM team are close and different from other predictions
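    As a simplified stand-in for the time-frequency goodness-of-fit criteria mentioned above, the sketch below computes a single-number envelope misfit between a simulated and a reference trace via the analytic signal; the full criteria used in such projects compare envelope and phase across time and frequency, which this toy example does not attempt.

    ```python
    # Simplified envelope-misfit sketch (illustration only, with toy signals).
    import numpy as np
    from scipy.signal import hilbert

    def envelope_misfit(sim, ref):
        """Relative misfit of signal envelopes; 0 means identical envelopes."""
        e_sim = np.abs(hilbert(sim))
        e_ref = np.abs(hilbert(ref))
        return np.sqrt(np.sum((e_sim - e_ref) ** 2) / np.sum(e_ref ** 2))

    t = np.linspace(0.0, 10.0, 2001)
    ref = np.sin(2 * np.pi * 1.00 * t) * np.exp(-0.50 * t)   # toy reference trace
    sim = 0.9 * np.sin(2 * np.pi * 1.05 * t) * np.exp(-0.55 * t)
    print(f"envelope misfit = {envelope_misfit(sim, ref):.3f}")
    ```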

  6. Experimental Validation of the Navy Air-Sea-Wave Coupled Forecasting Models

    Science.gov (United States)

    2012-09-30

    APPROACH: We have participated in the DYNAMO project. We deployed (together with UEA's Andrew Matthews) a SeaGlider which had the following...collaboration with Dr Adrian Matthews (University of East Anglia). We helped in the collection and processing of this dataset, which was used in COAMPS and...weather events in the tropics such as tropical cyclone genesis and the Madden-Julian Oscillation (MJO). [Figure caption: Initial validation of the model]

  7. A prediction model for ocular damage - Experimental validation.

    Science.gov (United States)

    Heussner, Nico; Vagos, Márcia; Spitzer, Martin S; Stork, Wilhelm

    2015-08-01

    With the increasing number of laser applications in medicine and technology, accidental as well as intentional exposure of the human eye to laser sources has become a major concern. Therefore, a prediction model for ocular damage (PMOD) is presented within this work and validated for long-term exposure. This model is a combination of a raytracing model with a thermodynamical model of the human eye and an application which determines thermal damage by the implementation of the Arrhenius integral. The model is based on our earlier work and is here validated against temperature measurements taken with porcine eye samples. For this validation, three different powers were used: 50 mW, 100 mW and 200 mW with a spot size of 1.9 mm. The measurements were taken with two different sensing systems, an infrared camera and a fibre optic probe placed within the tissue. The temperatures were measured for up to 60 s and then compared against simulations. The measured temperatures were found to be in good agreement with the values predicted by the PMOD. To the best of our knowledge, this is the first model validated for both short-term and long-term irradiations in terms of temperature, demonstrating that temperatures can be accurately predicted within the thermal damage regime. Copyright © 2015 Elsevier Ltd. All rights reserved.
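    The damage criterion rests on the Arrhenius integral, Ω(t) = A ∫ exp(−Ea/(R·T(τ))) dτ, with Ω ≥ 1 conventionally taken as the damage threshold. The sketch below evaluates it numerically; the rate coefficients and the assumed temperature history are illustrative placeholders, not the PMOD parameters.

    ```python
    # Sketch of the Arrhenius damage integral; coefficients are placeholders
    # of the kind used for thermal damage of tissue, not the PMOD values.
    import numpy as np

    R = 8.314        # J/(mol K), gas constant
    A = 3.1e98       # 1/s, frequency factor (placeholder, tissue-specific)
    Ea = 6.28e5      # J/mol, activation energy (placeholder)

    t = np.linspace(0.0, 60.0, 6001)                # 60 s exposure
    T = 310.0 + 15.0 * (1.0 - np.exp(-t / 5.0))     # assumed temperature rise (K)

    dt = t[1] - t[0]
    rate = A * np.exp(-Ea / (R * T))                # instantaneous damage rate
    omega = np.sum(rate) * dt                       # Riemann-sum integration
    print(f"Omega after 60 s = {omega:.3f} (>= 1 indicates damage)")
    ```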

  8. The hypothetical world of CoMFA and model validation

    Energy Technology Data Exchange (ETDEWEB)

    Oprea, T.I. [Los Alamos National Lab., NM (United States)

    1996-12-31

    CoMFA is a technique used to establish the three-dimensional similarity of molecular structures in relationship to a target property. Because the risk of chance correlation is high, validation is required for all CoMFA models. The following validation steps should be performed: the choice of alignment rules (superimposition and conformer criteria) has to use experimental data when available, or different (alternate) hypotheses; statistical methods (e.g., cross-validation with randomized groups) have to emphasize simplicity, robustness, predictivity and explanatory power. When several CoMFA-QSAR models on similar targets and/or structures are available, qualitative lateral validation can be applied. This meta-analysis for CoMFA models offers a broader perspective on the similarities and differences between compared biological targets, with potential applications in rational drug design (e.g., selectivity, efficacy) and environmental toxicology. Examples that focus on validation of CoMFA models include the following steroid-binding proteins: aromatase, the estrogen and the androgen receptors, a monoclonal antibody against progesterone and two steroid-binding globulins.
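    The cross-validation step can be illustrated with a leave-one-out q² computation; the sketch below uses plain least squares as a stand-in for the PLS regression normally applied to CoMFA fields, on hypothetical descriptor data.

    ```python
    # Minimal leave-one-out cross-validation sketch, reporting q^2 = 1 - PRESS/SS.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))                  # 20 compounds, 3 descriptors
    y = X @ np.array([1.0, -0.5, 0.3]) + rng.normal(scale=0.2, size=20)

    press, ss = 0.0, np.sum((y - y.mean()) ** 2)
    for i in range(len(y)):
        mask = np.arange(len(y)) != i             # hold out compound i
        coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        press += (y[i] - X[i] @ coef) ** 2
    q2 = 1.0 - press / ss
    print(f"LOO q^2 = {q2:.3f}")
    ```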

  9. Development and Validation of Project Management Constructs of Security Door Access Control Systems: A Pilot Study in Macau

    Directory of Open Access Journals (Sweden)

    Chan Brenda Wing Han

    2016-06-01

    Full Text Available A Security Door Access Control System (SDACS project involves a number of teams from different organizations with diverse project goals. One of the main challenges of such projects is the lack of a standard approach or common understanding to achieve a common goal among project parties. This research examines various management concerns for SDACS projects, highlights the expected common understanding for project participants, develops the project management constructs, and emphasizes on the resulting value of the project to all participants. A two-stage process of scale development and validation was conducted. First, six generic constructs were identified based on the Security Access Control System Framework. Next, a multi-item scale for each construct was developed with reference to the Result-Oriented Management Framework. Expert judges were invited to conduct manual sorting of the items iteratively until reliability and validity was reached. In the next stage, further refinement and validation were carried out with a synthesized survey instrument and a series of statistical testing followed. The finalized SDACS project management constructs and the related findings help reinforce the importance of a standardized management practice for SDACS projects. The value of this research not only benefits SDACS project managers but everyone who works on the project.

  10. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made by using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and often problematic, the simulation results from these models are inherently uncertain.

  11. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and their experimental validation was organized by the 'convection' section of the French society of thermal engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machineries or other energy-related applications, and have been selected for ETDE. (J.S.)

  12. Validation of the PESTLA model: Field test using data from a sandy soil in Schaijk (the Netherlands)

    NARCIS (Netherlands)

    Boekhold AE; Swartjes FA; Hoogenboom FGG; van der Linden AMA

    1993-01-01

    Within the framework of the project "Validation of PESTLA" the Schaijk data set was used to analyse PESTLA model performance. The Schaijk data set contains field data on bentazon behaviour in a coarse textured humic gley soil cropped with maize. PESTLA model input parameters were derived

  13. Regional Climate Model Intercomparison Project for Asia.

    Science.gov (United States)

    Fu, Congbin; Wang, Shuyu; Xiong, Zhe; Gutowski, William J.; Lee, Dong-Kyou; McGregor, John L.; Sato, Yasuo; Kato, Hisashi; Kim, Jeong-Woo; Suh, Myoung-Seok

    2005-02-01

    Improving the simulation of regional climate change is one of the high-priority areas of climate study because regional information is needed for climate change impact assessments. Such information is especially important for the region covered by the East Asian monsoon, where there is high variability in both space and time. To this end, the Regional Climate Model Intercomparison Project (RMIP) for Asia has been established to evaluate and improve regional climate model (RCM) simulations of the monsoon climate. RMIP operates under the joint support of the Asia-Pacific Network for Global Change Research (APN), the Global Change System for Analysis, Research and Training (START), the Chinese Academy of Sciences, and several projects of participating nations. The project currently involves 10 research groups from Australia, China, Japan, South Korea, and the United States, as well as scientists from India, Italy, Mongolia, North Korea, and Russia. RMIP has three simulation phases: March 1997-August 1998, which covers a full annual cycle and extremes in monsoon behavior; January 1989-December 1998, which examines simulated climatology; and a regional climate change scenario, involving nesting with a global model. This paper is a brief report of RMIP goals, implementation design, and some initial results from the first phase studies.

  14. Nonequilibrium stage modelling of dividing wall columns and experimental validation

    Science.gov (United States)

    Hiller, Christoph; Buck, Christina; Ehlers, Christoph; Fieg, Georg

    2010-11-01

    Dealing with complex process units like dividing wall columns puts the focus on the determination of suitable modelling approaches. For this purpose a nonequilibrium stage model was developed. Successful validation is achieved by an experimental investigation of fatty alcohol mixtures under vacuum conditions at pilot scale. The aim is the recovery of high-purity products. The proposed model predicts the product qualities and temperature profiles very well.

  15. Human surrogate models of neuropathic pain: validity and limitations.

    Science.gov (United States)

    Binder, Andreas

    2016-02-01

    Human surrogate models of neuropathic pain in healthy subjects are used to study symptoms, signs, and the hypothesized underlying mechanisms. Although different models are available and different spontaneous and evoked symptoms and signs can be induced, two key questions need to be answered: are human surrogate models conceptually valid, i.e., do they share the sensory phenotype of neuropathic pain states, and are they sufficiently reliable to allow consistent translational research?

  16. Blast Load Simulator Experiments for Computational Model Validation: Report 2

    Science.gov (United States)

    2017-02-01

    O'Daniel, 2016. Blast load simulator experiments for computational model validation - Report 1. ERDC/GSL TR-16-27. Vicksburg, MS: U.S. Army Engineer Research and Development Center. [Report-cover residue: ERDC/GSL TR-16-27, Blast Load Simulator Experiments for Computational Model Validation, Report 2, Geotechnical and Structures Laboratory.] Approved for public release; distribution is unlimited. The U.S. Army Engineer Research and Development Center (ERDC) solves the nation's toughest

  17. Validation of AROME wind speed forecasts against mast observations in the Finnish wind power resource mapping project

    Science.gov (United States)

    Kilpinen, J.

    2009-09-01

    The upgrade of the Finnish wind power resource mapping is ongoing. The previous mapping was published in 1991 and was mainly based on observations. The climatology for the present mapping is made with a meso-scale NWP model and downscaling to detailed topography using the WAsP model. One of the tasks in the mapping is to validate/verify the model wind speed against mast observations. A group of masts with measurement heights around 100 meters is available for this purpose. Most of the masts are in the Helsinki Testbed area, while some are at existing wind farms. From a larger data set (ERA-Interim) a representative sample of months has been chosen, plus two extra 12-month sets representing extreme wind conditions. The total sample consists of 72 separate months. The lateral boundaries and first guess are from ERA-Interim data. The HIRLAM model (with 7.5 km resolution) is used to make initial analyses for the 2.5 km AROME model with a 6-hourly data assimilation cycle. Finally, the AROME model is used to simulate the wind climate, with output at 3-hour intervals. The WAsP model is used to downscale the wind in coastal areas and on hills in Northern Finland with 250-meter resolution and corresponding roughness. For validation the operational AROME is used. Only 00 UTC initial analyses are used to make forecasts up to +24 hours with 3-hourly outputs to cover the diurnal cycle. The validation period began in June 2008 and will last to the end of the project in October 2009. The number of masts is around 20 and the height of measurements is typically between 60 and 100 meters. The validation is made with traditional verification methods. Special attention is also paid to the quality control of observations. Some of the wind speed measurement instruments are not typical cup anemometers but acoustic instruments (Vaisala VXT520). The detailed results of the validation will be presented. The preliminary results for the year 2008 indicate that there is a slight positive
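    The traditional verification mentioned here typically reduces to summary scores such as bias and RMSE per lead time; a minimal sketch with hypothetical forecast and mast values follows.

    ```python
    # Sketch of simple forecast verification against mast observations.
    # Arrays are hypothetical stand-ins for AROME forecasts and ~100 m mast data.
    import numpy as np

    obs = np.array([7.2, 6.8, 8.1, 9.0, 7.5, 6.9])     # mast wind speed (m/s)
    fcst = np.array([7.8, 7.1, 8.0, 9.6, 8.1, 7.4])    # +24 h forecasts (m/s)

    bias = np.mean(fcst - obs)                         # positive => over-forecast
    rmse = np.sqrt(np.mean((fcst - obs) ** 2))
    print(f"bias = {bias:+.2f} m/s, RMSE = {rmse:.2f} m/s")
    ```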

  18. [A project to improve the validity rate for nursing staff operating single door autoclave sterilizers].

    Science.gov (United States)

    Chen, Chun-Hung; Li, Cheng-Chang; Chou, Chuan-Yu; Chen, Shu-Hwa

    2009-08-01

    This project was designed to improve the low validity rate for nurses responsible for operating single-door autoclave sterilizers in the operating room. By investigating the current status, we found that the nursing staff's validity rate of cognition regarding the autoclave sterilizer was 85%, and the operating-check validity rate was only 80%. This was due to a lack of in-service education. Problems with operation included: 1. unsafe behaviors - not following standard procedure, lacking relevant operating knowledge and the absence of a check form; 2. an unsafe environment - the steam conveying piping was typically not covered and lacked operation marks. Recommended improvement measures included: 1. holding in-service education; 2. generating an operation procedure flow chart; 3. implementing obstacle-eliminating procedures; 4. covering piping to prevent fire and burns; 5. performing regular checks to ensure all procedures are followed. Following the intervention, nursing staff cognition rose from 85% to 100%, while the operation validity rate rose from 80% to 100%. These changes ensure a safer operating room environment and help facilities move toward a zero accident rate in the healthcare environment.

  19. Assessing Attachment Representations in Adolescents: Discriminant Validation of the Adult Attachment Projective Picture System.

    Science.gov (United States)

    Gander, Manuela; George, Carol; Pokorny, Dan; Buchheim, Anna

    2017-04-01

    The contribution of attachment to human development and clinical risk is well established for children and adults, yet there is relatively limited knowledge about attachment in adolescence due to the poor availability of construct valid measures. The Adult Attachment Projective Picture System (AAP) is a reliable and valid instrument to assess adult attachment status. This study examines for the first time the discriminant validity of the AAP in adolescents. In our sample of 79 teenagers between 15 and 18 years, 42 % were classified as secure, 34 % as insecure-dismissing, 13 % as insecure-preoccupied and 11 % as unresolved. The results demonstrated discriminant validity for using the AAP in that age group, with no associations between attachment classifications and verbal intelligence, social desirability, story length or sociodemographic variables. These results poise the AAP to be used in clinical intervention and large-scale research investigating normative and atypical developmental correlates and sequelae of attachment, including psychopathology in adolescence.

  20. Rationale and methods of the European Food Consumption Validation (EFCOVAL) Project

    DEFF Research Database (Denmark)

    de Boer, E. J.; Slimani, N.; van 't Veer, P.

    2011-01-01

    Background/Objectives: The overall objective of the European Food Consumption Validation (EFCOVAL) Project was to further develop and validate a trans-European food consumption method to be used for the evaluation of the intake of foods, nutrients and potentially hazardous chemicals within...... the EPIC-Soft databases were adapted. Finally, the EFCOVAL Consortium developed a statistical tool (Multiple Source Method) for estimating the usual intake and distribution, which has been tested using real food consumption data and compared with three other statistical methods through a simulation study......, the repeated 24-HDR method using EPIC-Soft and a food propensity questionnaire was evaluated against biomarkers in 24-h urine collections and in blood samples among adults from Belgium, the Czech Republic, (the South of) France, the Netherlands and Norway. As a result from an expert workshop on a proposed...

  1. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, such models can be built easily and applied to a wider range of applications than traditional simulations. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM is necessary. In this paper, after reviewing the principles, concepts and applications of ABM, the validation techniques and challenges of ABM validation are discussed.

  3. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research presented in this paper is to model legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation enable better understanding of the system, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of a given feature for a specific model.

  4. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    Science.gov (United States)

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting.
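    Two of the standard statistical summaries in such an external validation are discrimination and calibration; the sketch below computes a c-statistic (via the rank/Mann-Whitney identity) and calibration-in-the-large, on hypothetical predicted risks and outcomes.

    ```python
    # Hedged sketch of external-validation summaries for a binary risk model.
    import numpy as np

    p = np.array([0.05, 0.20, 0.10, 0.60, 0.35, 0.80, 0.15, 0.50])  # predicted risks
    y = np.array([0,    0,    0,    1,    0,    1,    0,    1   ])  # outcomes

    pos, neg = p[y == 1], p[y == 0]
    # c-statistic: probability a random event case outranks a random non-event case
    c_stat = (np.sum(pos[:, None] > neg[None, :])
              + 0.5 * np.sum(pos[:, None] == neg[None, :])) / (pos.size * neg.size)
    citl = y.mean() - p.mean()     # calibration-in-the-large (observed - expected)
    print(f"c-statistic = {c_stat:.3f}, calibration-in-the-large = {citl:+.3f}")
    ```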

  5. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational-fluid-dynamics (CFD) spectral gas radiation models for air combustion are therefore out of their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review about fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models with high resolution spectral measurements in a gas cell with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example for a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and gives a ranking of the proposed models based on their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 µm of water vapor and carbon dioxide in the temperature range from 727 °C to 1500 °C and at different concentrations were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement with a maximal band transmissivity deviation of 3 %. The exponential-wide-band model showed a deviation of 6 %. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2 % and was therefore recommended as a reference model for the

  6. Validation of a Hot Water Distribution Model Using Laboratory and Field Data

    Energy Technology Data Exchange (ETDEWEB)

    Backman, C. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); Hoeschele, M. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States)

    2013-07-01

    Characterizing the performance of hot water distribution systems is a critical step in developing best practice guidelines for the design and installation of high performance hot water systems. Developing and validating simulation models is critical to this effort, as well as collecting accurate input data to drive the models. In this project, the Building America research team ARBI validated the newly developed TRNSYS Type 604 pipe model against both detailed laboratory and field distribution system performance data. Validation efforts indicate that the model performs very well in handling different pipe materials, insulation cases, and varying hot water load conditions. Limitations of the model include the complexity of setting up the input file and long simulation run times. This project also looked at recent field hot water studies to better understand use patterns and potential behavioral changes as homeowners convert from conventional storage water heaters to gas tankless units. The team concluded that the current Energy Factor test procedure overestimates typical use and underestimates the number of hot water draws, which has implications for both equipment and distribution system performance. Gas tankless water heaters were found to impact how people use hot water, but the data does not necessarily suggest an increase in usage. Further study in hot water usage and patterns is needed to better define these characteristics in different climates and home vintages.

  7. Validation of a Best-Fit Pharmacokinetic Model for Scopolamine Disposition after Intranasal Administration

    Science.gov (United States)

    Wu, L.; Chow, D. S-L.; Tam, V.; Putcha, L.

    2015-01-01

    An intranasal gel formulation of scopolamine (INSCOP) was developed for the treatment of motion sickness. Bioavailability and pharmacokinetics (PK) were determined per Investigational New Drug (IND) evaluation guidance by the Food and Drug Administration. Earlier, we reported the development of a PK model that can predict the relationship between plasma, saliva and urinary scopolamine (SCOP) concentrations using data collected from an IND clinical trial with INSCOP. This data analysis project is designed to validate the reported best-fit PK model for SCOP by comparing observed and model-predicted SCOP concentration-time profiles after administration of INSCOP.
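    For illustration only, the sketch below evaluates a generic one-compartment model with first-order absorption, a common starting point for such PK data; the study's actual best-fit model structure and parameter values are not reproduced here.

    ```python
    # Illustrative sketch (not the study's best-fit model): one-compartment PK
    # with first-order absorption and elimination. Parameters are hypothetical.
    import numpy as np

    def conc(t, dose, F, ka, ke, V):
        """Plasma concentration (mg/L) for first-order absorption/elimination."""
        return (F * dose * ka / (V * (ka - ke))) * (np.exp(-ke * t) - np.exp(-ka * t))

    t = np.linspace(0.0, 12.0, 49)                 # hours post-dose
    pred = conc(t, dose=0.4, F=0.8, ka=1.5, ke=0.25, V=100.0)  # mg, -, 1/h, 1/h, L
    print(f"Cmax ~ {pred.max()*1000:.2f} ug/L at t = {t[pred.argmax()]:.2f} h")
    ```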

  8. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    Science.gov (United States)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  9. System Modeling, Validation, and Design of Shape Controllers for NSTX

    Science.gov (United States)

    Walker, M. L.; Humphreys, D. A.; Eidietis, N. W.; Leuer, J. A.; Welander, A. S.; Kolemen, E.

    2011-10-01

    Modeling of the linearized control response of plasma shape and position has become fairly routine in the last several years. However, such response models rely on the input of accurate values of model parameters such as conductor and diagnostic sensor geometry and conductor resistivity or resistance. Confidence in use of such a model therefore requires that some effort be spent in validating that the model has been correctly constructed. We describe the process of constructing and validating a response model for NSTX plasma shape and position control, and subsequent use of that model for the development of shape and position controllers. The model development, validation, and control design processes are all integrated within a Matlab-based toolset known as TokSys. The control design method described emphasizes use of so-called decoupling control, in which combinations of coil current modifications are designed to modify only one control parameter at a time, without perturbing any other control parameter values. Work supported by US DOE under DE-FG02-99ER54522 and DE-AC02-09CH11466.

  10. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    Science.gov (United States)

    Apostolakis, J.; Asai, M.; Bagulya, A.; Brown, J. M. C.; Burkhardt, H.; Chikuma, N.; Cortes-Giraldo, M. A.; Elles, S.; Grichine, V.; Guatelli, S.; Incerti, S.; Ivanchenko, V. N.; Jacquemier, J.; Kadri, O.; Maire, M.; Pandola, L.; Sawkey, D.; Toshito, T.; Urban, L.; Yamashita, T.

    2015-12-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  13. Cross-validation model assessment for modular networks

    CERN Document Server

    Kawamoto, Tatsuro

    2016-01-01

    Model assessment of the stochastic block model is a crucial step in identification of modular structures in networks. Although this has typically been done according to the principle that a parsimonious model with a large marginal likelihood or a short description length should be selected, another principle is that a model with a small prediction error should be selected. We show that the leave-one-out cross-validation estimate of the prediction error can be efficiently obtained using belief propagation for sparse networks. Furthermore, the relations among the objectives for model assessment enable us to determine the exact cause of overfitting.

  14. Model Validation for Shipboard Power Cables Using Scattering Parameters

    Institute of Scientific and Technical Information of China (English)

    Lukas Graber; Diomar Infante; Michael Steurer; William W. Brey

    2011-01-01

    Careful analysis of transients in shipboard power systems is important to achieve long lifetimes of the components in future all-electric ships. In order to accomplish results with high accuracy, it is recommended to validate cable models, as they have significant influence on the amplitude and frequency spectrum of voltage transients. The authors propose comparison of model and measurement using scattering parameters. These can be easily obtained from measurement and simulation and deliver broadband information about the accuracy of the model. The measurement can be performed using a vector network analyzer. The process to extract scattering parameters from simulation models is explained in detail. Three different simulation models of a 5 kV XLPE power cable have been validated. The chosen approach delivers an efficient tool to quickly estimate the quality of a model.
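    One way to obtain S-parameters from a simulation model is via the ABCD matrix of a uniform distributed-parameter (RLGC) line; a sketch follows, with hypothetical per-metre parameters rather than those of the 5 kV XLPE cable models above.

    ```python
    # Sketch: S11/S21 of a uniform RLGC line from its ABCD matrix, for comparison
    # against VNA measurements. RLGC values are hypothetical per-metre parameters.
    import numpy as np

    def cable_s_params(f, R, L, G, C, length, z0=50.0):
        """Return (S11, S21) of a uniform line at frequency f (Hz)."""
        w = 2 * np.pi * f
        gamma = np.sqrt((R + 1j * w * L) * (G + 1j * w * C))   # propagation const.
        zc = np.sqrt((R + 1j * w * L) / (G + 1j * w * C))      # characteristic Z
        gl = gamma * length
        A, B = np.cosh(gl), zc * np.sinh(gl)
        Cm, D = np.sinh(gl) / zc, np.cosh(gl)
        den = A + B / z0 + Cm * z0 + D
        return (A + B / z0 - Cm * z0 - D) / den, 2.0 / den     # reciprocal line

    s11, s21 = cable_s_params(1e6, R=0.1, L=2.5e-7, G=1e-9, C=1e-10, length=20.0)
    print(f"|S11| = {abs(s11):.3f}, |S21| = {abs(s21):.3f} at 1 MHz")
    ```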

  15. Avoiding unintentional eviction from integral projection models.

    Science.gov (United States)

    Williams, Jennifer L; Miller, Tom E X; Ellner, Stephen P

    2012-09-01

    Integral projection models (IPMs) are increasingly being applied to study size-structured populations. Here we call attention to a potential problem in their construction that can have important consequences for model results. IPMs are implemented using an approximating matrix and bounded size range. Individuals near the size limits can be unknowingly "evicted" from the model because their predicted future size is outside the range. We provide simple measures for the magnitude of eviction and the sensitivity of the population growth rate (lambda) to eviction, allowing modelers to assess the severity of the problem in their IPM. For IPMs of three plant species, we found that eviction occurred in all cases and caused underestimation of the population growth rate (lambda) relative to eviction-free models; it is likely that other models are similarly affected. Models with frequent eviction should be modified because eviction is only possible when size transitions are badly mis-specified. We offer several solutions to eviction problems, but we emphasize that the modeler must choose the most appropriate solution based on an understanding of why eviction occurs in the first place. We recommend testing IPMs for eviction problems and resolving them, so that population dynamics are modeled more accurately.
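    The eviction diagnostic can be made concrete: in a discretised IPM, the column sums of the growth kernel should be close to 1, and any shortfall is probability mass "evicted" past the size limits. The sketch below uses a hypothetical normal growth kernel, not the kernels of the three plant species studied.

    ```python
    # Sketch of an eviction check for a discretised IPM growth kernel.
    import numpy as np
    from scipy.stats import norm

    size_min, size_max, n = 0.0, 10.0, 100         # size range and mesh
    edges = np.linspace(size_min, size_max, n + 1)
    z = 0.5 * (edges[:-1] + edges[1:])             # midpoints
    h = edges[1] - edges[0]

    def growth_pdf(z_next, z_now):
        """Hypothetical normal growth kernel for next-year size."""
        return norm.pdf(z_next, loc=1.0 + 0.95 * z_now, scale=1.0)

    G = h * growth_pdf(z[:, None], z[None, :])     # growth matrix, columns = z_now
    col_sums = G.sum(axis=0)                       # ~1 everywhere if no eviction
    evicted = 1.0 - col_sums
    print(f"max eviction probability: {evicted.max():.3f} "
          f"(at size {z[evicted.argmax()]:.2f})")
    ```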

  16. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: (1) The paper discusses the validation of creep rupture models derived from statistical analysis. (2) It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. (3) The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. (4) The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  17. Projecting Policy Effects with Statistical Models

    Directory of Open Access Journals (Sweden)

    Christopher Sims

    1988-03-01

    Full Text Available This paper briefly discusses the current frontiers in quantitative modeling for forecasting and policy analysis. It does so by summarizing some recent developments in three areas: reduced-form forecasting models; theoretical models including elements of stochastic optimization; and identification. In the process, the paper offers some remarks on the direction we seem to be headed.

  18. Pyramid projection - validation of a new method of skin defect measurement.

    Science.gov (United States)

    Růzicka, J; Nový, P; Vávra, F; Bolek, L; Benes, J

    2007-01-01

    This paper presents a new method for the determination of the volume, surface area and depth of skin defects. The method is based on the description of a spatial defect using a pyramid (made, for example, from injection needles), which is placed over the defect. The projection of the pyramid on to the defect is photographed using a digital camera and subsequently compared with the projection of the same pyramid on to a sheet of grid paper. The defect is mathematically reconstructed on a computer, and an optimal body shape describing the defect is found, using a number of simplifications and assumptions. The method was then validated using a plaster mold of a real foot with 19 defects simulating real wounds. These plaster wounds were molded using alginate hydrocolloid, and the volume, surface area and depth were measured and compared with the results of the pyramid projection by means of regression analysis. The method correlates in all variables, with correlation coefficients higher than 0.9. It can be concluded that the pyramid projection method correlates well with the reference mold method and can be used with good results for a whole range of variables.

  19. The Chancellor's Model School Project (CMSP)

    Science.gov (United States)

    Lopez, Gil

    1999-01-01

    What does it take to create and implement a 7th to 8th grade middle school program where the great majority of students achieve at high academic levels regardless of their previous elementary school backgrounds? This was the major question that guided the research and development of a 7-year long project entitled the Chancellor's Model School Project (CMSP) from September 1991 to August 1998. The CMSP effort, conducted largely in two New York City public schools, was aimed at creating and testing a prototype 7th and 8th grade model program that was organized and test-implemented in two distinct project phases: Phase I of the CMSP was conducted from 1991 to 1995 as a 7th to 8th grade extension of an existing K-6 elementary school, and Phase II was conducted from 1995 to 1998 as a 7th to 8th grade middle school program that became an integral part of a newly established 7-12th grade high school. In Phase I, the CMSP demonstrated that with a highly structured curriculum coupled with strong academic support and increased learning time, students participating in the CMSP were able to develop a strong foundation for rigorous high school coursework within the space of 2 years (at the 7th and 8th grades). Mathematics and Reading test score data during Phase I of the project clearly indicated that significant academic gains were obtained by almost all students -- at both the high and low ends of the spectrum -- regardless of their previous academic performance in the K-6 elementary school. The CMSP effort expanded in Phase II to include a fully operating 7-12 high school model. Achievement gains at the 7th and 8th grade levels in Phase II were tempered by the fact that incoming 7th grade students' academic background at the CMSP High School was significantly lower than that of students participating in Phase I. Student performance in Phase II was also affected by the broadening of the CMSP effort from a 7-8th grade program to a fully functioning 7-12 high

  20. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  1. Technical Note: Calibration and validation of geophysical observation models

    NARCIS (Netherlands)

    Salama, M.S.; van der Velde, R.; van der Woerd, H.J.; Kromkamp, J.C.; Philippart, C.J.M.; Joseph, A.T.; O'Neill, P.E.; Lang, R.H.; Gish, T.; Werdell, P.J.; Su, Z.

    2012-01-01

    We present a method to calibrate and validate observational models that interrelate remotely sensed energy fluxes to geophysical variables of land and water surfaces. Coincident sets of remote sensing observation of visible and microwave radiations and geophysical data are assembled and subdivided i

  2. Validation of Geant4 hadronic physics models at intermediate energies

    Science.gov (United States)

    Banerjee, Sunanda; Geant4 Hadronic Group

    2010-04-01

    GEANT4 provides a number of physics models at intermediate energies (corresponding to incident momenta in the range 1-20 GeV/c). Recently, these models have been validated with existing data from a number of experiments: (1) inclusive proton and neutron production with a variety of beams (π-, π+, p) at different energies between 1 and 9 GeV/c on a number of nuclear targets (from beryllium to uranium); (2) inclusive pion/kaon/proton production from 14.6 GeV/c proton beams on nuclear targets (from beryllium to gold); (3) inclusive pion production from pion beams between 3-13 GeV/c on a number of nuclear targets (from beryllium to lead). The results of simulation/data comparison for different GEANT4 models are discussed in the context of validating the models and determining their usage in physics lists for high energy applications. Due to the increasing number of validations becoming available, and the requirement that they be done at regular intervals corresponding to the GEANT4 release schedule, automated methods of validation are being developed.
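    Automated comparisons of this kind often reduce to a goodness-of-fit statistic between simulated and measured distributions; the sketch below computes a chi-square per degree of freedom for two histograms with Poisson-like uncertainties, using hypothetical counts.

    ```python
    # Sketch of an automated simulation/data histogram comparison.
    import numpy as np

    data = np.array([120.0, 95.0, 80.0, 61.0, 44.0, 30.0])   # measured counts/bin
    sim = np.array([112.0, 101.0, 74.0, 66.0, 41.0, 33.0])   # simulated counts/bin

    sigma2 = data + sim                      # combined Poisson-like variances
    chi2 = np.sum((data - sim) ** 2 / sigma2)
    ndf = len(data)
    print(f"chi2/ndf = {chi2:.2f}/{ndf} = {chi2/ndf:.2f}")
    ```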

  3. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of the DSF performance and this in turn requires access to high quality experimental data. Three sets of accurate emp...

  4. Validation of Models : Statistical Techniques and Data Availability

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1999-01-01

    This paper shows which statistical techniques can be used to validate simulation models, depending on which real-life data are available. Concerning this availability three situations are distinguished (i) no data, (ii) only output data, and (iii) both input and output data. In case (i) - no real

  5. Model validation studies of solar systems, Phase III. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lantz, L.J.; Winn, C.B.

    1978-12-01

    Results obtained from a validation study of the TRNSYS, SIMSHAC, and SOLCOST solar system simulation and design programs are presented. Also included are comparisons between the FCHART and SOLCOST solar system design programs and some changes that were made to the SOLCOST program. Finally, results obtained from the analysis of several solar radiation models are presented. Separate abstracts were prepared for ten papers.

  6. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of conta

  7. ID Model Construction and Validation: A Multiple Intelligences Case

    Science.gov (United States)

    Tracey, Monica W.; Richey, Rita C.

    2007-01-01

    This is a report of a developmental research study that aimed to construct and validate an instructional design (ID) model that incorporates the theory and practice of multiple intelligences (MI). The study consisted of three phases. In phase one, the theoretical foundations of multiple Intelligences and ID were examined to guide the development…

  8. Bibliometric Modeling Processes and the Empirical Validity of Lotka's Law.

    Science.gov (United States)

    Nicholls, Paul Travis

    1989-01-01

    Examines the elements involved in fitting a bibliometric model to empirical data, proposes a consistent methodology for applying Lotka's law, and presents the results of an empirical test of the methodology. The results are discussed in terms of the validity of Lotka's law and the suitability of the proposed methodology. (49 references) (CLB)
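    Lotka's law states that the number of authors producing n papers falls off as f(n) = C/n^a, with a ≈ 2 in the classic formulation. One common (and debated) fitting choice is least squares on the log-log scale, sketched below with hypothetical author-productivity counts.

    ```python
    # Sketch of fitting Lotka's law, f(n) = C / n^a, on the log-log scale.
    import numpy as np

    n = np.arange(1, 8)                                  # papers per author
    authors = np.array([520, 135, 62, 32, 21, 14, 10])   # authors with n papers

    slope, intercept = np.polyfit(np.log(n), np.log(authors), 1)
    a, C = -slope, np.exp(intercept)
    print(f"fitted exponent a = {a:.2f} (classic Lotka value: 2), C = {C:.0f}")
    ```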

  10. Development of integrated software project planning model

    OpenAIRE

    Manalif, Ekananta; Capretz, Luiz Fernando; Ho, Danny

    2012-01-01

    As the most uncertain and complex type of project compared to other types of projects, a software development project depends heavily on the outcome of the software project planning phase, which helps project managers by predicting the project's demands with respect to budgeting, scheduling, and the allocation of resources. The two main activities in software project planning are effort estimation and risk assessment, which have to be executed together because the accuracy of the effort estimation is ...

  11. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction...... models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland shows a clear layering. The observed layers from the radar data can be used as an in-situ validation...... correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from

  13. Validation of a Model for Ice Formation around Finned Tubes

    OpenAIRE

    Kamal A. R. Ismail; Fatima A. M. Lino

    2016-01-01

    Although phase change materials are an attractive option for thermal storage applications, their main drawback is a slow thermal response during charging and discharging processes due to their low thermal conductivity. The present study validates a model developed by the authors some years ago of radial fins as a method to improve the thermal performance of PCM in a horizontal storage system. The developed model for the radially finned tube is based on pure conduction and the enthalpy approach and was di...

  14. Toward metrics and model validation in web-site QEM

    OpenAIRE

    Olsina Santos, Luis Antonio; Pons, Claudia; Rossi, Gustavo Héctor

    2000-01-01

    In this work, a conceptual framework and the associated strategies for metrics and model validation are analyzed with regard to website measurement and evaluation. In particular, we conducted three case studies in different Web domains in order to evaluate and compare the quality of sites. To this end, the quantitative, model-based methodology called Web-site QEM (Quality Evaluation Methodology) was utilized. In the assessment process of sites, definition of attributes and measurements...

  15. Validating firn compaction model with remote sensing data

    OpenAIRE

    2011-01-01

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland ...

  16. The Transgenic RNAi Project at Harvard Medical School: Resources and Validation.

    Science.gov (United States)

    Perkins, Lizabeth A; Holderbaum, Laura; Tao, Rong; Hu, Yanhui; Sopko, Richelle; McCall, Kim; Yang-Zhou, Donghui; Flockhart, Ian; Binari, Richard; Shim, Hye-Seok; Miller, Audrey; Housden, Amy; Foos, Marianna; Randkelv, Sakara; Kelley, Colleen; Namgyal, Pema; Villalta, Christians; Liu, Lu-Ping; Jiang, Xia; Huan-Huan, Qiao; Wang, Xia; Fujiyama, Asao; Toyoda, Atsushi; Ayers, Kathleen; Blum, Allison; Czech, Benjamin; Neumuller, Ralph; Yan, Dong; Cavallaro, Amanda; Hibbard, Karen; Hall, Don; Cooley, Lynn; Hannon, Gregory J; Lehmann, Ruth; Parks, Annette; Mohr, Stephanie E; Ueda, Ryu; Kondo, Shu; Ni, Jian-Quan; Perrimon, Norbert

    2015-11-01

    To facilitate large-scale functional studies in Drosophila, the Drosophila Transgenic RNAi Project (TRiP) at Harvard Medical School (HMS) was established along with several goals: developing efficient vectors for RNAi that work in all tissues, generating a genome-scale collection of RNAi stocks with input from the community, distributing the lines as they are generated through existing stock centers, validating as many lines as possible using RT-qPCR and phenotypic analyses, and developing tools and web resources for identifying RNAi lines and retrieving existing information on their quality. With these goals in mind, here we describe in detail the various tools we developed and the status of the collection, which is currently composed of 11,491 lines covering 71% of Drosophila genes. Data on the characterization of the lines, either by RT-qPCR or phenotype, are available on a dedicated website, the RNAi Stock Validation and Phenotypes Project (RSVP, http://www.flyrnai.org/RSVP.html), and stocks are available from three stock centers: the Bloomington Drosophila Stock Center (United States), the National Institute of Genetics (Japan), and the TsingHua Fly Center (China).

  17. Validation of models with constant bias: an applied approach

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta

    2014-06-01

    Full Text Available Objective. This paper presents extensions to the statistical validation method based on the procedure of Freese for cases where a model shows constant bias (CB) in its predictions, and illustrates the method with data from a new mechanistic model that predicts weight gain in cattle. Materials and methods. The extensions were the hypothesis tests and maximum anticipated error for the alternative approach, and the confidence interval for a quantile of the distribution of errors. Results. The model evaluated showed CB; once the CB is removed, and with a confidence level of 95%, the magnitude of the error does not exceed 0.575 kg. Therefore, the validated model can be used to predict the daily weight gain of cattle, although it will require an adjustment in its structure, given the presence of CB, to increase the accuracy of its forecasts. Conclusions. The confidence interval for the 1-α quantile of the distribution of errors after correcting the constant bias allows determining the upper limit for the magnitude of the prediction error and using it to evaluate the evolution of the model in forecasting the system. The confidence interval approach to validating a model is more informative than hypothesis tests for the same purpose.
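
    As an illustration of the general workflow (a simplified sketch, not the paper's exact extension of Freese's procedure), the code below removes an estimated constant bias from hypothetical prediction errors and computes a distribution-free, order-statistic upper confidence bound for a quantile of the absolute corrected errors:

```python
import numpy as np
from scipy.stats import binom

def quantile_upper_bound(abs_errors, p=0.95, conf=0.95):
    """Order-statistic upper confidence bound for the p-quantile of the
    error distribution (distribution-free). Falls back to the sample
    maximum when n is too small for the requested confidence."""
    x = np.sort(abs_errors)
    n = len(x)
    # Smallest k such that P(Binomial(n, p) <= k - 1) >= conf.
    k = int(np.searchsorted(binom.cdf(np.arange(n), n, p), conf)) + 1
    return x[min(k, n) - 1]

# Hypothetical observed vs. predicted daily weight gains (kg).
obs  = np.array([0.92, 1.05, 0.88, 1.10, 0.97, 1.02, 0.90, 1.08])
pred = np.array([1.15, 1.30, 1.10, 1.33, 1.21, 1.27, 1.12, 1.31])
err = pred - obs
err_cb = err - err.mean()          # remove the constant bias
print("estimated constant bias:", round(err.mean(), 3), "kg")
print("upper bound on |error| quantile:",
      round(quantile_upper_bound(np.abs(err_cb)), 3), "kg")
```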

  18. Validity of the Bersohn–Zewail model beyond justification

    DEFF Research Database (Denmark)

    Petersen, Jakob; Henriksen, Niels Engholm; Møller, Klaus Braagaard

    2012-01-01

    The absorption of probe pulses in ultrafast pump–probe experiments can be determined from the Bersohn–Zewail (BZ) model. The model relies on classical mechanics to describe the dynamics of the nuclei in the excited electronic state prepared by the ultrashort pump pulse. The BZ model provides...... excellent agreement between the classical trajectory and the average position of the excited state wave packet. By investigating the approximations connecting the nuclear dynamics described by quantum mechanics and the BZ model, we conclude that this agreement goes far beyond the validity of the individual...

  19. Experimentally validated finite element model of electrocaloric multilayer ceramic structures

    Energy Technology Data Exchange (ETDEWEB)

    Smith, N. A. S., E-mail: nadia.smith@npl.co.uk; Correia, T. M., E-mail: tatiana.correia@npl.co.uk [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom); Rokosz, M. K., E-mail: maciej.rokosz@npl.co.uk [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom); Department of Materials, Imperial College London, London SW7 2AZ (United Kingdom)

    2014-07-28

    A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.

  20. Project-matrix models of marketing organization

    Directory of Open Access Journals (Sweden)

    Gutić Dragutin

    2009-01-01

    Full Text Available Unlike the theory and practice of corporate organization, marketing organization has to this day not developed the many forms and contents at its disposal. It is fair to say that marketing organization in most of our companies, and in almost all its parts, noticeably lags behind corporate organization. Marketing managers have always been occupied by basic, narrow marketing activities such as: sales growth, market analysis, market growth and market share, marketing research, introduction of new products, modification of products, promotion, distribution etc. They rarely found it necessary to focus on other aspects of marketing management, for example: marketing planning and marketing control, marketing organization and leadership. This paper deals with aspects of project-matrix marketing organization management. Two-dimensional and multi-dimensional models are presented. Among the two-dimensional models, the following are analyzed: market management/product management; product management/management of product lifecycle phases on the market; customer management/marketing functions management; demand management/marketing functions management; market positions management/marketing functions management.

  1. Building information models for astronomy projects

    Science.gov (United States)

    Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro

    2012-09-01

    A Building Information Model is a digital representation of physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties like bills of quantities, definition of COTS components, status of material in the different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application to astronomical/scientific facilities. In these facilities, steel/concrete structures have high dynamic and seismic requirements, M&E installations are complex, and there is a large amount of special equipment and mechanisms involved as a fundamental part of the facility. The detail design definition is typically implemented by different design teams in specialized design software packages. In order to allow the coordinated work of different engineering teams, the overall model, and its associated engineering database, is progressively integrated using coordination and roaming software which can be used before starting the construction phase for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design and construction approach allows efficient planning of the construction sequence (4D). This is a powerful tool to study and analyze in detail alternative construction sequences and to ideally coordinate the work of different construction teams. In addition, the engineering, construction and operational database can be linked to the virtual model (6D), which gives end users an invaluable tool for lifecycle management, as all the facility information can be easily accessed, added or replaced. This paper presents the BIM methodology as implemented by IDOM with the E-ELT and ATST Enclosures as application examples.

  2. Validation of a Model for Teaching Canine Fundoscopy.

    Science.gov (United States)

    Nibblett, Belle Marie D; Pereira, Mary Mauldin; Williamson, Julie A; Sithole, Fortune

    2015-01-01

    A validated teaching model for canine fundoscopic examination was developed to improve Day One fundoscopy skills while at the same time reducing use of teaching dogs. This novel eye model was created from a hollow plastic ball with a cutout for the pupil, a suspended 20-diopter lens, and paint and paper simulation of relevant eye structures. This eye model was mounted on a wooden stand with canine head landmarks useful in performing fundoscopy. Veterinary educators performed fundoscopy using this model and completed a survey to establish face and content validity. Subsequently, veterinary students were randomly assigned to pre-laboratory training with or without the use of this teaching model. After completion of an ophthalmology laboratory on teaching dogs, student outcome was assessed by measuring students' ability to see a symbol inserted on the simulated retina in the model. Students also completed a survey regarding their experience with the model and the laboratory. Overall, veterinary educators agreed that this eye model was well constructed and useful in teaching good fundoscopic technique. Student performance of fundoscopy was not negatively impacted by the use of the model. This novel canine model shows promise as a teaching and assessment tool for fundoscopy.

  3. FDA 2011 process validation guidance: lifecycle compliance model.

    Science.gov (United States)

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  4. Integrative neural networks models for stream assessment in restoration projects

    Science.gov (United States)

    Gazendam, Ed; Gharabaghi, Bahram; Ackerman, Josef D.; Whiteley, Hugh

    2016-05-01

    Stream-habitat assessment for evaluation of restoration projects requires the examination of many parameters, both watershed-scale and reach-scale, to incorporate the complex non-linear effects of geomorphic, riparian, watershed and hydrologic factors on aquatic ecosystems. Rapid geomorphic assessment tools used by many jurisdictions to assess natural channel design projects seldom include watershed-level parameters, which have been shown to have a significant effect on benthic habitat in stream systems. In this study, Artificial Neural Network (ANN) models were developed to integrate complex non-linear relationships between the aquatic ecosystem health indices and key watershed-scale and reach-scale parameters. Physical stream parameters, based on QHEI parameters, and watershed characteristics data were collected at 112 sites on 62 stream systems located in Southern Ontario. Benthic data were collected separately and benthic invertebrate summary indices, specifically Hilsenhoff's Biotic Index (HBI) and Richness, were determined. The ANN models were trained on the randomly selected 3/4 of the dataset of 112 streams in Ontario, Canada and validated on the remaining 1/4. The R2 values for the developed ANN model predictions were 0.86 for HBI and 0.92 for Richness. Sensitivity analysis of the trained ANN models revealed that Richness was directly proportional to Erosion and Riparian Width and inversely proportional to Floodplain Quality and Substrate parameters. HBI was directly proportional to Velocity Types and Erosion and inversely proportional to Substrate, % Treed and 1:2 Year Flood Flow parameters. The ANN models can be useful tools for watershed managers in stream assessment and restoration projects by allowing consideration of watershed properties in the stream assessment.
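
    A minimal sketch of this style of train/validate split for an ANN regressor, assuming scikit-learn and stand-in data in place of the QHEI and watershed parameters (the authors' network architecture is not specified here):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Hypothetical stand-in data: 112 sites x 8 reach/watershed parameters.
X = rng.normal(size=(112, 8))
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=112)  # e.g., an HBI-like index

# Train on 3/4 of the sites, validate on the held-out 1/4.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), solver="lbfgs",
                                   max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 2))
```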

  5. Systemic change increases model projection uncertainty

    Science.gov (United States)

    Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floor; Faaij, André

    2014-05-01

    Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship resulting from e.g., climatic or societal changes, thereby overlooking a source of uncertainty. We define systemic change as a change in the system indicated by a system state change that cannot be simulated using a constant model structure. We have developed a method to detect systemic change, using a Bayesian data assimilation technique, the particle filter. The particle filter was used to update the prior knowledge about the model structure. In contrast to the traditional particle filter approach (e.g., Verstegen et al., 2014), we apply the filter separately for each point in time for which observations are available, obtaining the optimal model structure for each of the time periods in between. This allows us to create a time series of the evolution of the model structure. The Runs test (Wald and Wolfowitz, 1940), a stationarity test, is used to check whether variation in this time series can be attributed to randomness or not. If not, this indicates systemic change. The uncertainty that the systemic change adds to the existing model projection uncertainty can be determined by comparing model outcomes of a model with a stationary model structure and a model with a model structure changing according to the variation found in the time series. To test the systemic change detection methodology, we apply it to a land use change cellular automaton (CA) (Verstegen et al., 2012) and use observations of real land use from all years from 2004 to 2012 and associated uncertainty as observational data in the particle filter. A systemic change was detected for the period 2006 to 2008. In this period the influence on the location of sugar cane expansion of the driver sugar cane in
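
    The Runs test component is straightforward to reproduce. A minimal sketch of the Wald-Wolfowitz runs test under the normal approximation, applied to a hypothetical time series of an estimated model-structure parameter:

```python
import numpy as np
from math import erf, sqrt

def runs_test(series):
    """Wald-Wolfowitz runs test (normal approximation). Returns
    (z, two-sided p): a small p suggests non-randomness in the
    series, i.e., a candidate systemic change."""
    x = np.asarray(series, dtype=float)
    s = x > np.median(x)                      # dichotomize the series
    n1, n2 = int(s.sum()), int((~s).sum())
    runs = 1 + int(np.count_nonzero(s[1:] != s[:-1]))
    mu = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    z = (runs - mu) / sqrt(var)
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Hypothetical time series of a model parameter estimated per period.
theta = [0.31, 0.30, 0.33, 0.32, 0.45, 0.47, 0.46, 0.48, 0.47, 0.49]
z, p = runs_test(theta)
print(f"z = {z:.2f}, p = {p:.3f}")  # low p: variation not random
```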

  6. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, an additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite block) is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole reactor problem using the HTTR safety test data, such as control rod withdrawal tests and loss-of-forced-convection tests.

  7. Propeller aircraft interior noise model utilization study and validation

    Science.gov (United States)

    Pope, L. D.

    1984-01-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  8. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  9. Dynamic validation of the Planck/LFI thermal model

    CERN Document Server

    Tomasi, M; Gregorio, A; Colombo, F; Lapolla, M; Terenzi, L; Morgante, G; Bersanelli, M; Butler, R C; Galeotta, S; Mandolesi, N; Maris, M; Mennella, A; Valenziano, L; Zacchei, A; 10.1088/1748-0221/5/01/T01002

    2010-01-01

    The Low Frequency Instrument (LFI) is an array of cryogenically cooled radiometers on board the Planck satellite, designed to measure the temperature and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44 and 70 GHz. The thermal requirements of the LFI, and in particular the stringent limits to acceptable thermal fluctuations in the 20 K focal plane, are a critical element to achieve the instrument scientific performance. Thermal tests were carried out as part of the on-ground calibration campaign at various stages of instrument integration. In this paper we describe the results and analysis of the tests on the LFI flight model (FM) performed at Thales Laboratories in Milan (Italy) during 2006, with the purpose of experimentally sampling the thermal transfer functions and consequently validating the numerical thermal model describing the dynamic response of the LFI focal plane. This model has been used extensively to assess the ability of LFI to achieve its scientific goals: its valid...

  10. Validation of a finite element model of the human metacarpal.

    Science.gov (United States)

    Barker, D S; Netherway, D J; Krishnan, J; Hearn, T C

    2005-03-01

    Implant loosening and mechanical failure of components are frequently reported following metacarpophalangeal (MCP) joint replacement. Studies of the mechanical environment of the MCP implant-bone construct are rare. The objective of this study was to evaluate the predictive ability of a finite element model of the intact second human metacarpal to provide a validated baseline for further mechanical studies. A right index human metacarpal was subjected to torsion and combined axial/bending loading using strain gauge (SG) and 3D finite element (FE) analysis. Four different representations of bone material properties were considered. Regression analyses were performed comparing maximum and minimum principal surface strains taken from the SG and FE models. Regression slopes close to unity and high correlation coefficients were found when the diaphyseal cortical shell was modelled as anisotropic and cancellous bone properties were derived from quantitative computed tomography. The inclusion of anisotropy for cortical bone was strongly influential in producing high model validity whereas variation in methods of assigning stiffness to cancellous bone had only a minor influence. The validated FE model provides a tool for future investigations of current and novel MCP joint prostheses.
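
    The acceptance criterion used here (regression slope close to unity and a high correlation coefficient between strain gauge and finite element strains) takes only a few lines to check. A sketch with made-up strain values:

```python
import numpy as np

# Hypothetical principal surface strains (microstrain):
# strain-gauge measurements vs. finite element predictions.
sg = np.array([-812, -430, -155, 120, 385, 640, 905])
fe = np.array([-798, -441, -162, 131, 371, 655, 918])

slope, intercept = np.polyfit(sg, fe, 1)   # fe regressed on sg
r = np.corrcoef(sg, fe)[0, 1]
print(f"slope = {slope:.3f} (unity is ideal), "
      f"intercept = {intercept:.1f}, r^2 = {r**2:.4f}")
```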

  11. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns.

    Directory of Open Access Journals (Sweden)

    Guillaume Chérel

    Full Text Available Models of emergent phenomena are designed to provide an explanation to global-scale phenomena from local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must not only be based on corroboration, but also on attempting to falsify the model, i.e. making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city system dynamics to explore the model's predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights on potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic.
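
    The core loop of a Novelty Search is compact: score each candidate parameter set by the distance of its behaviour descriptor from behaviours already archived, and keep the most novel. A generic minimal sketch (the paper's Pattern Space Exploration method is more elaborate); the behaviour function below is a hypothetical stand-in for running a model and summarizing its output pattern:

```python
import numpy as np

def behavior(params):
    # Stand-in for running the model and reducing its output
    # pattern to a low-dimensional descriptor vector.
    return np.array([np.sin(params[0]) + params[1],
                     np.cos(params[1]) * params[0]])

def novelty(b, archive, k=3):
    """Mean distance to the k nearest behaviours in the archive."""
    d = np.sort(np.linalg.norm(np.asarray(archive) - b, axis=1))
    return d[:k].mean()

rng = np.random.default_rng(1)
archive = [behavior(rng.uniform(-2, 2, size=2))]
for _ in range(200):
    candidates = rng.uniform(-2, 2, size=(20, 2))
    behaviors = [behavior(p) for p in candidates]
    scores = [novelty(b, archive) for b in behaviors]
    archive.append(behaviors[int(np.argmax(scores))])  # keep most novel
print(f"{len(archive)} behaviours archived")
```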

  12. Using an Instrumented Mine to Validate Models Predicting Mine Burial

    Science.gov (United States)

    2016-06-07

    surface-gravity-wave-induced momentary and cyclic liquefaction models. APPROACH: In FY98 we designed and constructed a mine analogue instrumented with... [figure residue omitted] RELATED PROJECTS: 1 - The 1999 NRL 6.1 High-Frequency Acoustics Spatial Variability field

  13. Predicting the ungauged basin: model validation and realism assessment

    Science.gov (United States)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment

  14. A verification and validation process for model-driven engineering

    Science.gov (United States)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

  15. A decision model for energy companies that sorts projects, classifies the project manager and recommends the final match between project and project manager

    Directory of Open Access Journals (Sweden)

    Elaine Cristina Batista de Oliveira

    2016-03-01

    Full Text Available This study presents an integrated model to support the process of classifying projects and selecting project managers for these projects in accordance with their characteristics and skills, using a multiple criteria decision aid (MCDA) approach. Such criteria are often conflicting. The model also supports the process of allocating project managers to projects by evaluating the characteristics/types of projects. The framework consists of a set of structured techniques and methods that are deemed very appropriate within the context of project management. A practical application of the proposed model was performed in a Brazilian electric energy company, which has a portfolio of projects that are specifically related to the company's defined strategic plan. As a result, it was possible to classify the projects and project managers into definable categories, thus enabling more effective management, as different projects require different levels of skills and abilities.

  16. Cloud radiative effects and changes simulated by the Coupled Model Intercomparison Project Phase 5 models

    Science.gov (United States)

    Shin, Sun-Hee; Kim, Ok-Yeon; Kim, Dongmin; Lee, Myong-In

    2017-07-01

    Using 32 CMIP5 (Coupled Model Intercomparison Project Phase 5) models, this study examines the veracity of the simulation of cloud amount and the associated cloud radiative effects (CREs) in the historical run driven by observed external radiative forcing for 1850-2005, and their future changes in the RCP (Representative Concentration Pathway) 4.5 scenario runs for 2006-2100. Validation metrics for the historical run are designed to examine the accuracy in the representation of spatial patterns for climatological mean, and annual and interannual variations of clouds and CREs. The models show large spread in the simulation of cloud amounts, specifically in the low cloud amount. The observed relationship between cloud amount and the controlling large-scale environment is also reproduced diversely by the various models. Based on the validation metrics, four models—ACCESS1.0, ACCESS1.3, HadGEM2-CC, and HadGEM2-ES—are selected as the best-performing models, and the average of the four models performs more skillfully than the multimodel ensemble average. All models project global-mean SST warming as greenhouse gases increase, but the magnitude varies between 1 and 2 K across the simulations, which is largely attributable to differences in the change of cloud amount and distribution. The models that simulate more SST warming show a greater increase in the net CRE due to reduced low cloud and increased incoming shortwave radiation, particularly over the regions of the marine boundary layer in the subtropics. The selected best-performing models project a significant reduction in global-mean cloud amount of about -0.99% K-1 and net radiative warming of 0.46 W m-2 K-1, suggesting a positive feedback role in global warming.

  17. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))

    2009-06-15

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, taking advantage of recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data are not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and those from the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data are available, then a mixed hardening model should be used.

  18. Validation of Advanced EM Models for UXO Discrimination

    CERN Document Server

    Weichman, Peter B

    2012-01-01

    The work reported here details basic validation of our advanced physics-based EMI forward and inverse models against data collected by the NRL TEMTADS system. The data was collected under laboratory-type conditions using both artificial spheroidal targets and real UXO. The artificial target models are essentially exact, and enable detailed comparison of theory and data in support of measurement platform characterization and target identification. Real UXO targets cannot be treated exactly, but it is demonstrated that quantitative comparisons of the data with the spheroid models nevertheless aids in extracting key target discrimination information, such as target geometry and hollow target shell thickness.

  19. Rocket Combustor Validation Data for Advanced Combustion Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The pace and cost of developing an engine system for future explorations is strongly influenced by the inadequacies of design tools and the supporting databases. The...

  20. Project Report of Virtual Experiments in Marine Bioacoustics: Model Validation

    Science.gov (United States)

    2010-08-01


  1. Experimental validation of a solar-chimney power plant model

    Science.gov (United States)

    Fathi, Nima; Wayne, Patrick; Trueba Monje, Ignacio; Vorobieff, Peter

    2016-11-01

    In a solar chimney power plant system (SCPPS), the energy of buoyant hot air is converted to electrical energy. SCPPS includes a collector at ground level covered with a transparent roof. Solar radiation heats the air inside and the ground underneath. There is a tall chimney at the center of the collector, and a turbine located at the base of the chimney. Lack of detailed experimental data for validation is one of the important issues in modeling this type of power plant. We present a small-scale experimental prototype developed to perform validation analysis for modeling and simulation of SCPPS. Detailed velocity measurements are acquired using particle image velocimetry (PIV) at a prescribed Reynolds number. Convection is driven by a temperature-controlled hot plate at the bottom of the prototype. Velocity field data are used to perform validation analysis and measure any mismatch between the experimental results and the CFD data. CFD code verification is also performed to assess the uncertainty of the numerical model with respect to our grid and the applied mathematical model. The dimensionless output power of the prototype is calculated and compared with a recent analytical solution and the experimental results.

  2. Assessment of the Value, Impact, and Validity of the Jobs and Economic Development Impacts (JEDI) Suite of Models

    Energy Technology Data Exchange (ETDEWEB)

    Billman, L.; Keyser, D.

    2013-08-01

    The Jobs and Economic Development Impacts (JEDI) models, developed by the National Renewable Energy Laboratory (NREL) for the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), use input-output methodology to estimate gross (not net) jobs and economic impacts of building and operating selected types of renewable electricity generation and fuel plants. This analysis provides the DOE with an assessment of the value, impact, and validity of the JEDI suite of models. While the models produce estimates of jobs, earnings, and economic output, this analysis focuses only on jobs estimates. This validation report includes an introduction to the JEDI models, an analysis of the value and impact of the JEDI models, and an analysis of the validity of job estimates generated by the JEDI models through comparison to other modeled estimates and to empirical, observed jobs data as reported or estimated for a commercial project, a state, or a region.

  3. Army Synthetic Validity Project: Report of Phase 2 Results. Volume 1

    Science.gov (United States)

    1990-06-01

    [Table fragment] Interest in Efficiency & Org.: N = 394, M = 198.59, SD = 28.42; Performance Criterion Measure, CEP: Core Technical Prof.: N = 394, M = 102.72, SD = 15.03. [Reference fragments] "...measures: The modeling of performance." Paper presented at the Second Annual Conference of the Society for Industrial and Organizational Psychology, Atlanta. Peterson, N. G., Owens-Kurtz, C. K., Hoffman, R. G., Arabian, J. M., & Whetzel, D. L

  4. Calibration of Predictor Models Using Multiple Validation Experiments

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitute a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value, but instead it casts the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain.
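
    As a toy illustration of the interval predictor idea (a deliberate simplification of the optimization-based formulations in the paper), the sketch below takes a least-squares line as the centre of the interval model and widens a constant-width band just enough to contain every observation, i.e., the minimal spread for this model class:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 40)
y = 1.5 * x + 0.3 + 0.1 * rng.standard_normal(40)  # noisy observations

# Centre of the interval model: the least-squares line.
a, b = np.polyfit(x, y, 1)
center = a * x + b
# Minimal half-width so [center - w, center + w] covers all data.
w = np.max(np.abs(y - center))
print(f"IPM: y(x) in [{a:.2f}x + {b:.2f} - {w:.3f}, "
      f"{a:.2f}x + {b:.2f} + {w:.3f}]")
```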

  5. Finite Element Model and Validation of Nasal Tip Deformation.

    Science.gov (United States)

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian J F

    2017-03-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach in understanding the mechanics and nuances of the nasal tip support and eventual nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded state. The average Hausdorff distance between actual measurements and FE simulations across the nose were 0.39 ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow.
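
    Point-cloud comparisons of this kind reduce to nearest-neighbour distance computations. A minimal sketch using SciPy, with synthetic stand-ins for the photogrammetry and FE surface point clouds:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
measured = rng.uniform(size=(500, 3))                       # stand-in cloud
simulated = measured + 0.4e-3 * rng.standard_normal((500, 3))

def directed_distances(a, b):
    """Distance from each point of cloud a to its nearest point in b."""
    return cKDTree(b).query(a)[0]

d_ab = directed_distances(measured, simulated)
d_ba = directed_distances(simulated, measured)
print(f"average symmetric distance: {np.mean([d_ab.mean(), d_ba.mean()]):.5f}")
print(f"Hausdorff distance (max deviation): {max(d_ab.max(), d_ba.max()):.5f}")
```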

  6. Validation of the Osteopenia Sheep Model for Orthopaedic Biomaterial Research

    DEFF Research Database (Denmark)

    Ding, Ming; Danielsen, C.C.; Cheng, L.

    2009-01-01

    Introduction: Currently, the majority of orthopaedic prosthesis and biomaterial research has been based on investigations in normal animals. In most clinical situations, most...... resemble osteoporosis in humans. This study aimed to validate a glucocorticoid-induced osteopenia sheep model for orthopaedic implant and biomaterial research. We hypothesized that a 7-month GC treatment together with a restricted diet, but without OVX, would induce osteopenia. Materials and Methods: Eighteen...

  7. Seine estuary modelling and AirSWOT measurements validation

    Science.gov (United States)

    Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

    2013-04-01

    In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future satellite SWOT (Surface Water and Ocean Topography) mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: KaSPAR, the Ka-band SWOT Phenomenology Airborne Radar. Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream main river channel (to quantify river-aquifer exchange) and some wetlands. The objective of the present work is to validate the capability of AirSWOT and SWOT using Seine estuary hydrodynamic modelling. In this context, field measurements will be collected by different teams such as the GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), ADES (Data Access Groundwater), etc. These datasets will be used first to validate AirSWOT measurements locally, and then to improve hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows, etc.) for 2D validation of AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. For this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by first modelling and then combining two different regions: first the Seine River and its estuarine area, and secondly the English Channel. These two simulations are currently being

  8. WEPP model implementation project with the USDA-Natural Resources Conservation Service

    Science.gov (United States)

    The Water Erosion Prediction Project (WEPP) is a physical process-based soil erosion model that can be used to estimate runoff, soil loss, and sediment yield from hillslope profiles, fields, and small watersheds. Initially developed from 1985-1995, WEPP has been applied and validated across a wide r...

  9. Certified reduced basis model validation: A frequentistic uncertainty framework

    OpenAIRE

    Huynh, Dinh Bao Phuong; Knezevic, David; Patera, Anthony T.

    2011-01-01

    We introduce a frequentistic validation framework for assessment — acceptance or rejection — of the consistency of a proposed parametrized partial differential equation model with respect to (noisy) experimental data from a physical system. Our method builds upon the Hotelling T[superscript 2] statistical hypothesis test for bias first introduced by Balci and Sargent in 1984 and subsequently extended by McFarland and Mahadevan (2008). Our approach introduces two new elements: a spectral repre...

  10. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian

    2010-01-01

    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases...... across the boundary layer near the trailing edge and to a lesser extent by a smaller boundary layer displacement thickness. ©2010 American Society of Mechanical Engineers...

  11. Projections of global changes in precipitation extremes from Coupled Model Intercomparison Project Phase 5 models

    NARCIS (Netherlands)

    Toreti, A.; Naveau, P.; Zampieri, M.; Schindler, A.; Scoccimarro, E.; Xoplaki, E.; Dijkstra, H.A.; Gualdi, S.; Luterbacher, J.

    2013-01-01

    Precipitation extremes are expected to increase in a warming climate; thus, it is essential to characterize their potential future changes. Here we evaluate eight high-resolution global climate model simulations in the twentieth century and provide new evidence on projected global precipitation

  12. Projections of global changes in precipitation extremes from Coupled Model Intercomparison Project Phase 5 models

    NARCIS (Netherlands)

    Toreti, A.; Naveau, P.; Zampieri, M.; Schindler, A.; Scoccimarro, E.; Xoplaki, E.; Dijkstra, H.A.; Gualdi, S.; Luterbacher, J.

    2013-01-01

    Precipitation extremes are expected to increase in a warming climate; thus, it is essential to characterize their potential future changes. Here we evaluate eight high-resolution global climate model simulations in the twentieth century and provide new evidence on projected global precipitation extr

  13. The sigma model on complex projective superspaces

    Energy Technology Data Exchange (ETDEWEB)

    Candu, Constantin; Mitev, Vladimir; Schomerus, Volker [DESY, Hamburg (Germany). Theory Group; Quella, Thomas [Amsterdam Univ. (Netherlands). Inst. for Theoretical Physics; Saleur, Hubert [CEA Saclay, 91 - Gif-sur-Yvette (France). Inst. de Physique Theorique; USC, Los Angeles, CA (United States). Physics Dept.

    2009-08-15

    The sigma model on projective superspaces CP^(S-1|S) gives rise to a continuous family of interacting 2D conformal field theories which are parametrized by the curvature radius R and the theta angle θ. Our main goal is to determine the spectrum of the model, non-perturbatively, as a function of both parameters. We succeed in doing so for all open boundary conditions preserving the full global symmetry of the model. In string theory parlance, these correspond to volume-filling branes that are equipped with a monopole line bundle and connection. The paper consists of two parts. In the first part, we approach the problem within the continuum formulation. Combining combinatorial arguments with perturbative studies and some simple free field calculations, we determine a closed formula for the partition function of the theory. This is then tested numerically in the second part. There we propose a spin chain regularization of the CP^(S-1|S) model with open boundary conditions and use it to determine the spectrum at the conformal fixed point. The numerical results are in remarkable agreement with the continuum analysis. (orig.)

  14. The Lunar Mapping and Modeling Project Update

    Science.gov (United States)

    Noble, S.; French, R.; Nall, M.; Muery, K.

    2010-01-01

    The Lunar Mapping and Modeling Project (LMMP) is managing the development of a suite of lunar mapping and modeling tools and data products that support lunar exploration activities, including the planning, design, development, test, and operations associated with crewed and/or robotic operations on the lunar surface. In addition, LMMP should prove to be a convenient and useful tool for scientific analysis and for education and public outreach (E/PO) activities. LMMP will utilize data predominately from the Lunar Reconnaissance Orbiter, but also historical and international lunar mission data (e.g. Lunar Prospector, Clementine, Apollo, Lunar Orbiter, Kaguya, and Chandrayaan-1) as available and appropriate. LMMP will provide such products as image mosaics, DEMs, hazard assessment maps, temperature maps, lighting maps and models, gravity models, and resource maps. We are working closely with the LRO team to prevent duplication of efforts and ensure the highest quality data products. A beta version of the LMMP software was released for limited distribution in December 2009, with the public release of version 1 expected in the Fall of 2010.

  15. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point is correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closeness of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
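
    Among the checks listed, the planarity test is representative: fit a best plane through a polygon's vertices and compare the largest out-of-plane deviation against a tolerance. A minimal sketch (the tolerance value is illustrative, not a CityDoctor default):

```python
import numpy as np

def is_planar(vertices, tol=0.01):
    """True if all vertices lie within tol (metres) of the best-fit
    plane, found via SVD of the centred coordinates."""
    pts = np.asarray(vertices, dtype=float)
    centred = pts - pts.mean(axis=0)
    # Plane normal = right singular vector with the smallest singular value.
    normal = np.linalg.svd(centred)[2][-1]
    return float(np.max(np.abs(centred @ normal))) <= tol

# Hypothetical roof polygon with a 2 cm bump in one corner.
roof = [(0, 0, 10.0), (4, 0, 10.0), (4, 3, 10.02), (0, 3, 10.0)]
print("roof polygon planar:", is_planar(roof))
```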

  16. Validating induced seismicity forecast models - Induced Seismicity Test Bench

    CERN Document Server

    Kiraly-Proag, Eszter; Gischig, Valentin; Wiemer, Stefan; Karvounis, Dimitrios; Doetsch, Joseph

    2016-01-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. In this study, we propose an Induced Seismicity Test Bench to test and rank such models; this test bench can be used for model development, model selection, and ensemble model building. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models: Shapiro and Smoothed Seismicity (SaSS) and Hydraulics and Seismics (HySei). These models incorporate a different mix of physics-based elements and stochastic representation of the induced sequences. Our results show that neither model is fully superior to the other. Generally, HySei forecasts the seismicity rate better after shut-in, but is only mediocre at forecasting the spatial distri...

  17. A validated approach for modeling collapse of steel structures

    Science.gov (United States)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during a lifetime of the structure, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state-of-the-art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models expressed as functions of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are

  18. Modelling in Evaluating a Working Life Project in Higher Education

    Science.gov (United States)

    Sarja, Anneli; Janhonen, Sirpa; Havukainen, Pirjo; Vesterinen, Anne

    2012-01-01

    This article describes an evaluation method based on collaboration between a higher education institution, a care home and a university in an R&D project. The aim of the project was to elaborate modelling as a tool of developmental evaluation for innovation and competence in project cooperation. The approach was based on activity theory. Modelling enabled a…

  19. Pataha Creek Model Watershed : 1998 Habitat Conservation Projects.

    Energy Technology Data Exchange (ETDEWEB)

    Bartels, Duane G.

    1999-12-01

    The projects outlined in detail in the attached project reports are a few of the many projects implemented in the Pataha Creek Model Watershed since it was selected as a model watershed in 1993. In 1998, a focused effort was made on upland conservation practices to reduce sedimentation into Pataha Creek.

  20. Full-scale validation of a model of algal productivity.

    Science.gov (United States)

    Béchet, Quentin; Shilton, Andy; Guieysse, Benoit

    2014-12-02

    While modeling algal productivity outdoors is crucial to assess the economic and environmental performance of full-scale cultivation, most of the models hitherto developed for this purpose have not been validated under fully relevant conditions, especially with regard to temperature variations. The objective of this study was to independently validate a model of algal biomass productivity accounting for both light and temperature and constructed using parameters experimentally derived from short-term indoor experiments. To do this, the accuracy of a model developed for Chlorella vulgaris was assessed against data collected from photobioreactors operated outdoors (New Zealand) over different seasons, years, and operating conditions (temperature-control/no temperature-control, batch, and fed-batch regimes). The model accurately predicted experimental productivities under all conditions tested, yielding an overall accuracy of ±8.4% over 148 days of cultivation. For the purpose of assessing the feasibility of full-scale algal cultivation, the use of the productivity model was therefore shown to markedly reduce uncertainty in the cost of biofuel production while also eliminating uncertainties in water demand, a critical element of environmental impact assessments. Simulations at five climatic locations demonstrated that temperature control in outdoor photobioreactors would require tremendous amounts of energy without a considerable increase in algal biomass. Prior assessments neglecting the impact of temperature variations on algal productivity in photobioreactors may therefore be erroneous.

  1. Validation of a Hertzian contact model with nonlinear damping

    Science.gov (United States)

    Sierakowski, Adam

    2015-11-01

    Due to limited spatial resolution, most disperse particle simulation methods rely on simplified models for incorporating short-range particle interactions. In this presentation, we introduce a contact model that combines the Hertz elastic restoring force with a nonlinear damping force, requiring only material properties and no tunable parameters. We have implemented the model in a resolved-particle flow solver based on the Physalis method, which accurately captures hydrodynamic interactions by analytically enforcing the no-slip condition on the particle surface. We summarize the results of a few numerical studies that suggest the validity of the contact model over a range of particle interaction intensities (i.e., collision Stokes numbers) when compared with experimental data. This work was supported by the National Science Foundation under Grant Number CBET1335965 and the Johns Hopkins University Modeling Complex Systems IGERT program.
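
    The abstract does not state the exact force law. As a minimal illustrative sketch, a Hertz elastic term with stiffness built from material properties, plus one commonly used nonlinear damping form; note that the damping coefficient is left as a free parameter here, whereas the model described above derives it from material properties alone:

        import numpy as np

        def hertz_contact_force(delta, delta_dot, E1, E2, nu1, nu2, R1, R2, damping_coeff):
            """Normal contact force: Hertz elastic term plus nonlinear damping.
            delta: overlap (m); delta_dot: overlap rate (m/s).
            damping_coeff is illustrative, not derived from material properties."""
            if delta <= 0.0:
                return 0.0  # particles not in contact
            E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)  # effective modulus
            R_star = 1.0 / (1.0 / R1 + 1.0 / R2)                    # effective radius
            k = (4.0 / 3.0) * E_star * np.sqrt(R_star)              # Hertz stiffness
            elastic = k * delta**1.5
            # One common nonlinear damping form: proportional to delta^(1/4) * delta_dot
            damping = damping_coeff * delta**0.25 * delta_dot
            return elastic + damping

        # Example: two 1 mm steel spheres, 1 micron overlap, closing at 1 cm/s
        F = hertz_contact_force(1e-6, 0.01, 210e9, 210e9, 0.3, 0.3, 1e-3, 1e-3, 50.0)
        print("contact force: %.3e N" % F)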

  2. Statistical validation of high-dimensional models of growing networks

    CERN Document Server

    Medo, Matus

    2013-01-01

    The abundance of models of complex networks and the current insufficient validation standards make it difficult to judge which models are strongly supported by data and which are not. We focus here on likelihood maximization methods for models of growing networks with many parameters and compare their performance on artificial and real datasets. While high dimensionality of the parameter space harms the performance of direct likelihood maximization on artificial data, this can be improved by introducing a suitable penalization term. Likelihood maximization on real data shows that the presented approach is able to discriminate among available network models. To make large-scale datasets accessible to this kind of analysis, we propose a subset sampling technique and show that it yields substantial model evidence in a fraction of the time necessary for the analysis of the complete data.
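
    To make the penalization idea concrete, a minimal sketch of maximum-likelihood fitting with a quadratic penalty term; the exponential toy model and the L2 penalty are illustrative assumptions, not the growing-network models studied by the authors:

        import numpy as np
        from scipy.optimize import minimize

        def penalized_neg_log_likelihood(theta, data, neg_log_lik, lam):
            # The penalty discourages overfitting in a high-dimensional parameter space.
            return neg_log_lik(theta, data) + lam * np.sum(theta**2)

        def neg_log_lik_exp(theta, data):
            # Toy likelihood: exponential waiting times, rate kept positive via log-parameterization.
            rate = np.exp(theta[0])
            return -np.sum(np.log(rate) - rate * data)

        data = np.random.default_rng(0).exponential(scale=2.0, size=500)
        result = minimize(penalized_neg_log_likelihood, x0=np.zeros(1),
                          args=(data, neg_log_lik_exp, 0.1))
        print("estimated rate:", np.exp(result.x[0]))  # should be near 1/2.0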

  3. Filament winding cylinders. II - Validation of the process model

    Science.gov (United States)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  4. Model selection, identification and validation in anaerobic digestion: a review.

    Science.gov (United States)

    Donoso-Bravo, Andres; Mailier, Johan; Martin, Cristina; Rodríguez, Jorge; Aceves-Lara, César Arturo; Vande Wouwer, Alain

    2011-11-01

    Anaerobic digestion enables waste (water) treatment and energy production in the form of biogas. The successful implementation of this process has led to increasing interest worldwide. However, anaerobic digestion is a complex biological process, where hundreds of microbial populations are involved, and whose start-up and operation are delicate issues. In order to better understand the process dynamics and to optimize the operating conditions, the availability of dynamic models is of paramount importance. Such models have to be inferred from prior knowledge and experimental data collected from real plants. Modeling and parameter identification are vast subjects, offering a wealth of approaches and methods, which can be difficult to fully understand by scientists and engineers dedicated to the plant operation and improvements. This review article discusses existing modeling frameworks and methodologies for parameter estimation and model validation in the field of anaerobic digestion processes. The point of view is pragmatic, intentionally focusing on simple but efficient methods.
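
    As a concrete illustration of the parameter-identification problem discussed in the review, a minimal least-squares fit of a single-substrate Monod growth model; this is a drastic simplification of full anaerobic digestion models, and all names and values are illustrative:

        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import least_squares

        def monod_ode(y, t, mu_max, Ks, Y):
            S, X = y  # substrate and biomass concentrations (g/L)
            mu = mu_max * S / (Ks + S)      # Monod specific growth rate
            return [-mu * X / Y, mu * X]    # substrate consumed, biomass grows

        def residuals(params, t, S_obs):
            mu_max, Ks, Y = params
            sol = odeint(monod_ode, [S_obs[0], 0.1], t, args=(mu_max, Ks, Y))
            return sol[:, 0] - S_obs        # fit the substrate time-series

        t = np.linspace(0, 10, 25)
        S_true = odeint(monod_ode, [5.0, 0.1], t, args=(0.8, 1.2, 0.4))[:, 0]
        S_obs = S_true + np.random.default_rng(1).normal(0, 0.05, t.size)  # synthetic data
        fit = least_squares(residuals, x0=[0.5, 1.0, 0.5], args=(t, S_obs),
                            bounds=([0.01, 0.01, 0.01], [5.0, 10.0, 1.0]))
        print("estimated mu_max, Ks, Y:", fit.x)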

  5. Validation of the WATEQ4 geochemical model for uranium

    Energy Technology Data Exchange (ETDEWEB)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO2(OH)2·H2O), UO2(OH)2, and rutherfordine (UO2CO3) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.
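
    The solubility-control comparisons above rest on saturation-index calculations. A minimal sketch of that computation follows; the actual WATEQ4 activity corrections and thermodynamic data are far more involved, and the numbers below are purely illustrative:

        import math

        def saturation_index(ion_activity_product, k_sp):
            """SI = log10(IAP / Ksp): SI = 0 at equilibrium with the solid,
            SI > 0 oversaturated (solid may precipitate), SI < 0 undersaturated."""
            return math.log10(ion_activity_product / k_sp)

        iap = 3.2e-23   # ion activity product from modeled solute activities (illustrative)
        ksp = 1.0e-22   # solubility product of a candidate uranium solid (illustrative)
        print("SI =", round(saturation_index(iap, ksp), 2))  # negative: undersaturated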

  6. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    Science.gov (United States)

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients), with a fixed metabolic clearance of 0.277 L/h/70 kg, a renal clearance of 0.698 (±0.358) as a fraction of creatinine clearance, and a Vd of 0.312 (±0.076) L/kg of corrected lean body mass. External validation with data from 14 validation-cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid
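
    The MDPE and MDAPE statistics quoted above are straightforward to compute; a minimal sketch, assuming prediction error is expressed as a percentage of the observed concentration (the sign convention may differ from the authors'):

        import numpy as np

        def mdpe_mdape(observed, predicted):
            """Median Prediction Error (bias) and Median Absolute Prediction Error
            (precision), both in percent of the observed value."""
            observed = np.asarray(observed, dtype=float)
            predicted = np.asarray(predicted, dtype=float)
            pe = 100.0 * (predicted - observed) / observed
            return np.median(pe), np.median(np.abs(pe))

        obs = np.array([4.1, 8.0, 1.9, 6.3])    # measured serum levels (mg/L), illustrative
        pred = np.array([3.9, 8.4, 2.0, 6.1])   # model-predicted levels
        print("MDPE %.2f%%, MDAPE %.2f%%" % mdpe_mdape(obs, pred))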

  7. US-VISIT Independent Verification and Validation Project: Test Bed Establishment Report

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, N W; Gansemer, J D

    2011-01-21

    This document describes the computational and data systems available at the Lawrence Livermore National Laboratory for use on the US-VISIT Independent Verification and Validation (IV&V) project. This system, composed of data, software and hardware, is designed to be as close a representation of the operational ADIS system as is required to verify and validate US-VISIT methodologies. It is not required to reproduce the computational capabilities of the enterprise-class operational system. During FY10, the test bed was simplified from the FY09 version by reducing the number of database host computers from three to one, significantly reducing maintenance and overhead while simultaneously increasing system throughput. During the current performance period, a database transfer was performed as a set of Data Pump Export files. The previous RMAN backup from 2007 required the availability of an AIX system, which is not required when using Data Pump. Due to efficiencies in the new system and process, loading of the database refresh was accomplished in a much shorter time frame than was previously required. The FY10 Oracle Test Bed now consists of a single Linux platform hosting two Oracle databases, including the 2007 copy as well as the October 2010 refresh.

  8. Validation of the European Cyberbullying Intervention Project Questionnaire for Colombian Adolescents.

    Science.gov (United States)

    Herrera-López, Mauricio; Casas, José A; Romera, Eva M; Ortega-Ruiz, Rosario; Del Rey, Rosario

    2017-02-01

    Cyberbullying is the act of using unjustified aggression to harm or harass via digital devices. Currently regarded as a widespread problem, the phenomenon has attracted growing research interest in different measures of cyberbullying and the similarities and differences across countries and cultures. This article presents the Colombian validation of the European Cyberbullying Intervention Project Questionnaire (ECIPQ) involving 3,830 high school students (M = 13.9 years old, standard deviation = 1.61; 48.9 percent male), of which 1,931 were Colombian and 1,899 Spanish. Confirmatory factor analysis (CFA), content validation, and multigroup analysis were performed with each of the sample subgroups. The optimal fits and psychometric properties obtained confirm the robustness and suitability of the assessment instrument to jointly measure cyber-aggression and cyber-victimization. The results corroborated the theoretical construct and the two-dimensional and universal nature of cyberbullying. The multigroup analysis showed that cyberbullying dynamics are similar in both countries. The comparative analyses of prevalence revealed that Colombian students are less involved in cyberbullying. The results indicate the suitability of the instrument and the advantages of using such a tool to evaluate and guide psychoeducational interventions aimed at preventing cyberbullying in countries where few studies have been performed.

  9. Experimental validation of flexible robot arm modeling and control

    Science.gov (United States)

    Ulsoy, A. Galip

    1989-01-01

    Flexibility is important for high speed, high precision operation of lightweight manipulators. Accurate dynamic modeling of flexible robot arms is needed. Previous work has mostly been based on linear elasticity with prescribed rigid body motions (i.e., no effect of flexible motion on rigid body motion). Little or no experimental validation of dynamic models for flexible arms is available. Experimental results are also limited for flexible arm control. Researchers include the effects of prismatic as well as revolute joints. They investigate the effect of full coupling between the rigid and flexible motions, and of axial shortening, and consider the control of flexible arms using only additional sensors.

  10. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because a complete system knowledge from input voltage to output sound pressure level is required. There are however many advantages that could be harvested from such knowledge like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures...

  11. Validation of a Business Model for Cultural Heritage Institutions

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2015-01-01

    The paper proposes a business model for the efficiency optimization of the interaction between all actors involved in the cultural heritage sector, such as galleries, libraries, archives and museums (GLAM). The validation of the business model is the subject of analyses and implementations in a real environment made by different cultural institutions. The implementation of virtual exhibitions on mobile devices is described and analyzed as a key factor for increasing the cultural heritage visibility. New perspectives on the development of virtual exhibitions for mobile devices are considered. A study on the number of visitors of cultural institutions is carried out and ways to increase the number of visitors are described.

  12. Validating a spatially distributed hydrological model with soil morphology data

    Directory of Open Access Journals (Sweden)

    T. Doppler

    2013-10-01

    Spatially distributed hydrological models are popular tools in hydrology and they are claimed to be useful to support management decisions. Despite the high spatial resolution of the computed variables, calibration and validation is often carried out only on discharge time-series at specific locations due to the lack of spatially distributed reference data. Because of this restriction, the predictive power of these models, with regard to predicted spatial patterns, can usually not be judged. An example of spatial predictions in hydrology is the prediction of saturated areas in agricultural catchments. These areas can be important source areas for the transport of agrochemicals to the stream. We set up a spatially distributed model to predict saturated areas in a 1.2 km2 catchment in Switzerland with moderate topography. Around 40% of the catchment area is artificially drained. We measured weather data, discharge and groundwater levels in 11 piezometers for 1.5 yr. To broaden the spatially distributed data sets that can be used for model calibration and validation, we translated soil morphological data available from soil maps into an estimate of the duration of soil saturation in the soil horizons. We used redox-morphology signs for these estimates. This resulted in a data set with high spatial coverage on which the model predictions were validated. In general, these saturation estimates corresponded well to the measured groundwater levels. We worked with a model that would be applicable for management decisions because of its fast calculation speed and rather low data requirements. We simultaneously calibrated the model to the groundwater levels in the piezometers and discharge. The model was able to reproduce the general hydrological behavior of the catchment in terms of discharge and absolute groundwater levels. However, the accuracy of the groundwater level predictions was not high enough to be used for the prediction of saturated areas

  13. Organic acid modeling and model validation: Workshop summary. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.J.; Eilers, J.M.

    1992-08-14

    A workshop was held in Corvallis, Oregon on April 9-10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally-recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically-selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  14. Organic acid modeling and model validation: Workshop summary

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.J.; Eilers, J.M.

    1992-08-14

    A workshop was held in Corvallis, Oregon on April 9-10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally-recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically-selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  15. Packed bed heat storage: Continuum mechanics model and validation

    Science.gov (United States)

    Knödler, Philipp; Dreißigacker, Volker; Zunft, Stefan

    2016-05-01

    Thermal energy storage (TES) systems are key elements for various types of new power plant concepts. As a possible cost-effective storage inventory option, packed beds of miscellaneous materials come into consideration. However, high technical risks arise due to thermal expansion and shrinking of the packed bed's particles during cyclic thermal operation, possibly leading to material failure. Therefore, suitable tools for designing the heat storage system are mandatory. While particle-discrete models offer detailed simulation results, their computing time for large-scale applications is prohibitive. In contrast, continuous models offer time-efficient simulation results but are in need of effective packed-bed parameters. This work focuses on providing insight into some basic methods and tools on how to obtain such parameters and on how they are implemented into a continuum model. In this context, a particle-discrete model as well as a test rig for carrying out uniaxial compression tests (UCT) is introduced. Experimental validation tests indicate good agreement with simulated UCT results. In this process, effective parameters required for a continuous packed-bed model were identified and used for continuum simulation. This approach is validated by comparing the simulated results with experimental data from another test rig. The presented method significantly simplifies subsequent design studies.
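
    As a sketch of how one effective packed-bed parameter might be extracted from a uniaxial compression test; the actual parameter set and fitting procedure are not detailed in the abstract, so this is purely illustrative:

        import numpy as np

        def effective_modulus(strain, stress):
            """Fit an effective Young's modulus from UCT data by a
            least-squares line through the stress-strain record."""
            slope, _ = np.polyfit(strain, stress, 1)
            return slope  # Pa, if stress in Pa and strain dimensionless

        strain = np.array([0.000, 0.002, 0.004, 0.006, 0.008])
        stress = np.array([0.0, 1.1e6, 2.3e6, 3.4e6, 4.4e6])  # illustrative UCT data
        print("E_eff = %.2e Pa" % effective_modulus(strain, stress))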

  16. Model calibration and validation of an impact test simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, F. M. (François M.); Wilson, A. C. (Amanda C.); Havrilla, G. N. (George N.)

    2001-01-01

    This paper illustrates the methodology being developed at Los Alamos National Laboratory for the validation of numerical simulations for engineering structural dynamics. The application involves the transmission of a shock wave through an assembly that consists of a steel cylinder and a layer of elastomeric (hyper-foam) material. The assembly is mounted on an impact table to generate the shock wave. The input acceleration and three output accelerations are measured. The main objective of the experiment is to develop a finite element representation of the system capable of reproducing the test data with acceptable accuracy. Foam layers of various thicknesses and several drop heights are considered during impact testing. Each experiment is replicated several times to estimate the experimental variability. Instead of focusing on the calibration of input parameters for a single configuration, the numerical model is validated for its ability to predict the response of three different configurations (various combinations of foam thickness and drop height). Design of Experiments is implemented to perform parametric and statistical variance studies. Surrogate models are developed to replace the computationally expensive numerical simulation. Variables of the finite element model are separated into calibration variables and control variables. The models are calibrated to provide numerical simulations that correctly reproduce the statistical variation of the test configurations. The calibration step also provides inference for the parameters of a high strain-rate dependent material model of the hyper-foam. After calibration, the validity of the numerical simulation is assessed through its ability to predict the response of a fourth test setup.

  17. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis as well as from analytical and numerical models by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
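
    A minimal sketch of the image-decomposition idea: treat two full-field strain maps as images, keep a small block of low-order Fourier coefficients as descriptors, and compare model against experiment in descriptor space. The paper also uses Zernike moments; the simple truncation rule below is an assumption:

        import numpy as np

        def fourier_descriptors(field, k=8):
            """Reduce a 2-D full-field map (e.g. strain) to the k x k block of
            lowest-frequency Fourier coefficient magnitudes."""
            F = np.fft.fft2(field)
            return np.abs(F[:k, :k]).ravel()  # ~10^1-10^2 numbers instead of 10^5-10^6 pixels

        rng = np.random.default_rng(2)
        x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
        experiment = np.sin(4 * np.pi * x) * y                          # measured map (illustrative)
        model = experiment + 0.02 * rng.normal(size=experiment.shape)   # FE prediction (illustrative)

        d_exp, d_mod = fourier_descriptors(experiment), fourier_descriptors(model)
        corr = np.corrcoef(d_exp, d_mod)[0, 1]
        print("descriptor correlation: %.4f" % corr)  # quantitative validation metric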

  18. Calibration and validation of DRAINMOD to model bioretention hydrology

    Science.gov (United States)

    Brown, R. A.; Skaggs, R. W.; Hunt, W. F.

    2013-04-01

    Previous field studies have shown that the hydrologic performance of bioretention cells varies greatly because of factors such as underlying soil type, physiographic region, drainage configuration, surface storage volume, drainage area to bioretention surface area ratio, and media depth. To more accurately describe bioretention hydrologic response, a long-term hydrologic model that generates a water balance is needed. Some current bioretention models lack the ability to perform long-term simulations and others have never been calibrated from field monitored bioretention cells with underdrains. All peer-reviewed models lack the ability to simultaneously perform both of the following functions: (1) model an internal water storage (IWS) zone drainage configuration and (2) account for soil-water content using the soil-water characteristic curve. DRAINMOD, a widely-accepted agricultural drainage model, was used to simulate the hydrologic response of runoff entering a bioretention cell. The concepts of water movement in bioretention cells are very similar to those of agricultural fields with drainage pipes, so many bioretention design specifications corresponded directly to DRAINMOD inputs. Detailed hydrologic measurements were collected from two bioretention field sites in Nashville and Rocky Mount, North Carolina, to calibrate and test the model. Each field site had two sets of bioretention cells with varying media depths, media types, drainage configurations, underlying soil types, and surface storage volumes. After 12 months, one of these characteristics was altered - surface storage volume at Nashville and IWS zone depth at Rocky Mount. At Nashville, during the second year (post-repair period), the Nash-Sutcliffe coefficients for drainage and exfiltration/evapotranspiration (ET) both exceeded 0.8 during the calibration and validation periods. During the first year (pre-repair period), the Nash-Sutcliffe coefficients for drainage, overflow, and exfiltration
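
    The Nash-Sutcliffe coefficients reported above follow the standard definition; a minimal sketch with illustrative data:

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
            1.0 is a perfect fit; 0.0 means no better than the observed mean."""
            observed, simulated = np.asarray(observed), np.asarray(simulated)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

        obs = [12.0, 30.5, 8.2, 15.1, 22.4]   # e.g. monthly drainage volumes (mm), illustrative
        sim = [11.1, 28.9, 9.0, 16.2, 21.0]
        print("NSE = %.3f" % nash_sutcliffe(obs, sim))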

  19. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony gives rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's levels of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately-sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and to circadian phase. We will calculate the sensitivity and specificity of each model's predictions under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.
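
    Sensitivity and specificity of a fatigue model's predictions can be tabulated against observed PVT lapses; a minimal sketch with illustrative thresholded data:

        import numpy as np

        def sensitivity_specificity(predicted_fatigued, observed_lapse):
            """Both inputs are boolean arrays: the model flags the pilot as fatigued,
            and the PVT shows a lapse (reaction time beyond threshold)."""
            p, o = np.asarray(predicted_fatigued), np.asarray(observed_lapse)
            tp = np.sum(p & o); fn = np.sum(~p & o)
            tn = np.sum(~p & ~o); fp = np.sum(p & ~o)
            return tp / (tp + fn), tn / (tn + fp)

        pred = np.array([True, True, False, False, True, False])  # illustrative model flags
        obs = np.array([True, False, False, True, True, False])   # illustrative PVT lapses
        sens, spec = sensitivity_specificity(pred, obs)
        print("sensitivity %.2f, specificity %.2f" % (sens, spec))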

  20. Multicomponent aerosol dynamics model UHMA: model development and validation

    Directory of Open Access Journals (Sweden)

    H. Korhonen

    2004-01-01

    A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H2SO4-H2O and ternary H2SO4-NH3-H2O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and atmospheric transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3–4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.

  1. Multicomponent aerosol dynamics model UHMA: model development and validation

    Directory of Open Access Journals (Sweden)

    H. Korhonen

    2004-01-01

    A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H2SO4-H2O and ternary H2SO4-NH3-H2O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3–4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.

  2. Multicomponent aerosol dynamics model UHMA: model development and validation

    Science.gov (United States)

    Korhonen, H.; Lehtinen, K. E. J.; Kulmala, M.

    2004-05-01

    A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H2SO4-H2O and ternary H2SO4-NH3-H2O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and atmospheric transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3-4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.

  3. Solar Module Modeling, Simulation And Validation Under Matlab / Simulink

    Directory of Open Access Journals (Sweden)

    M.Diaw

    2016-09-01

    Solar modules are systems which convert sunlight into electricity using the physics of semiconductors. Mathematical modeling of these systems uses weather data such as irradiance and temperature as inputs. It provides the current, voltage or power as outputs, which allows plotting the characteristic giving the intensity I as a function of the voltage V for photovoltaic cells. In this work, we have developed a one-diode model of a photovoltaic module in the Matlab/Simulink environment. From this model, we have plotted the characteristic I-V and P-V curves of a solar cell for different values of temperature and sunlight. The validation has been done by comparing the experimental power curve of a HORONYA 20 W solar panel with that obtained from the model.
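
    A minimal sketch of the single-diode equation that underlies such models, solved pointwise for the current; the parameter values are illustrative and are not those of the HORONYA panel:

        import numpy as np
        from scipy.optimize import brentq

        def pv_current(V, I_ph, I_0, R_s, R_sh, n, T=298.15):
            """Solve I = I_ph - I_0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh for I."""
            k, q = 1.380649e-23, 1.602176634e-19
            Vt = k * T / q  # thermal voltage (~25.7 mV at 25 degC)
            f = lambda I: (I_ph - I_0 * (np.exp((V + I * R_s) / (n * Vt)) - 1.0)
                           - (V + I * R_s) / R_sh - I)
            # Bracket the root between a small negative current and I_ph plus a margin.
            return brentq(f, -1.0, I_ph + 1.0)

        volts = np.linspace(0.0, 21.0, 8)
        # n = ideality factor (1.3) x 36 series cells; all values illustrative
        amps = [pv_current(v, I_ph=1.3, I_0=3e-8, R_s=0.5, R_sh=300.0, n=46.8) for v in volts]
        for v, i in zip(volts, amps):
            print("V=%5.2f V  I=%6.3f A  P=%6.2f W" % (v, i, v * i))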

  4. Model development and validation of a solar cooling plant

    Energy Technology Data Exchange (ETDEWEB)

    Zambrano, Darine; Garcia-Gabin, Winston [Escuela de Ingenieria Electrica, Facultad de Ingenieria, Universidad de Los Andes, La Hechicera, Merida 5101 (Venezuela); Bordons, Carlos; Camacho, Eduardo F. [Departamento de Ingenieria de Sistemas y Automatica, Escuela Superior de Ingenieros, Universidad de Sevilla, Camino de Los Descubrimientos s/n, Sevilla 41092 (Spain)

    2008-03-15

    This paper describes the dynamic model of a solar cooling plant that has been built for demonstration purposes using market-available technology and has been successfully operational since 2001. The plant uses hot water coming from a field of solar flat collectors which feed a single-effect absorption chiller of 35 kW nominal cooling capacity. The work includes model development based on first principles and model validation with a set of experiments carried out on the real plant. The simulation model has been done in a modular way, and can be adapted to other solar cooling-plants since the main modules (solar field, absorption machine, accumulators and auxiliary heater) can be easily replaced. This simulator is a powerful tool for solar cooling systems both during the design phase, when it can be used for component selection, and also for the development and testing of control strategies. (author)

  5. Dynamic Modeling of Wind Turbine Gearboxes and Experimental Validation

    DEFF Research Database (Denmark)

    Pedersen, Rune

    is presented. The model takes into account the effects of load and applied grinding corrections. The results are verified by comparing to simulated and experimental results reported in the existing literature. Using gear data loosely based on a 1 MW wind turbine gearbox, the gear mesh stiffness is expanded ... analysis in relation to gear dynamics. A multibody model of two complete 2.3 MW wind turbine gearboxes mounted back-to-back in a test rig is built. The mean values of the proposed gear mesh stiffnesses are included. The model is validated by comparing with calculated and measured eigenfrequencies and mode shapes. The measured eigenfrequencies have been identified in accelerometer signals obtained during run-up tests. Since the calculated eigenfrequencies do not match the measured eigenfrequencies with sufficient accuracy, a model updating technique is applied to ensure a better match by adjusting...

  6. Characterizing explanatory models of illness in healthcare: development and validation of the CONNECT instrument.

    Science.gov (United States)

    Haidet, Paul; O'Malley, Kimberly J; Sharf, Barbara F; Gladney, Alicia P; Greisinger, Anthony J; Street, Richard L

    2008-11-01

    A growing body of qualitative and quantitative research suggests that individual patients and physicians often have differing perspectives, or 'explanatory models,' regarding the patient's health condition or illness. Discordance between explanatory models may lead to difficulties in communication and poor disease outcomes. However, due to a lack of tools to systematically measure concordance in patient and physician explanatory models, a large-scale study of explanatory models of illness has not been previously possible. The objective of this project was to develop and pilot-test a survey-based tool (the CONNECT Instrument) that measures salient aspects of explanatory models of illness. We conducted a multi-method survey development project that included qualitative and quantitative item development, refinement, pilot testing, and psychometric evaluation. We evaluated the instrument in two unique, consecutive cohorts of primary care patients in a variety of private and public settings in Houston, TX. We also used the instrument to examine concordance between patient and physician explanatory models in the second cohort. The final version of the CONNECT Instrument contains nineteen items that focus on six dimensions of explanatory models. Cronbach alphas ranged from 0.65 to 0.89 for the six CONNECT dimensions. The instrument demonstrated evidence of criterion-related validity when individual CONNECT dimension scores were compared with scores from previously published instruments, and demonstrated expected differences between patients' and physicians' explanatory models of illness. The CONNECT instrument is a tool with good psychometric properties that enables researchers to measure important aspects of patients' and physicians' explanatory models of illness. Our continuing work will focus on gathering additional validity evidence and evaluating associations between explanatory model concordance and health outcomes. The CONNECT instrument can be used to improve
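
    The Cronbach alphas quoted above measure the internal consistency of each dimension's items; a minimal sketch with illustrative item scores:

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, n_items) array of scores for one dimension.
            alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_vars / total_var)

        scores = np.array([[4, 5, 4], [2, 2, 3], [5, 4, 5], [3, 3, 2], [4, 4, 4]])
        print("alpha = %.2f" % cronbach_alpha(scores))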

  7. Deviatoric constitutive model: domain of strain rate validity

    Energy Technology Data Exchange (ETDEWEB)

    Zocher, Marvin A [Los Alamos National Laboratory

    2009-01-01

    A case is made for using an enhanced methodology in determining the parameters that appear in a deviatoric constitutive model. Predictability rests on our ability to solve a properly posed initial boundary value problem (IBVP), which incorporates an accurate reflection of material constitutive behavior. That reflection is provided through the constitutive model. Moreover, the constitutive model is required for mathematical closure of the IBVP. Common practice in the shock physics community is to divide the Cauchy tensor into spherical and deviatoric parts, and to develop separate models for spherical and deviatoric constitutive response. Our focus shall be on the Cauchy deviator and deviatoric constitutive behavior. Discussions related to the spherical part of the Cauchy tensor are reserved for another time. A number of deviatoric constitutive models have been developed for utilization in the solution of IBVPs that are of interest to those working in the field of shock physics. All of these models are phenomenological and contain a number of parameters that must be determined in light of experimental data. The methodology employed in determining these parameters dictates the loading regime over which the model can be expected to be accurate. The focus of this paper is the methodology employed in determining model parameters and the consequences of that methodology as it relates to the domain of strain rate validity. We shall begin by describing the methodology that is typically employed. We shall discuss limitations imposed upon predictive capability by the typically employed methodology. We shall propose a modification to the typically employed methodology that significantly extends the domain of strain rate validity.

  8. Modeling, Robust Control, and Experimental Validation of a Supercavitating Vehicle

    Science.gov (United States)

    Escobar Sanabria, David

    This dissertation considers the mathematical modeling, control under uncertainty, and experimental validation of an underwater supercavitating vehicle. By traveling inside a gas cavity, a supercavitating vehicle reduces hydrodynamic drag, increases speed, and minimizes power consumption. The attainable speed and power efficiency make these vehicles attractive for undersea exploration, high-speed transportation, and defense. However, the benefits of traveling inside a cavity come with difficulties in controlling the vehicle dynamics. The main challenge is the nonlinear force that arises when the back-end of the vehicle pierces the cavity. This force, referred to as planing, leads to oscillatory motion and instability. Control technologies that are robust to planing and suited for practical implementation need to be developed. To enable these technologies, a low-order vehicle model that accounts for inaccuracy in the characterization of planing is required. Additionally, an experimental method to evaluate possible pitfalls in the models and controllers is necessary before undersea testing. The major contribution of this dissertation is a unified framework for mathematical modeling, robust control synthesis, and experimental validation of a supercavitating vehicle. First, we introduce affordable experimental methods for mathematical modeling and controller testing under planing and realistic flow conditions. Then, using experimental observations and physical principles, we create a low-order nonlinear model of the longitudinal vehicle motion. This model quantifies the planing uncertainty and is suitable for robust controller synthesis. Next, based on the vehicle model, we develop automated tools for synthesizing controllers that deliver a certificate of performance in the face of nonlinear and uncertain planing forces. We demonstrate theoretically and experimentally that the proposed controllers ensure higher performance when the uncertain planing dynamics are

  9. Validation of thermal models for a prototypical MEMS thermal actuator.

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

    2008-09-01

    This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the

  10. Black liquor combustion validated recovery boiler modeling, five-year report

    Energy Technology Data Exchange (ETDEWEB)

    Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

    1996-08-01

    The objective of this project was to develop a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The project originated in October 1990 and was scheduled to run for four years. At that time, there was considerable emphasis on developing accurate predictions of the physical carryover of macroscopic particles of partially burnt black liquor and smelt droplets out of the furnace, since this was seen as the main cause of boiler plugging. This placed a major emphasis on gas flow patterns within the furnace and on the mass loss rates and swelling and shrinking rates of burning black liquor drops. As work proceeded on developing the recovery boiler furnace model, it became apparent that some recovery boilers encounter serious plugging problems even when physical carryover was minimal. After the original four-year period was completed, the project was extended to address this issue. The objective of the extended project was to improve the utility of the models by including the black liquor chemistry relevant to air emissions predictions and aerosol formation, and by developing the knowledge base and computational tools to relate furnace model outputs to fouling and plugging of the convective sections of the boilers. The work done to date includes CFD model development and validation, acquisition of information on black liquor combustion fundamentals and development of improved burning models, char bed model development, and model application and simplification.

  11. Validation and Scenario Analysis of a Soil Organic Carbon Model

    Institute of Scientific and Technical Information of China (English)

    HUANG Yao; LIU Shi-liang; SHEN Qi-rong; ZONG Liang-gang; JIANG Ding-an; HUANG Hong-guang

    2002-01-01

    A model developed by the authors was validated against independent data sets. The data sets were obtained from field experiments of crop residue decomposition and a 7-year soil improvement experiment in Yixing City, Jiangsu Province. Model validation indicated that soil organic carbon dynamics can be simulated from the weather variables of temperature, sunlight and precipitation, soil clay content and bulk density, grain yield of previous crops, and the qualities and quantities of the added organic matter. Model simulations in general agreed with the measurements. The comparison between computed and measured values yielded correlation coefficients (r2) of 0.9291*** (n = 48) and 0.6431** (n = 65) for the two experiments, respectively. Model predictions under three scenarios (no additional organic matter input, and annual incorporation of rice and wheat straw at rates of 6.75 t/ha and 9.0 t/ha) suggested that the soil organic carbon in Wanshi Township of Yixing City would change from an initial value of 7.85 g/kg in 1983 to 6.30 g/kg, 11.42 g/kg and 13 g/kg, respectively, in 2014. Consequently, the total nitrogen content of the soil was predicted to be 0.49 g/kg, 0.89 g/kg and 1.01 g/kg, respectively, under the three scenarios.

  12. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    Science.gov (United States)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-08-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of this current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a peak efficiency of 7%.
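
    A minimal sketch of the Seebeck relation and a matched-load power estimate for a thermoelectric generator; the property values are illustrative and are not those of the prototype STEG:

        def teg_output(seebeck, T_hot, T_cold, R_internal, R_load):
            """Open-circuit voltage V = S * dT; power delivered to the load
            P = V^2 * R_load / (R_internal + R_load)^2, maximal when R_load = R_internal."""
            dT = T_hot - T_cold
            V_oc = seebeck * dT
            P = V_oc**2 * R_load / (R_internal + R_load) ** 2
            return V_oc, P

        # Illustrative Bi2Te3-like values: S = 400 uV/K, 180 K temperature difference
        V, P = teg_output(seebeck=400e-6, T_hot=473.0, T_cold=293.0,
                          R_internal=1.5, R_load=1.5)
        print("Voc = %.3f V, P = %.4f W" % (V, P))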

  13. Projective Market Model Approach to AHP Decision-Making

    CERN Document Server

    Szczypinska, Anna

    2007-01-01

    In this paper we describe the market in the language of projective geometry and define a matrix of market rates, which is related to the matrix of returns and to the matrix of judgements in the Analytic Hierarchy Process (AHP). We use these observations to extend the AHP model to the projective geometry formalism and to generalise it to the intransitive case. We give financial interpretations of the generalised model within the Projective Model of Market (PMM) and propose a simplification. The unification of the AHP model with the projective aspect of portfolio theory suggests a wide spectrum of new applications for such an extended model.
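
    For context, the classical AHP step that the paper generalises extracts a priority vector from a reciprocal matrix of pairwise judgements. The sketch below recovers the weights by power iteration and reports Saaty's consistency index for an assumed 3x3 judgement matrix; it illustrates the standard transitive case only, not the projective extension.

      import numpy as np

      def ahp_priorities(judgements, iters=100):
          """Principal eigenvector of a pairwise judgement matrix via power iteration."""
          a = np.asarray(judgements, dtype=float)
          w = np.ones(a.shape[0]) / a.shape[0]
          for _ in range(iters):
              w = a @ w
              w /= w.sum()
          lam = (a @ w / w).mean()                    # principal eigenvalue estimate
          ci = (lam - a.shape[0]) / (a.shape[0] - 1)  # Saaty's consistency index
          return w, ci

      # Hypothetical reciprocal judgement matrix for three alternatives.
      j = [[1.0, 3.0, 5.0], [1/3, 1.0, 2.0], [1/5, 1/2, 1.0]]
      weights, ci = ahp_priorities(j)
      print(weights, ci)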

  14. Validation of the BASALT model for simulating off-axis hydrothermal circulation in oceanic crust

    Science.gov (United States)

    Farahat, Navah X.; Archer, David; Abbot, Dorian S.

    2017-08-01

    Fluid recharge and discharge between the deep ocean and the porous upper layer of off-axis oceanic crust tends to concentrate in small volumes of rock, such as seamounts and fractures, that are unimpeded by low-permeability sediments. Basement structure, sediment burial, heat flow, and other regional characteristics of off-axis hydrothermal systems appear to produce considerable diversity of circulation behaviors. Circulation of seawater and seawater-derived fluids controls the extent of fluid-rock interaction, resulting in significant geochemical impacts. However, the primary regional characteristics that control how seawater is distributed within upper oceanic crust are still poorly understood. In this paper we present the details of the two-dimensional (2-D) BASALT (Basement Activity Simulated At Low Temperatures) numerical model of heat and fluid transport in an off-axis hydrothermal system. This model is designed to simulate a wide range of conditions in order to explore the dominant controls on circulation. We validate the BASALT model's ability to reproduce observations by configuring it to represent a thoroughly studied transect of the Juan de Fuca Ridge eastern flank. The results demonstrate that including series of narrow, ridge-parallel fractures as subgrid features produces a realistic circulation scenario at the validation site. In future projects, a full reactive transport version of the validated BASALT model will be used to explore geochemical fluxes in a variety of off-axis hydrothermal environments.

  15. Independent Verification and Validation Of SAPHIRE 8 Software Project Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Carl Wharton; Kent Norris

    2009-12-01

    This report provides an evaluation of the Project Plan. The Project Plan is intended to provide the high-level direction that documents the required software activities to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  16. Independent Verification and Validation Of SAPHIRE 8 Software Project Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Carl Wharton

    2009-10-01

    This report provides an evaluation of the Project Plan. The Project Plan is intended to provide the high-level direction that documents the required software activities to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  17. Independent Verification and Validation Of SAPHIRE 8 Software Project Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Carl Wharton; Kent Norris

    2010-03-01

    This report provides an evaluation of the Project Plan. The Project Plan is intended to provide the high-level direction that documents the required software activities to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  18. Radiation Background and Attenuation Model Validation and Development

    Energy Technology Data Exchange (ETDEWEB)

    Peplow, Douglas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Santiago, Claudio P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-05

    This report describes the initial results of a study being conducted as part of the Urban Search Planning Tool project. The study is comparing the Urban Scene Simulator (USS), a one-dimensional (1D) radiation transport model developed at LLNL, with the three-dimensional (3D) radiation transport model from ORNL using the MCNP, SCALE/ORIGEN and SCALE/MAVRIC simulation codes. In this study, we have analyzed the differences between the two approaches at every step, from source term representation, to estimating flux and detector count rates at a fixed distance from a simple surface (slab), and at points throughout more complex 3D scenes.

  19. A new validation-assessment tool for health-economic decision models

    NARCIS (Netherlands)

    Mauskopf, J.; Vemer, P.; Voorn, van G.A.K.; Corro Ramos, I.

    2014-01-01

    A validation-assessment tool is being developed for decision makers to transparently and consistently evaluate the validation status of different health-economic decision models. It is designed as a list of validation techniques covering all relevant aspects of model validation to be filled in by

  20. California Diploma Project Technical Report III: Validity Study--Validity Study of the Health Sciences and Medical Technology Standards

    Science.gov (United States)

    McGaughy, Charis; Bryck, Rick; de Gonzalez, Alicia

    2012-01-01

    This study is a validity study of the recently revised version of the Health Science Standards. The purpose of this study is to understand how the Health Science Standards relate to college and career readiness, as represented by survey ratings submitted by entry-level college instructors of health science courses and industry representatives. For…

  1. The Canadian Arctic ACE/OSIRIS Validation Project at PEARL: Validating Satellite Observations Over the High Arctic

    Science.gov (United States)

    Walker, Kaley A.; Strong, Kimberly; Fogal, Pierre F.; Drummond, James R.

    2016-04-01

    Ground-based measurements provide critical data for the validation of satellite retrievals of atmospheric trace gases and for the assessment of the long-term stability of these measurements. As of February 2016, the Canadian-led Atmospheric Chemistry Experiment (ACE) satellite mission has been making measurements of the Earth's atmosphere for nearly twelve years, and Canada's Optical Spectrograph and InfraRed Imager System (OSIRIS) instrument on the Odin satellite has been operating for fourteen years. As ACE and OSIRIS operations have extended beyond their planned two-year missions, there is an ongoing need to validate the trace gas profiles from the ACE-Fourier Transform Spectrometer (ACE-FTS), the Measurement of Aerosol Extinction in the Stratosphere and Troposphere Retrieved by Occultation (ACE-MAESTRO) and OSIRIS. In particular, validation comparisons are needed during Arctic springtime to better understand the measurements of species involved in stratospheric ozone chemistry. To this end, thirteen Canadian Arctic ACE/OSIRIS Validation Campaigns have been conducted during the spring period (February - April in 2004 - 2016) at the Polar Environment Atmospheric Research Laboratory (PEARL) in Eureka, Nunavut (80°N, 86°W). For the past decade, these campaigns have been undertaken in collaboration with the Canadian Network for the Detection of Atmospheric Change (CANDAC). The spring period coincides with the most chemically active time of year in the Arctic, as well as a significant number of satellite overpasses. A suite of as many as 12 ground-based instruments, as well as frequent balloon-borne ozonesonde and radiosonde launches, has been used in each campaign. These instruments include: a ground-based version of the ACE-FTS (PARIS - Portable Atmospheric Research Interferometric Spectrometer), a terrestrial version of the ACE-MAESTRO, a SunPhotoSpectrometer, two CANDAC zenith-viewing UV-visible grating spectrometers, a Bomem DA8 Fourier transform spectrometer

  2. Validation of symptom validity tests using a "child-model" of adult cognitive impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P.E.J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children’s cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  3. Clinical application and validation of an iterative forward projection matching algorithm for permanent brachytherapy seed localization from conebeam-CT x-ray projections

    Energy Technology Data Exchange (ETDEWEB)

    Pokhrel, Damodar; Murphy, Martin J.; Todor, Dorin A.; Weiss, Elisabeth; Williamson, Jeffrey F. [Department of Radiation Oncology, School of Medicine, Virginia Commonwealth University, Richmond, Virginia 23298 (United States)

    2010-09-15

    Purpose: To experimentally validate a new algorithm for reconstructing the 3D positions of implanted brachytherapy seeds from postoperatively acquired 2D conebeam-CT (CBCT) projection images. Methods: The iterative forward projection matching (IFPM) algorithm finds the 3D seed geometry that minimizes the sum of the squared intensity differences between computed projections of an initial estimate of the seed configuration and radiographic projections of the implant. In-house machined phantoms, containing arrays of 12 and 72 seeds, respectively, are used to validate this method. Also, four ¹⁰³Pd postimplant patients are scanned using an ACUITY digital simulator. Three to ten x-ray images are selected from the CBCT projection set and processed to create binary seed-only images. To quantify IFPM accuracy, the reconstructed seed positions are forward projected and overlaid on the measured seed images to find the nearest-neighbor distance between measured and computed seed positions for each image pair. Also, the estimated 3D seed coordinates are compared to known seed positions in the phantom and to clinically obtained VariSeed planning coordinates for the patient data. Results: For the phantom study, the seed localization error is (0.58 ± 0.33) mm. For all four patient cases, the mean registration error is better than 1 mm when compared against the measured seed projections. IFPM converges in 20-28 iterations, with a computation time of about 1.9-2.8 min/iteration on a 1 GHz processor. Conclusions: The IFPM algorithm avoids the need to match corresponding seeds in each projection, as required by standard back-projection methods. The authors' results demonstrate ~1 mm accuracy in reconstructing the 3D positions of brachytherapy seeds from the measured 2D projections. This algorithm also successfully localizes overlapping, clustered and highly migrated seeds in the implant.
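
    The accuracy metric used above, the nearest-neighbour distance between measured and forward-projected seed positions, is simple to compute once the centroids are extracted. A minimal sketch with invented 2-D coordinates (not the authors' code):

      import numpy as np

      def nearest_neighbour_error(measured, computed):
          """Mean nearest-neighbour distance between measured and forward-projected
          seed positions in one image pair (2-D coordinates, in mm)."""
          d = np.linalg.norm(measured[:, None, :] - computed[None, :, :], axis=2)
          return d.min(axis=1).mean()

      # Hypothetical 2-D seed centroids for one projection image.
      meas = np.array([[10.0, 12.1], [25.3, 8.7], [40.2, 30.5]])
      comp = np.array([[10.4, 11.9], [24.9, 9.1], [39.8, 30.9]])
      print(f"mean NN distance = {nearest_neighbour_error(meas, comp):.2f} mm")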

  4. MOLECULAR VALIDATED MODEL FOR ADSORPTION OF PROTONATED DYE ON LDH

    Directory of Open Access Journals (Sweden)

    B. M. Braga

    Full Text Available Hydrotalcite-like compounds are anionic clays of scientific and technological interest for their use as ion exchange materials, catalysts and modified electrodes. Surface phenomena are important for all these applications. Although conventional analytical methods have enabled progress in understanding the behavior of anionic clays in solution, an evaluation at the atomic scale of the dynamics of their ionic interactions has never been performed. Molecular simulation has become an extremely useful tool to provide this perspective. Our purpose is to validate a simplified model for the adsorption of 5-benzoyl-4-hydroxy-2-methoxy-benzenesulfonic acid (MBSA), a prototype molecule of anionic dyes, onto a hydrotalcite surface. Monte Carlo simulations were performed in the canonical ensemble with MBSA ions and a pore model of hydrotalcite, using the UFF and ClayFF force fields. The proposed molecular model has allowed us to reproduce experimental data from atomic force microscopy. Influences of protonation during the adsorption process are also presented.

  5. Experimental Validation of a Dynamic Model for Lightweight Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Gasparetto

    2013-03-01

    Full Text Available Nowadays, one of the main topics in robotics research is the improvement of dynamic performance by lightening the overall system structure. The effective motion and control of these lightweight robotic systems relies on suitable motion planning and control processes. To this end, model-based approaches can be adopted by exploiting accurate dynamic models that take into account the inertial and elastic terms that are usually neglected in a heavy rigid-link configuration. In this paper, an effective method for modelling spatial lightweight industrial robots, based on an Equivalent Rigid Link System approach, is considered from an experimental validation perspective. A dynamic simulator implementing the formulation is used and an experimental test bench is set up. Experimental tests are carried out with a benchmark L-shape mechanism.

  6. Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression

    Directory of Open Access Journals (Sweden)

    Belzung Catherine

    2011-11-01

    Full Text Available Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we trace the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of

  7. Dynamic validation of the Planck-LFI thermal model

    Energy Technology Data Exchange (ETDEWEB)

    Tomasi, M; Bersanelli, M; Mennella, A [Universita degli Studi di Milano, Via Celoria 16, 20133 Milano (Italy); Cappellini, B [INAF IASF Milano, Via Bassini, 15, 20133, Milano (Italy); Gregorio, A [University of Trieste, Department of Physics, via Valerio 2, 34127 Trieste (Italy); Colombo, F; Lapolla, M [Thales Alenia Space Italia S.p.A., IUEL - Scientific Instruments, S.S. Padana Superiore 290, 20090 Vimodrone (Mi) (Italy); Terenzi, L; Morgante, G; Butler, R C; Mandolesi, N; Valenziano, L [INAF IASF Bologna, via Gobetti 101, 40129 Bologna (Italy); Galeotta, S; Maris, M; Zacchei, A [LFI-DPC INAF-OATs, via Tiepolo 11, 34131 Trieste (Italy)

    2010-01-15

    The Low Frequency Instrument (LFI) is an array of cryogenically cooled radiometers on board the Planck satellite, designed to measure the temperature and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44 and 70 GHz. The thermal requirements of the LFI, and in particular the stringent limits to acceptable thermal fluctuations in the 20 K focal plane, are a critical element in achieving the instrument's scientific performance. Thermal tests were carried out as part of the on-ground calibration campaign at various stages of instrument integration. In this paper we describe the results and analysis of the tests on the LFI flight model (FM) performed at Thales Laboratories in Milan (Italy) during 2006, with the purpose of experimentally sampling the thermal transfer functions and consequently validating the numerical thermal model describing the dynamic response of the LFI focal plane. This model has been used extensively to assess the ability of LFI to achieve its scientific goals: its validation is therefore extremely important in the context of the Planck mission. Our analysis shows that the measured thermal behaviour of the instrument exhibits a damping level better than predicted, therefore further reducing the expected systematic effect induced in the LFI maps. We then propose an explanation of the increased damping in terms of non-ideal thermal contacts.
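
    The dynamic response sampled by such tests is commonly summarised as a thermal transfer function between temperature fluctuations at the cooler interface and at the focal plane. A minimal single-pole sketch, with an assumed time constant, shows how the damping grows with fluctuation frequency:

      import numpy as np

      tau = 600.0                      # assumed thermal time constant, s
      freq = np.logspace(-4, -1, 4)    # fluctuation frequencies, Hz
      # Amplitude of a first-order (single-pole) thermal transfer function.
      damping = 1.0 / np.sqrt(1.0 + (2.0 * np.pi * freq * tau) ** 2)
      print(damping)                   # attenuation of temperature fluctuations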

  8. The Atmospheric Radionuclide Transport Model (ARTM) - Validation of a long-term atmospheric dispersion model

    Science.gov (United States)

    Hettrich, Sebastian; Wildermuth, Hans; Strobl, Christopher; Wenig, Mark

    2016-04-01

    In the last couple of years, the Atmospheric Radionuclide Transport Model (ARTM) has been developed by the German Federal Office for Radiation Protection (BfS) and the Society for Plant and Reactor Security (GRS). ARTM is an atmospheric dispersion model for continuous long-term releases of radionuclides into the atmosphere, based on a Lagrangian particle model. Developed in the first place as a more realistic replacement for the outdated Gaussian plume models, it is currently being optimised for further scientific purposes to study atmospheric dispersion in short-range scenarios. It includes a diagnostic wind field model, allows for the application of building structures and multiple sources (including linear, 2- and 3-dimensional source geometries), and considers orography and surface roughness. As output it calculates the activity concentration and dry and wet deposition, and it can also model the radioactive decay of Rn-222. As such, ARTM must undergo an intensive validation process. While for short-term and short-range models, which were mainly developed for examining nuclear accidents or explosions, a few measurement data sets are available for validation, data sets for validating long-term models are very sparse, and the existing ones mostly prove not to be applicable. Here we present a strategy for the validation of long-term Lagrangian particle models based on the work with ARTM. The first part of our validation study is a comprehensive analysis of the model's sensitivity to different parameters (e.g. simulation grid resolution, starting random number, number of simulation particles). This study provides a good estimate of the uncertainties of the simulation results and consequently can be used to generate model outputs comparable to the available measurement data at various distances from the emission source. This comparison between measurement data from selected scenarios and simulation results

  9. Radiative transfer model for contaminated slabs: experimental validations

    Science.gov (United States)

    Andrieu, F.; Schmidt, F.; Schmitt, B.; Douté, S.; Brissaud, O.

    2015-09-01

    This article presents a set of spectro-goniometric measurements of different water ice samples and the comparison with an approximated radiative transfer model. The experiments were done using the spectro-radiogoniometer described in Brissaud et al. (2004). The radiative transfer model assumes an isotropization of the flux after the second interface and is fully described in Andrieu et al. (2015). Two kinds of experiments were conducted. First, the specular spot was closely investigated, at high angular resolution, at the wavelength of 1.5 μm, where ice behaves as a very absorbing media. Second, the bidirectional reflectance was sampled at various geometries, including low phase angles on 61 wavelengths ranging from 0.8 to 2.0 μm. In order to validate the model, we made qualitative tests to demonstrate the relative isotropization of the flux. We also conducted quantitative assessments by using a Bayesian inversion method in order to estimate the parameters (e.g., sample thickness, surface roughness) from the radiative measurements only. A simple comparison between the retrieved parameters and the direct independent measurements allowed us to validate the model. We developed an innovative Bayesian inversion approach to quantitatively estimate the uncertainties in the parameters avoiding the usual slow Monte Carlo approach. First we built lookup tables, and then we searched the best fits and calculated a posteriori density probability functions. The results show that the model is able to reproduce the geometrical energy distribution in the specular spot, as well as the spectral behavior of water ice slabs. In addition, the different parameters of the model are compatible with independent measurements.

  10. A validation study of a stochastic model of human interaction

    Science.gov (United States)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time, and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree
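
    The integral model quoted above can be evaluated numerically once the kernel and the attitude density are specified. The sketch below uses an assumed Gaussian kernel and a logistic density on a truncated grid, purely to illustrate the computation; neither function is taken from the dissertation.

      import numpy as np

      tau = np.linspace(-10.0, 10.0, 2001)   # truncated stand-in for (-inf, inf)
      dtau = tau[1] - tau[0]

      def phi(chi, tau):
          # Assumed interaction kernel: Gaussian in (chi - tau).
          return np.exp(-0.5 * (chi - tau) ** 2) / np.sqrt(2.0 * np.pi)

      def psi(tau):
          # Assumed logistic attitude density, as suggested by the abstract.
          return np.exp(-tau) / (1.0 + np.exp(-tau)) ** 2

      N = 25000                              # number of attitude observations
      chi_grid = np.linspace(-4.0, 4.0, 9)
      density = [N * np.sum(phi(c, tau) * psi(tau)) * dtau for c in chi_grid]
      print(np.round(density, 1))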

  11. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

  12. Introduction to Financial Projection Models. Business Management Instructional Software.

    Science.gov (United States)

    Pomeroy, Robert W., III

    This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…

  13. Development and Full Body Validation of a 5th Percentile Female Finite Element Model.

    Science.gov (United States)

    Davis, Matthew L; Koya, Bharath; Schap, Jeremy M; Gayzik, F Scott

    2016-11-01

    To mitigate the societal impact of vehicle crashes, researchers are using a variety of tools, including finite element models (FEMs). As part of the Global Human Body Models Consortium (GHBMC) project, comprehensive medical image and anthropometrical data of the 5th percentile female (F05) were acquired for the explicit purpose of FEM development. The F05-O (occupant) FEM model consists of 981 parts, 2.6 million elements, 1.4 million nodes, and has a mass of 51.1 kg. The model was compared to experimental data in 10 validation cases ranging from localized rigid hub impacts to full body sled cases. In order to make direct comparisons to experimental data, which represent the mass of an average male, the model was compared to experimental corridors using two methods: 1) post-hoc scaling of the outputs from the baseline F05-O model and 2) geometrically morphing the model to the body habitus of the average male to allow direct comparisons. This second step required running the morphed full body model in all 10 simulations for a total of 20 full body simulations presented. Overall, geometrically morphing the model was found to more closely match the target data, with an average ISO score for the rigid impacts of 0.76 compared to 0.67 for the scaled responses. Based on these data, the morphed model was then used for model validation in the vehicle sled cases. Overall, the morphed model attained an average weighted score of 0.69 for the two sled impacts. Hard tissue injuries were also assessed, and the baseline F05-O model was found to predict a greater occurrence of pelvic fractures compared to the GHBMC average male model, but predicted fewer rib fractures.

  14. Modeling Uncertainty when Estimating IT Projects Costs

    OpenAIRE

    Winter, Michel; Mirbel, Isabelle; Crescenzo, Pierre

    2014-01-01

    In the current economic context, optimizing projects' cost is an obligation for a company to remain competitive in its market. Introducing statistical uncertainty in cost estimation is a good way to tackle the risk of going too far while minimizing the project budget: it allows the company to determine the best possible trade-off between estimated cost and acceptable risk. In this paper, we present new statistical estimators derived from the way IT companies estimate the projects' costs. In t...

  15. LANL*V2.0: global modeling and validation

    Directory of Open Access Journals (Sweden)

    S. Zaharia

    2011-08-01

    Full Text Available We describe in this paper the new version of LANL*, an artificial neural network (ANN) for calculating the magnetic drift invariant L*. This quantity is used for modeling radiation belt dynamics and for space weather applications. We have implemented the following enhancements in the new version: (1) we have removed the limitation to geosynchronous orbit, and the model can now be used for a much larger region; (2) the new version is based on the improved magnetic field model by Tsyganenko and Sitnov (2005) (TS05) instead of the older model by Tsyganenko et al. (2003). We have validated the model and compared our results to L* calculations with the TS05 model based on ephemerides for CRRES, Polar, GPS, a LANL geosynchronous satellite, and a virtual RBSP-type orbit. We find that the neural network performs very well for all these orbits, with typically small errors in L*. The LANL* V2.0 artificial neural network is orders of magnitude faster than traditional numerical field line integration techniques with the TS05 model. It has applications to real-time radiation belt forecasting, analysis of data sets involving decades of satellite observations, and other problems in space weather.
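
    As an illustration of the surrogate-modelling idea, a small feedforward network can learn a drift-shell parameter from orbital coordinates. The sketch below trains scikit-learn's MLPRegressor on the closed-form dipole relation L = r/cos²(λ) as a stand-in target; the real LANL* is trained on TS05-based L* calculations, and all values here are synthetic.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      # Synthetic training data: in a pure dipole field the drift shell reduces
      # to L = r / cos^2(lambda), so a tiny network can learn the mapping.
      rng = np.random.default_rng(0)
      r = rng.uniform(2.0, 8.0, 5000)        # radial distance, Earth radii
      lat = rng.uniform(-0.6, 0.6, 5000)     # magnetic latitude, radians
      X = np.column_stack([r, lat])
      y = r / np.cos(lat) ** 2

      net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
      net.fit(X, y)
      print("max |error|:", np.abs(net.predict(X) - y).max())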

  16. Numerical simulation and experimental validation of aircraft ground deicing model

    Directory of Open Access Journals (Sweden)

    Bin Chen

    2016-05-01

    Full Text Available Aircraft ground deicing plays an important role in guaranteeing aircraft safety. In practice, most airports generally use as much deicing fluid as possible to remove the ice, which wastes fluid and pollutes the environment. A model of aircraft ground deicing should therefore be built to establish the foundation for subsequent research, such as the optimization of deicing fluid consumption. In this article, the heat balance of the deicing process is described, and a dynamic model of the process is derived from an analysis of the deicing mechanism. In the dynamic model, the surface temperature of the deicing fluid and the ice thickness are regarded as the state parameters, while the fluid flow rate, the initial temperature, and the injection time of the deicing fluid are treated as control parameters. Ignoring the heat exchange between the deicing fluid and the environment, a simplified model is obtained. The rationality of the simplified model is verified by numerical simulation, and the impacts of the flow rate, the initial temperature and the injection time on the deicing process are investigated. To verify the model, a semi-physical experiment system was established, consisting of a low-constant-temperature test chamber, an ice simulation system, a deicing fluid heating and spraying system, a simulated wing, test sensors, and a computer measurement and control system. The test data confirm the validity of the dynamic model and the accuracy of the simulation analysis.
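
    The kind of heat balance described above can be sketched with a one-state melting model: heat convected from the fluid film thins the ice layer. All coefficients below (film coefficient, fluid temperature, ice thickness) are illustrative assumptions, not the paper's calibrated values.

      # Hedged, simplified energy balance: heat convected from the deicing fluid
      # film melts the ice layer; parameter values are illustrative only.
      h = 500.0        # convective coefficient, W/(m^2 K)
      t_fluid = 60.0   # deicing fluid surface temperature, deg C
      t_melt = 0.0     # melting point, deg C
      rho_ice = 917.0  # ice density, kg/m^3
      latent = 334e3   # latent heat of fusion, J/kg

      dt, thickness, t = 0.1, 3e-3, 0.0      # time step (s), 3 mm of ice, clock (s)
      while thickness > 0.0:
          q = h * (t_fluid - t_melt)         # heat flux into the ice, W/m^2
          thickness -= q / (rho_ice * latent) * dt
          t += dt
      print(f"ice cleared after ~{t:.0f} s")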

  17. Evaluation and cross-validation of Environmental Models

    Science.gov (United States)

    Lemaire, Joseph

    Before scientific models (statistical or empirical models based on experimental measurements; physical or mathematical models) can be proposed and selected as ISO Environmental Standards, a commission of professional experts appointed by an established international union or association (e.g. IAGA for Geomagnetism and Aeronomy) should have been able to study, document, evaluate and validate the best alternative models available at a given epoch. Examples will be given indicating that different values of the Earth's radius have been employed in different data processing laboratories, institutes or agencies to process, analyse or retrieve series of experimental observations. Furthermore, invariant magnetic coordinates like B and L, commonly used in the study of Earth's radiation belt fluxes and for their mapping, differ from one space mission data center to the other, from team to team, and from country to country. Worse, users of empirical models generally fail to use the original magnetic model which had been employed to compile B and L, and thus to build these environmental models. These are just some flagrant examples of inconsistencies and misuses identified so far; there are probably more to be uncovered by careful, independent examination and benchmarking. Consider, by analogy, the meter prototype: the standard unit of length was determined on 20 May 1875, during the Diplomatic Conference of the Meter, and deposited at the BIPM (Bureau International des Poids et Mesures). By the same token, to coordinate and safeguard progress in the field of Space Weather, similar initiatives need to be undertaken to prevent wild, uncontrolled dissemination of pseudo environmental models and standards. Indeed, unless validation tests have been performed, there is no guarantee, a priori, that all models on the market place have been built consistently with the same units system, and that they are based on identical definitions for the coordinate systems, etc. Therefore

  18. A maturity model for SCPMS project-an empirical investigation in large sized Moroccan companies

    Directory of Open Access Journals (Sweden)

    Chafik Okar

    2011-03-01

    Full Text Available In recent years many studies on maturity models have been carried out, some referring specifically to maturity models for supply chains and performance measurement systems. Starting from an analysis of the existing literature, the aim of this paper is to develop a maturity model for supply chain performance measurement system (SCPMS) projects based on the concept of critical success factors (CSFs). The model is validated by two approaches. The first is a pilot test of the model in a Moroccan supply chain, to demonstrate its capacity to assess the maturity of an SCPMS project and to develop an improvement roadmap. The second is an empirical investigation in large-sized Moroccan companies, using a survey to establish whether the model can evaluate the maturity of SCPMS projects in different industries.

  19. Downplaying model power in IT project work

    DEFF Research Database (Denmark)

    Richter, Anne; Buhl, Henrik

    2004-01-01

    Executives and information technology specialists often manage IT projects in project teams. Integrative IT systems provide opportunities to manage and restructure work functions, but the process of change often causes serious problems in implementation and diffusion. A central issue...... possible to put issues such as team functions and quality of work on the agenda. Simultaneously, participation competencies seem to have been enhanced.

  20. Dynamic Damage Modeling for IRAC Simulations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA's Integrated Resilient Aircraft Control (IRAC) Project, Preliminary Technical Plan Summary identifies several causal and contributing factors that can lead to...

  1. Soil process modelling in CZO research: gains in data harmonisation and model validation

    Science.gov (United States)

    van Gaans, Pauline; Andrianaki, Maria; Kobierska, Florian; Kram, Pavel; Lamacova, Anna; Lair, Georg; Nikolaidis, Nikos; Duffy, Chris; Regelink, Inge; van Leeuwen, Jeroen P.; de Ruiter, Peter

    2014-05-01

    Various soil process models were applied to four European Critical Zone Observatories (CZOs), the core research sites of the FP7 project SoilTrEC: the Damma glacier forefield (CH), a set of three forested catchments on geochemically contrasting bedrocks in the Slavkov Forest (CZ), a chronosequence of soils in the former floodplain of the Danube at Fuchsenbigl/Marchfeld (AT), and the Koiliaris catchments in the north-western part of Crete (GR). The aims of the modelling exercises were to apply and test soil process models with data from the CZOs for calibration/validation, to identify potential limits to the application scope of the models, to interpret soil state and soil functions at key stages of the soil life cycle, represented by the four SoilTrEC CZOs, and to contribute towards harmonisation of data and data acquisition. The models identified as specifically relevant were: the Penn State Integrated Hydrologic Model (PIHM), a fully coupled, multiprocess, multi-scale hydrologic model, to get a better understanding of water flow and pathways; the Soil and Water Assessment Tool (SWAT), a deterministic, continuous-time (daily time step) basin-scale model, to evaluate the impact of soil management practices; the Rothamsted Carbon model (Roth-C), to simulate organic carbon turnover, and the Carbon, Aggregation, and Structure Turnover (CAST) model, to include the role of soil aggregates in carbon dynamics; the Ligand Charge Distribution (LCD) model, to understand the interaction between organic matter and oxide surfaces in soil aggregate formation; and the Terrestrial Ecology Model (TEM), to obtain insight into the link between foodweb structure and carbon and nutrient turnover. With some exceptions, all models were applied to all four CZOs. The need for specific model input contributed largely to data harmonisation. The comparisons between the CZOs turned out to be of great value for understanding the strengths and limitations of the models, as well as the differences in soil conditions
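
    Of the models listed, the carbon-turnover ones are built on first-order pool kinetics. The sketch below is a two-pool toy in the spirit of Roth-C, with invented rate constants and input split rather than the calibrated Roth-C parameters.

      # Minimal two-pool, first-order carbon turnover sketch (Roth-C style);
      # all parameter values are illustrative assumptions.
      k_fast, k_slow = 0.8, 0.02    # decomposition rate constants, 1/yr
      split = 0.6                   # fraction of litter input entering the fast pool
      inputs = 2.5                  # plant residue input, t C/ha/yr

      fast, slow = 1.0, 40.0        # initial pool sizes, t C/ha
      for year in range(100):       # annual Euler steps
          fast += split * inputs - k_fast * fast
          slow += (1 - split) * inputs - k_slow * slow
      print(f"total SOC after 100 yr: {fast + slow:.1f} t C/ha")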

  2. A Report on the Validation of Beryllium Strength Models

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Derek Elswick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-05

    This report discusses work on validating beryllium strength models with flyer plate and Taylor rod experimental data. Strength models are calibrated with Hopkinson bar and quasi-static data. The Hopkinson bar data for beryllium provide strain rates up to about 4000 per second. A limitation of the Hopkinson bar data for beryllium is that they only provide information on strain up to about 0.15. The lack of high strain data at high strain rates makes it difficult to distinguish between various strength model settings. The PTW model has been calibrated many different times over the last 12 years. The lack of high strain data for high strain rates has resulted in these calibrated PTW models for beryllium exhibiting significantly different behavior when extrapolated to high strain. For beryllium, the α parameter of PTW has recently been calibrated to high precision shear modulus data. In the past the α value for beryllium was set based on expert judgment. The new α value for beryllium was used in a calibration of the beryllium PTW model by Sky Sjue. The calibration by Sjue used EOS table information to model the temperature dependence of the heat capacity and the density changes of the beryllium sample during the Hopkinson bar and quasi-static experiments. In this report, the calibrated PTW model by Sjue is compared against experimental data and other strength models. The other strength models being considered are a PTW model calibrated by Shuh-Rong Chen and a Steinberg-Guinan type model by John Pedicini. The three strength models are used in a comparison against flyer plate and Taylor rod data. The results show that the Chen PTW model provides better agreement with these data. The Chen PTW model settings have been previously adjusted to provide a better fit to flyer plate data, whereas the Sjue PTW model has not been changed based on flyer plate data. However, the Sjue model provides a reasonable fit to

  3. Validation of NEPTUNE-CFD Two-Phase Flow Models Using Experimental Data

    Directory of Open Access Journals (Sweden)

    Jorge Pérez Mañes

    2014-01-01

    Full Text Available This paper deals with the validation of the two-phase flow models of the CFD code NEPTUNE-CFD using experimental data provided by the OECD BWR BFBT and PSBT benchmarks. Since the two-phase flow models of CFD codes are being extensively improved, validation is a key step toward the acceptability of such codes. The validation work was performed in the frame of the European NURISP project and focused on the steady-state and transient void fraction tests. The influence of different NEPTUNE-CFD model parameters on the void fraction prediction is investigated and discussed in detail. Thanks to the coupling of the heat conduction solver SYRTHES with NEPTUNE-CFD, the description of the coupled fluid dynamics and heat transfer between the fuel rod and the fluid is improved significantly. The averaged void fraction predicted by NEPTUNE-CFD for selected PSBT and BFBT tests is in good agreement with the experimental data. Finally, areas for future improvement of the NEPTUNE-CFD code were also identified.

  4. Satellite Cloud Data Validation through MAGIC Ground Observation and the S'COOL Project: Scientific Benefits grounded in Citizen Science

    Science.gov (United States)

    Crecelius, S.; Chambers, L. H.; Lewis, P. M.; Rogerson, T.

    2013-12-01

    The Students' Cloud Observation On-Line (S'COOL) Project was launched in 1997 as the Formal Education and Public Outreach arm of the Clouds and the Earth's Radiant Energy System (CERES) Mission. ROVER, the Citizen Scientist area of S'COOL, started in 2007 and allows participants to make 'roving' observations from any location, as opposed to a fixed, registered classroom. The S'COOL Project aids the CERES Mission in trying to answer the research question: 'What is the effect of clouds on the Earth's climate?' Participants from all 50 states, most U.S. Territories, and 63 countries have reported more than 100,500 observations to the S'COOL Project over the past 16 years. The Project is supported by an intuitive website that provides curriculum support and guidance through the observation steps: 1) request the satellite overpass schedule, 2) observe clouds, and 3) report cloud observations. The S'COOL website also hosts a robust database housing all participants' observations as well as the matching satellite data. While the S'COOL observation parameters are based on the data collected by 5 satellite missions, ground observations provide a unique perspective on data validation. Specifically, low- to mid-level clouds can be obscured by overcast high-level clouds, or be difficult to observe from a satellite's perspective due to surface cover or albedo. In these cases, ground observations play an important role in filling the data gaps and providing a better global picture of our atmosphere and clouds. S'COOL participants, operating within the boundary layer, have an advantage when observing low-level clouds that affect the area we live in, regional weather patterns, and climate change. S'COOL's long-term data set provides a valuable resource to the scientific community in improving the 'poorly characterized and poorly represented [clouds] in climate and weather prediction models'. The MAGIC Team contacted S'COOL in early 2012 about making cloud observations as part of the MAGIC

  5. A Multi-Depth Underwater Spectroradiometer for Validation of Remotely-Sensed Ocean Color and Estimation of Seawater Biogeochemical Properties (A) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A Multi-Depth Underwater Spectroradiometer for Validation of Remotely-Sensed Ocean Color and Estimation of Seawater Biogeochemical Properties (A) Project

  6. Challenges in validating model results for first year ice

    Science.gov (United States)

    Melsom, Arne; Eastwood, Steinar; Xie, Jiping; Aaboe, Signe; Bertino, Laurent

    2017-04-01

    In order to assess the quality of model results for the distribution of first-year ice, a comparison with a product based on observations from satellite-borne instruments has been performed. Such a comparison is not straightforward due to the contrasting algorithms that are used in the model product and the remote sensing product. The implementation of the validation is discussed in light of the differences between this set of products, and validation results are presented. The model product is the daily updated 10-day forecast from the Arctic Monitoring and Forecasting Centre in CMEMS. The forecasts are produced with the assimilative ocean prediction system TOPAZ. Presently, observations of sea ice concentration and sea ice drift are introduced in the assimilation step, but data for sea ice thickness and ice age (or roughness) are not included. The model computes the age of the ice by recording and updating the time passed after ice formation as sea ice grows and deteriorates while it is advected inside the model domain. Ice that is younger than 365 days is classified as first-year ice, and the fraction of first-year ice is recorded as a tracer in each grid cell. The Ocean and Sea Ice Thematic Assembly Centre in CMEMS redistributes a daily product from the EUMETSAT OSI SAF of gridded sea ice conditions which includes "ice type", a classification separating regions covered by first-year ice from those covered by multi-year ice. The ice type is parameterized based on data for the gradient ratio GR(19,37) from SSMIS observations and from the ASCAT backscatter parameter. This product also includes information on ambiguity in the processing of the remote sensing data, and on the product's confidence level, both of which have a strong seasonal dependency.
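
    The model's ice-age bookkeeping reduces to a per-cell tracer of elapsed time since formation, thresholded at one year. A minimal sketch with invented formation dates:

      import numpy as np

      # Sketch of the tracer logic described above: each grid cell carries the
      # time elapsed since its ice formed; ice younger than 365 days counts as
      # first-year ice. Formation days are illustrative.
      formation_day = np.array([0, 200, 320])   # day each cell's ice formed
      today = 400
      age_days = today - formation_day
      is_first_year = age_days < 365
      print(is_first_year)                      # [False  True  True]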

  7. Experimental validation of a numerical model for subway induced vibrations

    Science.gov (United States)

    Gupta, S.; Degrande, G.; Lombaert, G.

    2009-04-01

    This paper presents the experimental validation of a coupled periodic finite element-boundary element model for the prediction of subway induced vibrations. The model fully accounts for the dynamic interaction between the train, the track, the tunnel and the soil. The periodicity or invariance of the tunnel and the soil in the longitudinal direction is exploited using the Floquet transformation, which allows for an efficient formulation in the frequency-wavenumber domain. A general analytical formulation is used to compute the response of three-dimensional invariant or periodic media that are excited by moving loads. The numerical model is validated by means of several experiments that have been performed at a site in Regent's Park on the Bakerloo line of London Underground. Vibration measurements have been performed on the axle boxes of the train, on the rail, the tunnel invert and the tunnel wall, and in the free field, both at the surface and at a depth of 15 m. Prior to these vibration measurements, the dynamic soil characteristics and the track characteristics have been determined. The Bakerloo line tunnel of London Underground has been modelled using the coupled periodic finite element-boundary element approach and free field vibrations due to the passage of a train at different speeds have been predicted and compared to the measurements. The correspondence between the predicted and measured response in the tunnel is reasonably good, although some differences are observed in the free field. The discrepancies are explained on the basis of various uncertainties involved in the problem. The variation in the response with train speed is similar for the measurements as well as the predictions. This study demonstrates the applicability of the coupled periodic finite element-boundary element model to make realistic predictions of the vibrations from underground railways.

  8. Large Eddy Simulation Modeling of Flashback and Flame Stabilization in Hydrogen-Rich Gas Turbines Using a Hierarchical Validation Approach

    Energy Technology Data Exchange (ETDEWEB)

    Clemens, Noel [Univ. of Texas, Austin, TX (United States)

    2015-09-30

    This project was a combined computational and experimental effort to improve predictive capability for boundary layer flashback of premixed swirl flames relevant to gas-turbine power plants operating with high-hydrogen-content fuels. During the course of this project, significant progress in modeling was made on four major fronts: 1) use of direct numerical simulation of turbulent flames to understand the coupling between the flame and the turbulent boundary layer; 2) improved modeling capability for flame propagation in stratified pre-mixtures; 3) improved portability of computer codes using the OpenFOAM platform to facilitate transfer to industry and other researchers; and 4) application of LES to flashback in swirl combustors, and a detailed assessment of its capabilities and limitations for predictive purposes. A major component of the project was an experimental program that focused on developing a rich experimental database of boundary layer flashback in swirl flames. Both methane and high-hydrogen fuels, including effects of elevated pressure (1 to 5 atm), were explored. For this project, a new model swirl combustor was developed. Kilohertz-rate stereoscopic PIV and chemiluminescence imaging were used to investigate the flame propagation dynamics. In addition to the planar measurements, a technique capable of detecting the instantaneous, time-resolved 3D flame front topography was developed and applied successfully to investigate the flow-flame interaction. The UT measurements and legacy data were used in a hierarchical validation approach where flows with increasingly complex physics were used for validation. First component models were validated with DNS and literature data in simplified configurations, and this was followed by validation with the UT 1-atm flashback cases, and then the UT high-pressure flashback cases. The new models and portable code represent a major improvement over what was available before this project was initiated.

  9. Enterprise Projects Set Risk Element Transmission Chaotic Genetic Model

    Directory of Open Access Journals (Sweden)

    Cunbin Li

    2012-08-01

    Full Text Available In order to study the risk-element transfer process in enterprise project sets and to improve the efficiency of risk management, this paper combines chaos theory with a genetic algorithm to put forward an enterprise project-set risk element transmission chaotic genetic model. A hybrid chaotic mapping system is constructed by mixing the logistic chaotic map and the Chebyshev chaotic map. The steps in adopting the hybrid chaotic mapping for the genetic operations include project-set initialization, calculation of fitness, selection, crossover and mutation operators, fitness adjustment and condition judgment. The results show that the model simulates the risk transmission process in enterprise project sets very well and provides a basis for managers' decision making.
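
    The hybrid chaotic mapping can be illustrated directly: alternate the logistic map with a Chebyshev map, rescaling between their natural domains, and use the resulting sequence to seed a genetic-algorithm population. The map order, seed and population shape below are assumptions for illustration.

      import numpy as np

      def hybrid_chaotic_sequence(n, x0=0.7, mu=4.0):
          """Alternate logistic and Chebyshev maps to generate a chaotic sequence
          in [0, 1], e.g. for seeding a GA population (a sketch of the idea)."""
          x, out = x0, []
          for i in range(n):
              if i % 2 == 0:
                  x = mu * x * (1.0 - x)                      # logistic map on [0, 1]
              else:
                  y = np.cos(4.0 * np.arccos(2.0 * x - 1.0))  # order-4 Chebyshev map on [-1, 1]
                  x = 0.5 * (y + 1.0)                         # rescale back to [0, 1]
              out.append(x)
          return np.array(out)

      # Ten candidate solutions with five decision variables each.
      population = hybrid_chaotic_sequence(50).reshape(10, 5)
      print(population[0])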

  10. External Validity and Model Validity: A Conceptual Approach for Systematic Review Methodology

    Directory of Open Access Journals (Sweden)

    Raheleh Khorsan

    2014-01-01

    Full Text Available Background. Evidence rankings do not consider equally internal validity (IV), external validity (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative health care (CAM/IHC) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to January 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the quality of EV/MV research that is more sensitive to CAM/IHC research. Conclusion. Improved reporting on EV can help produce information that will guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about "real-world" consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on equal footing, will better guide clinical decision making.

  11. Low Altitude Validation of Geomagnetic Cutoff Models Using SAMPEX Data

    Science.gov (United States)

    Young, S. L.; Kress, B. T.

    2011-12-01

    Single event upsets (SEUs) caused by MeV protons are a concern for satellite operators, so AFRL is working to create a tool that can specify and/or forecast SEU probabilities. An important component of the tool's SEU probability calculation will be the local energetic ion spectrum. The portion of that spectrum due to the trapped energetic ion population is relatively stable and predictable; however, it is more difficult to account for the transient solar energetic particles (SEPs). These particles, which can be ejected from the solar atmosphere during a solar flare or filament eruption or can be energized by coronal mass ejection (CME) driven shocks, can penetrate the Earth's magnetosphere into regions not normally populated by energetic protons. The magnetosphere provides energy-dependent shielding that also depends on its magnetic configuration. During magnetic storms that configuration is modified, and the SEP cutoff latitude for a given particle energy can be suppressed by up to ~15 degrees equatorward, exposing normally shielded regions. As a first step toward creating the satellite SEU prediction tool, we are comparing the Smart et al. (Advances in Space Research, 2006) and CISM-Dartmouth (Kress et al., Space Weather, 2010) geomagnetic cutoff tools. While the authors have provided some validations of their own in the noted papers, our validation is done consistently across models, allowing us to compare them more directly.

  12. A geomagnetically induced current warning system: model development and validation

    Science.gov (United States)

    McKay, A.; Clarke, E.; Reay, S.; Thomson, A.

    Geomagnetically Induced Currents (GIC), which can flow in technological systems at the Earth's surface, are a consequence of magnetic storms and Space Weather. A well-documented practical problem for the power transmission industry is that GIC can affect the lifetime and performance of transformers within the power grid. Operational mitigation is widely considered to be one of the best strategies to manage the Space Weather and GIC risk. Therefore in the UK a magnetic storm warning and GIC monitoring and analysis programme has been under development by the British Geological Survey and Scottish Power plc (the power grid operator for Central Scotland) since 1999. Under the auspices of the European Space Agency's service development activities, BGS is developing the capability to meet two key user needs that have been identified. These needs are, firstly, the development of a near real-time solar wind shock/geomagnetic storm warning, based on L1 solar wind data, and, secondly, the development of an integrated surface geoelectric field and power grid network model that should allow prediction of GIC throughout the power grid in near real time. While the final goal is a `seamless package', the components of the package utilise diverse scientific techniques. We review progress to date with particular regard to the validation of the individual components of the package. The Scottish power grid response to the October 2003 magnetic storms is also discussed, and model and validation data are presented.
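
    At the prediction end of such a package, the GIC at a given substation is commonly approximated as a linear combination of the horizontal geoelectric field components, with network-specific coefficients (in the tradition of Lehtinen and Pirjola's network modelling). A minimal sketch with assumed coefficients and field values, not those of the Scottish grid:

      def gic_amps(ex_v_per_km, ey_v_per_km, a=-12.0, b=35.0):
          """Estimated GIC (A) at one node from the northward (Ex) and eastward
          (Ey) geoelectric field components; a and b are network-specific
          coefficients, assumed here for illustration."""
          return a * ex_v_per_km + b * ey_v_per_km

      print(gic_amps(0.5, 1.2))  # illustrative storm-time field values, V/km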

  13. Physiologically Based Modelling of Dioxins. I. Validation of a rodent toxicokinetic model

    NARCIS (Netherlands)

    Zeilmaker MJ; Slob W

    1993-01-01

    In this report a rodent Physiologically Based PharmacoKinetic (PBPK) model for 2,3,7,8-tetrachlorodibenzodioxin is described. Validation studies, in which model simulations of TCDD disposition were compared with in vivo TCDD disposition in rodents exposed to TCDD, showed that the model adequately p

  14. Validation of full cavitation model in cryogenic fluids

    Institute of Scientific and Technical Information of China (English)

    CAO XiaoLi; ZHANG XiaoBin; QIU LiMin; GAN ZhiHua

    2009-01-01

    Numerical simulation of cavitation in cryogenic fluids is important for improving the stable operation of the propulsion system in liquid-fuel rockets. It also represents a broader class of problems where the fluid operates close to its critical point and the thermal effects of cavitation are pronounced. The present article focuses on simulating cryogenic cavitation by implementing the "full cavitation model", coupled with the energy equation, in conjunction with an iterative update of the real fluid properties at local temperatures. Steady-state computations are then conducted on a hydrofoil and an ogive in liquid nitrogen and liquid hydrogen, respectively, based on which we explore the mechanism of cavitation with thermal effects. Comprehensive comparisons between the simulation results and experimental data, as well as previous computations by other researchers, validate the full cavitation model in cryogenic fluids. The sensitivity of cavity length to cavitation number is also examined.
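
    Full-cavitation-type models drive the phase change with Rayleigh-Plesset-derived source terms. The sketch below shows an evaporation-rate term of that general form; the coefficient, turbulence quantities and fluid properties are placeholders, and the expression is a hedged paraphrase of the published correlation rather than an exact reproduction.

      import math

      def evaporation_rate(p, p_v, rho_l, rho_v, k, sigma, f_v, ce=0.02):
          """Rayleigh-Plesset-type vapor generation rate (kg/(m^3 s)); active
          only where local pressure drops below the vapor pressure. All inputs
          here are assumed values, not calibrated model constants."""
          if p >= p_v:
              return 0.0
          return (ce * (math.sqrt(k) / sigma) * rho_l * rho_v
                  * math.sqrt(2.0 * (p_v - p) / (3.0 * rho_l)) * (1.0 - f_v))

      # Illustrative liquid-nitrogen-like conditions.
      print(evaporation_rate(p=80e3, p_v=101e3, rho_l=807.0, rho_v=4.6,
                             k=1.0, sigma=0.0089, f_v=0.1))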

  15. Modelling and validation of multiple reflections for enhanced laser welding

    Science.gov (United States)

    Milewski, J.; Sklar, E.

    1996-05-01

    The effects of multiple internal reflections within a laser weld joint, as functions of joint geometry and processing conditions, have been characterized. A computer-based ray tracing model simulates the reflective propagation of laser beam energy focused into the narrow gap of a metal joint, in order to predict the location of melting and coalescence that forms the weld. Quantitative comparisons are made between simulation cases. Experimental results are provided for qualitative model validation. This method is proposed as a way to enhance process efficiency and to design laser welds which display deep penetration and high depth-to-width aspect ratios without high-powered systems or keyhole-mode melting.
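
    To illustrate the kind of calculation such a ray-tracing weld model performs, a minimal 2D sketch follows; the groove angle, per-bounce reflectivity and the alternating-wall bounce pattern are simplifying assumptions for illustration, not the authors' model.

```python
import numpy as np

# Fraction of energy reflected per bounce (the rest is absorbed locally).
R = 0.7

def reflect(d, n):
    """Specular reflection of direction d about unit normal n."""
    return d - 2.0 * np.dot(d, n) * n

# V-groove walls at +/- 10 degrees from vertical; unit inward normals.
theta = np.radians(10.0)
normals = {
    "left":  np.array([np.cos(theta), np.sin(theta)]),
    "right": np.array([-np.cos(theta), np.sin(theta)]),
}

d = np.array([0.2, -1.0])
d /= np.linalg.norm(d)
energy, wall = 1.0, "left"
for bounce in range(5):
    absorbed = (1.0 - R) * energy
    print(f"bounce {bounce} on {wall} wall: absorbed {absorbed:.3f}")
    energy *= R
    d = reflect(d, normals[wall])
    wall = "right" if wall == "left" else "left"  # simplifying assumption
```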

  16. SR 97. Alternative models project. Stochastic continuum modelling of Aberg

    Energy Technology Data Exchange (ETDEWEB)

    Widen, H. [Kemakta AB, Stockholm (Sweden)]; Walker, D. [INTERA KB/DE and S (Sweden)]

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modelling approaches to bedrock performance assessment for a single hypothetical repository, arbitrarily named Aberg. The Aberg repository will adopt input parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The models are restricted to an explicit domain, boundary conditions and canister location to facilitate the comparison. The boundary conditions are based on the regional groundwater model provided in digital format. This study is the application of HYDRASTAR, a stochastic continuum groundwater flow and transport-modelling program. The study uses 34 realisations of 945 canister locations in the hypothetical repository to evaluate the uncertainty of the advective travel time, canister flux (Darcy velocity at a canister) and F-ratio. Several comparisons of variability are constructed between individual canister locations and individual realisations. For the ensemble of all realisations with all canister locations, the study found a median travel time of 27 years, a median canister flux of 7.1 x 10^-4 m/yr and a median F-ratio of 3.3 x 10^5 yr/m. The overall pattern of regional flow is preserved in the site-scale model, as is reflected in flow paths and exit locations. The site-scale model slightly over-predicts the boundary fluxes from the single realisation of the regional model. The explicitly prescribed domain was seen to be slightly restrictive, with 6% of the stream tubes failing to exit the upper surface of the model. Sensitivity analysis and calibration are suggested as possible extensions of the modelling study.

  17. Assessing uncertainty in pollutant wash-off modelling via model validation.

    Science.gov (United States)

    Haddad, Khaled; Egodawatta, Prasanna; Rahman, Ataur; Goonetilleke, Ashantha

    2014-11-01

    Stormwater pollution is linked to stream ecosystem degradation. In predicting stormwater pollution, various types of modelling techniques are adopted. The accuracy of predictions provided by these models depends on the data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that available water quality datasets in urban areas span only relatively short time scales, unlike water quantity data; this limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that the application of MCCV is likely to result in a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective in model validation when dealing with a small sample size, which otherwise hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies.
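
    A minimal sketch of the two validation schemes on a synthetic stand-in dataset, using scikit-learn's LeaveOneOut and ShuffleSplit (the latter is one common implementation of Monte Carlo cross validation); the data and the linear model are illustrative only, not the paper's wash-off model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, ShuffleSplit, cross_val_score

# Small synthetic dataset standing in for limited water-quality records.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 50.0, size=(20, 1))           # e.g. rainfall intensity
y = 0.8 * X[:, 0] + rng.normal(0.0, 4.0, size=20)  # e.g. pollutant load

model = LinearRegression()
loo = cross_val_score(model, X, y, cv=LeaveOneOut(),
                      scoring="neg_mean_squared_error")
mccv = cross_val_score(model, X, y,
                       cv=ShuffleSplit(n_splits=200, test_size=0.3,
                                       random_state=0),
                       scoring="neg_mean_squared_error")
print("LOO  MSE:", -loo.mean())
print("MCCV MSE:", -mccv.mean(), "+/-", mccv.std())
```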

  18. Satellite information of sea ice for model validation

    Science.gov (United States)

    Saheed, P. P.; Mitra, Ashis K.; Momin, Imranali M.; Mahapatra, Debasis K.; Rajagopal, E. N.

    2016-05-01

    The emergence of extensively large computational facilities has enabled the scientific world to use earth system models for understanding the prevailing dynamics of the earth's atmosphere, ocean and cryosphere and their interrelations. Sea ice in the Arctic and the Antarctic has been identified as one of the main proxies for studying climate change. The rapid sea-ice melting in the Arctic and the disappearance of multi-year sea ice have become a matter of concern. Earth system models couple the ocean, atmosphere and sea ice in order to bring out the possible interconnections between these three very important components and their role in the changing climate. The Indian monsoon appears to have undergone nonlinear changes in recent years, and the rapid ice melt in the Arctic is apparently linked to changes in the weather and climate of the Indian subcontinent. Recent findings reveal a relation between extreme events occurring in the Indian subcontinent and Arctic sea-ice melt episodes. Coupled models are being used to study the depth of these relations. However, the models have to be validated extensively using measured parameters. Satellite measurements of sea ice date back to 1979, and many data sets have become available since then. In this study, an evaluation of the existing data sets is conducted. There are some uncertainties in these data sets, which could be associated with the absence of a single sensor spanning the whole period and with the scarcity of accurate in-situ measurements against which to validate the satellite measurements.

  19. Development and validation of a liquid composite molding model

    Science.gov (United States)

    Bayldon, John Michael

    2007-12-01

    In composite manufacturing, Vacuum Assisted Resin Transfer Molding (VARTM) is becoming increasingly important as a cost-effective manufacturing method for structural composites. In this process the dry preform (reinforcement) is placed on a rigid tool and covered by a flexible film to form an airtight vacuum bag. Liquid resin is drawn under vacuum through the preform inside the vacuum bag. Modeling of this process relies on a good understanding of closely coupled phenomena: the resin flow depends on the preform permeability, which in turn depends on the local fluid pressure and the preform compaction behavior. VARTM models for predicting the flow rate in this process do exist; however, they are not able to properly predict the flow for all classes of reinforcement material. In this thesis, the continuity equation used in VARTM models is reexamined and a modified form proposed. In addition, the compaction behavior of the preform in both saturated and dry states is studied in detail and new models are proposed for the compaction behavior. To assess the validity of the proposed models, the shadow moire method was adapted and used to perform full-field measurement of the preform thickness during infusion, in addition to the usual measurements of flow front position. A new method was developed and evaluated for the analysis of the moire data related to the VARTM process; the method also has wider applicability to other full-field thickness measurements. The use of this measurement method demonstrated that although the new compaction models work well in the characterization tests, they do not properly describe all the preform features required for modeling the process. In particular, the effect of varying saturation on the preform's behavior requires additional study. The flow models developed did, however, improve the prediction of the flow rate for the more compliant preform material tested, and the experimental techniques have shown where additional test methods are needed.
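
    For context, most liquid composite molding models build on Darcy's law; under constant-injection-pressure, rigid-cavity assumptions the 1D flow front follows a square-root-of-time law, sketched below with hypothetical material values. The thesis's coupled, compaction-dependent model is considerably more involved than this baseline.

```python
import numpy as np

# Hypothetical material and process values.
K = 2e-10   # preform permeability (m^2)
phi = 0.5   # porosity
mu = 0.2    # resin viscosity (Pa s)
dP = 9e4    # vacuum-driven pressure difference (Pa)

def flow_front(t):
    """Flow-front position x_f(t) = sqrt(2 K dP t / (phi mu))."""
    return np.sqrt(2.0 * K * dP * t / (phi * mu))

for t in (60.0, 600.0, 3600.0):  # seconds
    print(f"t = {t:6.0f} s -> x_f = {flow_front(t):.3f} m")
```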

  20. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory]; Scovel, Christina A. [Los Alamos National Laboratory]

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results from different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain the data behind a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time-consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code.
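
    A minimal sketch of the automation idea: read experiment metadata from a data-file header and emit a hydro-code input deck. The '# key: value' header convention, the field names and the JSON deck below are hypothetical placeholders, not the actual HED format or any particular hydro code's input syntax.

```python
import json

# Write a tiny example data file with metadata in a header.
with open("shot_1042.dat", "w") as f:
    f.write("# explosive: PBX-9501\n")
    f.write("# density: 1.83\n")
    f.write("# impact_velocity: 0.92\n")
    f.write("0.0 0.0\n1.0 2.5\n")  # the data block itself

def read_header(path):
    """Collect 'key: value' metadata from leading '#' lines."""
    meta = {}
    with open(path) as f:
        for line in f:
            if not line.startswith("#"):
                break
            key, _, value = line[1:].partition(":")
            meta[key.strip()] = value.strip()
    return meta

def write_input_deck(meta, out_path):
    """Emit a (hypothetical) hydro-code input deck from the metadata."""
    deck = {
        "explosive": meta["explosive"],
        "density_g_cc": float(meta["density"]),
        "impact_velocity_km_s": float(meta["impact_velocity"]),
    }
    with open(out_path, "w") as f:
        json.dump(deck, f, indent=2)

write_input_deck(read_header("shot_1042.dat"), "shot_1042.in")
```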

  1. Multi-Agent Modeling in Managing Six Sigma Projects

    Directory of Open Access Journals (Sweden)

    K. Y. Chau

    2009-10-01

    Full Text Available In this paper, a multi-agent model is proposed for considering the human resources factor in decision making in relation to the six sigma project. The proposed multi-agent system is expected to increase the accuracy of project prioritization and to stabilize the human resources service level. A simulation of the proposed multi-agent model is conducted. The results show that a multi-agent model which takes into consideration human resources when making decisions about project selection and project team formation is important in enabling efficient and effective project management. The multi-agent modeling approach provides an alternative approach for improving communication and the autonomy of six sigma projects in business organizations.

  2. Modeling for Optimal Control : A Validated Diesel-Electric Powertrain Model

    OpenAIRE

    Sivertsson, Martin; Eriksson, Lars

    2014-01-01

    An optimal control ready model of a diesel-electric powertrain is developed, validated and provided to the research community. The aim of the model is to facilitate studies of the transient control of diesel-electric powertrains and also to provide a model for developers of optimization tools. The resulting model is a four-state, three-control mean value engine model that captures the significant nonlinearity of the diesel engine, while still being continuously differentiable.
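
    To give a flavor of a mean value engine model, the sketch below integrates a deliberately reduced two-state version (engine speed and intake manifold pressure); the published model has four states and three controls, and every parameter and map here is a hypothetical placeholder.

```python
from scipy.integrate import solve_ivp

# Hypothetical placeholders (not the paper's identified parameters).
J = 2.5             # crankshaft + generator inertia (kg m^2)
V_IM = 0.02         # intake manifold volume (m^3)
RT = 287.0 * 300.0  # gas constant x manifold temperature (J/kg)

def rhs(t, y, u_f, t_gen, w_c=0.1):
    """States: engine speed w (rad/s), manifold pressure p_im (Pa).
    Controls: fueling u_f, generator load torque t_gen (N m);
    w_c is a constant compressor mass flow (kg/s)."""
    w, p_im = y
    t_ice = 120.0 * u_f - 0.05 * w  # crude affine torque map
    m_cyl = 5e-9 * w * p_im         # crude engine air-flow model
    dw = (t_ice - t_gen) / J
    dp = RT * (w_c - m_cyl) / V_IM
    return [dw, dp]

sol = solve_ivp(rhs, (0.0, 5.0), [150.0, 1.2e5], args=(0.5, 40.0))
print(f"speed: {sol.y[0, -1]:.1f} rad/s, p_im: {sol.y[1, -1]:.0f} Pa")
```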

  3. Nonlinear dispersion effects in elastic plates: numerical modelling and validation

    Science.gov (United States)

    Kijanka, Piotr; Radecki, Rafal; Packo, Pawel; Staszewski, Wieslaw J.; Uhl, Tadeusz; Leamy, Michael J.

    2017-04-01

    Nonlinear features of elastic wave propagation have attracted significant attention recently. The particular interest herein relates to complex wave-structure interactions, which provide potential new opportunities for feature discovery and identification in a variety of applications. Due to the significant complexity associated with wave propagation in nonlinear media, numerical modeling and simulations are employed to facilitate the design and development of new measurement, monitoring and characterization systems. However, since very high spatio-temporal accuracy of numerical models is required, it is critical to evaluate their spectral properties and tune discretization parameters for a compromise between accuracy and calculation time. Moreover, nonlinearities in structures give rise to various effects that are not present in linear systems, e.g. wave-wave interactions, higher harmonics generation, synchronism and, as recently reported, shifts to dispersion characteristics. This paper discusses a local computational model based on a new HYBRID approach for wave propagation in nonlinear media. The proposed approach combines the advantages of the Local Interaction Simulation Approach (LISA) and Cellular Automata for Elastodynamics (CAFE). The methods are investigated in the context of their accuracy for predicting nonlinear wavefields, in particular shifts to dispersion characteristics for finite-amplitude waves and secondary wavefields. The results are validated against Finite Element (FE) calculations for guided waves in a copper plate. Critical modes, i.e. modes determining the accuracy of a model at a given excitation frequency, are identified and guidelines for numerical model parameters are proposed.

  4. Stable Eutectoid Transformation in Nodular Cast Iron: Modeling and Validation

    Science.gov (United States)

    Carazo, Fernando D.; Dardati, Patricia M.; Celentano, Diego J.; Godoy, Luis A.

    2016-10-01

    This paper presents a new microstructural model of the stable eutectoid transformation in a spheroidal cast iron. The model takes into account the nucleation and growth of ferrite grains and the growth of graphite spheroids. Different laws are assumed for the growth of both phases during and below the intercritical stable eutectoid. At a microstructural level, the initial conditions for the phase transformations are obtained from the microstructural simulation of solidification of the material, which considers the divorced eutectic and the subsequent growth of graphite spheroids up to the initiation of the stable eutectoid transformation. The temperature field is obtained by solving the energy equation by means of finite elements. The microstructural (phase change) and macrostructural (energy balance) models are coupled by a sequential multiscale procedure. Experimental validation of the model is achieved by comparison with measured values of fractions and radius of 2D view of ferrite grains. Agreement with such experiments indicates that the present model is capable of predicting ferrite phase fraction and grain size with reasonable accuracy.

  6. Lightweight ZERODUR: Validation of Mirror Performance and Mirror Modeling Predictions

    Science.gov (United States)

    Hull, Tony; Stahl, H. Philip; Westerhoff, Thomas; Valente, Martin; Brooks, Thomas; Eng, Ron

    2017-01-01

    Upcoming spaceborne missions, both moderate and large in scale, require extreme dimensional stability while relying both upon established lightweight mirror materials and upon accurate modeling methods to predict performance under varying boundary conditions. We describe tests, recently performed at NASA's XRCF chambers and laboratories in Huntsville, Alabama, during which a 1.2 m diameter, f/1.29, 88% lightweighted SCHOTT ZERODUR(TradeMark) mirror was tested for thermal stability under static loads in steps down to 230 K. Test results are compared to model predictions based upon recently published data on ZERODUR(TradeMark). In addition to monitoring the mirror surface for thermal perturbations in XRCF thermal-vacuum tests, static-load gravity deformations have been measured and compared to model predictions, and the modal response (dynamic disturbance) was measured and compared to the model. We also discuss the fabrication approach and optomechanical design of the ZERODUR(TradeMark) mirror substrate by SCHOTT, its optical preparation for test by Arizona Optical Systems (AOS), and summarize the outcome of NASA's XRCF tests and model validations.

  7. Validation of DWPF Melter Off-Gas Combustion Model

    Energy Technology Data Exchange (ETDEWEB)

    Choi, A.S.

    2000-08-23

    The empirical melter off-gas combustion model currently used in the DWPF safety basis calculations is valid at melter vapor space temperatures above 570 degrees C, as measured in the thermowell. This lower temperature bound coincides with that of the off-gas data used as the basis of the model. In this study, the applicability of the empirical model in a wider temperature range was assessed using the off-gas data collected during two small-scale research melter runs. The first data set came from the Small Cylindrical Melter-2 run in 1985 with the sludge feed coupled with the precipitate hydrolysis product. The second data set came from the 774-A melter run in 1996 with the sludge-only feed prepared with the modified acid addition strategy during the feed pretreatment step. The results of the assessment showed that the data from these two melter runs agreed well with the existing model, and further provided the basis for extending the lower temperature bound of the model to the measured melter vapor space temperature of 445 degrees C.

  8. Comparison and Validation of Four Arctic Sea Ice Thickness Products of the EC POLAR ICE Project

    Science.gov (United States)

    Melsheimer, C.; Makynen, M.; Rasmussen, T. S.; Rudjord, Ø.; Simila, M.; Solberg, R.; Walker, N. P.

    2016-08-01

    Sea ice thickness (SIT) is an important parameter for monitoring Arctic change, modelling and predicting weather and climate, and for navigation and offshore operations. However, SIT is still not very well monitored operationally. In the European Commission (EC) FP7 project "POLAR ICE", three novel SIT products based on different satellite data as well as SIT from a state-of-the-art ocean and sea ice model are fed into a common data handling and distribution system for end users. Each SIT product has different scopes and limitations as to, e.g., spatial and temporal resolution, ice thickness range and geographical domain. The aim of this study is to compare the four different SIT products with each other and with SIT in-situ measurements in order to better understand the differences and limitations, and possibly give recommendations on how to best profit from the synergy of the different data.

  9. Effects of climate model interdependency on the uncertainty quantification of extreme rainfall projections

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia; Madsen, H.; Rosbjerg, Dan;

    The inherent uncertainty in climate models is one of the most important uncertainties in climate change impact studies. In recent years, several uncertainty quantification methods based on multi-model ensembles have been suggested. Most of these methods assume that the climate models are independent. This study investigates the validity of this assumption and its effects on the estimated probabilistic projections of the changes in the 95% quantile of wet days. The methodology is divided in two main parts. First, the interdependency of the ENSEMBLES RCMs is estimated using the methodology developed by Pennell and Reichler (2011). The results show that the projections from the ENSEMBLES RCMs cannot be assumed independent. This result is then used to estimate the uncertainty in climate model projections. A Bayesian approach has been developed using the procedure suggested by Tebaldi et al.

  10. First approximations in avalanche model validations using seismic information

    Science.gov (United States)

    Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty

    2017-04-01

    Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of input (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated using data from real phenomena, to improve performance and reliability. The avalanche group from the University of Barcelona (RISKNAT - UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis alone can be used, on additional avalanche tracks, to gain insight into and validate numerical avalanche dynamics models under different terrain conditions. In this study, we aim to use the seismic data as an external record of the phenomena, able to validate RAMMS models. Seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data in all atmospheric conditions (e.g. bad weather, low light or freezing, which render photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position

  11. Modeling and Validation of Sodium Plugging for Heat Exchangers in Sodium-cooled Fast Reactor Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ferroni, Paolo [Westinghouse Electric Company LLC, Cranberry Township, PA (United States). Global Technology Development]; Tatli, Emre [Westinghouse Electric Company LLC, Cranberry Township, PA (United States)]; Czerniak, Luke [Westinghouse Electric Company LLC, Cranberry Township, PA (United States)]; Sienicki, James J. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Chien, Hual-Te [Argonne National Lab. (ANL), Argonne, IL (United States)]; Yoichi, Momozaki [Argonne National Lab. (ANL), Argonne, IL (United States)]; Bakhtiari, Sasan [Argonne National Lab. (ANL), Argonne, IL (United States)]

    2016-06-29

    The projectModeling and Validation of Sodium Plugging for Heat Exchangers in Sodium-cooled Fast Reactor Systems” was conducted jointly by Westinghouse Electric Company (Westinghouse) and Argonne National Laboratory (ANL), over the period October 1, 2013- March 31, 2016. The project’s motivation was the need to provide designers of Sodium Fast Reactors (SFRs) with a validated, state-of-the-art computational tool for the prediction of sodium oxide (Na2O) deposition in small-diameter sodium heat exchanger (HX) channels, such as those in the diffusion bonded HXs proposed for SFRs coupled with a supercritical CO2 (sCO2) Brayton cycle power conversion system. In SFRs, Na2O deposition can potentially occur following accidental air ingress in the intermediate heat transport system (IHTS) sodium and simultaneous failure of the IHTS sodium cold trap. In this scenario, oxygen can travel through the IHTS loop and reach the coldest regions, represented by the cold end of the sodium channels of the HXs, where Na2O precipitation may initiate and continue. In addition to deteriorating HX heat transfer and pressure drop performance, Na2O deposition can lead to channel plugging especially when the size of the sodium channels is small, which is the case for diffusion bonded HXs whose sodium channel hydraulic diameter is generally below 5 mm. Sodium oxide melts at a high temperature well above the sodium melting temperature such that removal of a solid plug such as through dissolution by pure sodium could take a lengthy time. The Sodium Plugging Phenomena Loop (SPPL) was developed at ANL, prior to this project, for investigating Na2O deposition phenomena within sodium channels that are prototypical of the diffusion bonded HX channels envisioned for SFR-sCO2 systems. In this project, a Computational Fluid Dynamic (CFD) model capable of simulating the thermal-hydraulics of the SPPL test

  12. Case study for model validation : assessing a model for thermal decomposition of polyurethane foam.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Leslie, Ian H. (New Mexico State University, Las Cruces, NM); Hobbs, Michael L.; Rutherford, Brian Milne; Hills, Richard Guy (New Mexico State University, Las Cruces, NM); Pilch, Martin M.

    2004-10-01

    A case study is reported to document the details of a validation process for assessing the accuracy of a mathematical model that represents experiments involving thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process. The process addresses the following activities. The intended application of the mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, the computer code to solve the models and its (code) verification are presented. Experimental data from two activities are used to validate the mathematical models. The first experiment assesses the chemistry model alone and the second experiment assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results of both experimental activities are summarized and the uncertainty of the model to represent each experimental activity is estimated. The comparison between the experimental data and model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given. Weaknesses in the process are discussed and lessons learned are summarized.

  13. GEOCHEMICAL RECOGNITION OF SPILLED SEDIMENTS USED IN NUMERICAL MODEL VALIDATION

    Institute of Scientific and Technical Information of China (English)

    Jens R.VALEUR; Steen LOMHOLT; Christian KNUDSEN

    2004-01-01

    A fixed link (tunnel and bridge, in total 16 km) was constructed between Sweden and Denmark during 1995-2000. As part of the work, approximately 16 million tonnes of seabed materials (limestone and clay till) were dredged, and about 0.6 million tonnes of these were spilled in the water. Modelling of the spreading and sedimentation of the spilled sediments took place as part of the environmental monitoring of the construction activities. In order to verify the results of the numerical modelling of sediment spreading and sedimentation, a new method was developed to distinguish between the spilled sediments and the naturally occurring sediments. Because the spilled sediments tend to accumulate at the seabed in areas with natural sediments of the same size, it is difficult to separate these based purely on physical properties. The new method is based on the geochemical differences between the natural sediment in the area and the spill. The basic properties used are the higher content of calcium carbonate material in the spill as compared to the natural sediments, and the higher Ca/Sr ratio in the spill compared to the shell fragments dominating the natural calcium carbonate deposition in the area. The reason for these differences is that carbonate derived from recent shell debris can be discriminated from Danian limestone, in which the majority of the dredging took place, on the basis of the Ca/Sr ratio: 488 in Danian limestone versus 237 in shell debris. The geochemical recognition of the origin of the sediments proved useful in separating the spilled from the naturally occurring sediments. Without this separation, validation of the modelling of the accumulation of spilled sediments would not have been possible. The method has general validity and can be used in many situations where the origin of a given sediment is sought.
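
    A minimal sketch of how the reported endmember ratios could translate into a spill fraction for a measured sample, assuming simple linear mixing of Sr/Ca per unit of carbonate Ca; the mixing assumption is this sketch's, and the study's actual procedure may differ.

```python
# Reported Ca/Sr endmembers: Danian limestone (spill) and shell debris.
R_SPILL, R_SHELL = 488.0, 237.0

def spill_fraction(r_measured):
    """Fraction of carbonate Ca attributable to spill, from measured Ca/Sr.

    Sr/Ca mixes linearly with the Ca-weighted fraction f:
      1/R = f/R_spill + (1 - f)/R_shell
    """
    inv = 1.0 / r_measured
    f = (inv - 1.0 / R_SHELL) / (1.0 / R_SPILL - 1.0 / R_SHELL)
    return min(max(f, 0.0), 1.0)  # clamp to the physical range

for r in (250.0, 350.0, 450.0):
    print(f"Ca/Sr = {r:.0f} -> spill fraction ~ {spill_fraction(r):.2f}")
```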

  14. Projecting future expansion of invasive species: comparing and improving methodologies for species distribution modeling.

    Science.gov (United States)

    Mainali, Kumar P; Warren, Dan L; Dhileepan, Kunjithapatham; McConnachie, Andrew; Strathie, Lorraine; Hassan, Gul; Karki, Debendra; Shrestha, Bharat B; Parmesan, Camille

    2015-12-01

    Modeling the distributions of species, especially of invasive species in non-native ranges, involves multiple challenges. Here, we developed some novel approaches to species distribution modeling aimed at reducing the influences of such challenges and improving the realism of projections. We estimated species-environment relationships for Parthenium hysterophorus L. (Asteraceae) with four modeling methods run with multiple scenarios of (i) sources of occurrences and geographically isolated background ranges for absences, (ii) approaches to drawing background (absence) points, and (iii) alternate sets of predictor variables. We further tested various quantitative metrics of model evaluation against biological insight. Model projections were very sensitive to the choice of training dataset. Model accuracy was much improved using a global dataset for model training, rather than restricting data input to the species' native range. AUC score was a poor metric for model evaluation and, if used alone, was not a useful criterion for assessing model performance. Projections away from the sampled space (i.e., into areas of potential future invasion) were very different depending on the modeling methods used, raising questions about the reliability of ensemble projections. Generalized linear models gave very unrealistic projections far away from the training region. Models that efficiently fit the dominant pattern, but exclude highly local patterns in the dataset and capture interactions as they appear in data (e.g., boosted regression trees), improved generalization of the models. Biological knowledge of the species and its distribution was important in refining choices about the best set of projections. A post hoc test conducted on a new Parthenium dataset from Nepal validated excellent predictive performance of our 'best' model. We showed that vast stretches of currently uninvaded geographic areas on multiple continents harbor highly suitable habitats for parthenium.

  15. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  17. Flyover Modeling of Planetary Pits - Undergraduate Student Instrument Project

    Science.gov (United States)

    Bhasin, N.; Whittaker, W.

    2015-12-01

    On the surfaces of the Moon and Mars there are hundreds of skylights, which are collapsed holes believed to lead to underground caves. This research uses vision, inertial, and LIDAR sensors to build a high-resolution model of a skylight as a landing vehicle flies overhead. We design and fabricate a pit modeling instrument to accomplish this task, implement software, and demonstrate sensing and modeling capability on a suborbital reusable launch vehicle flying over a simulated pit. Future missions on other planets and moons will explore pits and caves, led by the technology developed by this research. The sensor software utilizes modern graph-based optimization techniques to build 3D models using camera, LIDAR, and inertial data. The modeling performance was validated with a test flyover of a planetary skylight analog structure on the Masten Xombie sRLV. The trajectory profile closely follows that of autonomous planetary powered descent, including translational and rotational dynamics as well as shock and vibration. A hexagonal structure made of shipping containers provides a terrain feature that serves as an appropriate analog for the rim and upper walls of a cylindrical planetary skylight. The skylight analog floor, walls, and rim are modeled in elevation with a 96% coverage rate at 0.25 m^2 resolution. The inner skylight walls have 5.9 cm^2 color image resolution and the rims 6.7 cm^2, with measurement precision better than 1 m. The multidisciplinary student team included students of all experience levels, with backgrounds in robotics, physics, computer science, systems, mechanical and electrical engineering. The team was committed to authentic scientific experimentation, and defined specific instrument requirements and measurable experiment objectives to verify successful completion. This work was made possible by the NASA Undergraduate Student Instrument Project Educational Flight Opportunity 2013 program. Additional support was provided by the sponsorship of an

  18. Use of international data sets to evaluate and validate pathway assessment models applicable to exposure and dose reconstruction at DOE facilities. Progress report, August 1993--January 1994

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickson, S.M. [ed.] [Lawrence Livermore National Lab., CA (United States)]; Hoffman, F.O. [Senes Oak Ridge, Inc., TN (United States). Center for Risk Analysis]

    1994-03-01

    This project, "Use of International Data Sets to Evaluate and Validate Pathway Assessment Models Applicable to Exposure and Dose Reconstruction at DOE Facilities," grew out of several activities being conducted by the Principal Investigator, Dr. F. Owen Hoffman. One activity was originally part of the Chernobyl Studies Project and began as Task 7.1D, "Internal Dose From Direct Contamination of Terrestrial Food Sources." The objective of Task 7.1D was to (1) establish a collaborative US-USSR effort to improve and validate our methods of forecasting doses and dose commitments from the direct contamination of food sources, and (2) perform experiments and validation studies to improve our ability to predict rapidly and accurately the long-term internal dose from the contamination of agricultural soil. The latter was to include the consideration of remedial measures to block contamination of food grown on contaminated soil. The current objective of this project is to evaluate and validate pathway-assessment models applicable to exposure and dose reconstruction at DOE facilities through the use of international data sets. This project incorporates the activity of Task 7.1D into a multinational effort to evaluate data used for the prediction of radionuclide transfer through agricultural and aquatic systems to humans. It also includes participation in two multinational studies that address testing the performance of models of radionuclide transport through foodchains: BIOMOVS (BIOspheric MOdel Validation Study) with the Swedish National Institute for Radiation Protection and VAMP (VAlidation of Model Predictions) with the International Atomic Energy Agency.

  19. Modelling and validation of spectral reflectance for the colon

    Science.gov (United States)

    Hidovic-Rowe, Dzena; Claridge, Ela

    2005-03-01

    The spectral reflectance of the colon is known to be affected by malignant and pre-malignant changes in the tissue. As part of long-term research on the derivation of diagnostically important parameters characterizing colon histology, we have investigated the effects of the normal histological variability on the remitted spectra. This paper presents a detailed optical model of the normal colon comprising mucosa, submucosa and the smooth muscle layer. Each layer is characterized by five variable histological parameters: the volume fraction of blood, the haemoglobin saturation, the size of the scattering particles, including collagen, the volume fraction of the scattering particles and the layer thickness, and three optical parameters: the anisotropy factor, the refractive index of the medium and the refractive index of the scattering particles. The paper specifies the parameter ranges corresponding to normal colon tissue, including some previously unpublished ones. Diffuse reflectance spectra were modelled using the Monte Carlo method. Validation of the model-generated spectra against measured spectra demonstrated that good correspondence was achieved between the two. The analysis of the effect of the individual histological parameters on the behaviour of the spectra has shown that the spectral variability originates mainly from changes in the mucosa. However, the submucosa and the muscle layer must be included in the model as they have a significant constant effect on the spectral reflectance above 600 nm. The nature of variations in the spectra also suggests that it may be possible to carry out model inversion and to recover parameters characterizing the colon from multi-spectral images. A preliminary study, in which the mucosal blood and collagen parameters were modified to reflect histopathological changes associated with colon cancer, has shown that the spectra predicted by our model resemble measured spectral reflectance of adenocarcinomas. This suggests that

  20. PIV validation of blood-heart valve leaflet interaction modelling.

    Science.gov (United States)

    Kaminsky, R; Dumont, K; Weber, H; Schroll, M; Verdonck, P

    2007-07-01

    The aim of this study was to validate the 2D computational fluid dynamics (CFD) results of a moving heart valve, based on a fluid-structure interaction (FSI) algorithm, against experimental measurements. First, pulsatile laminar flow through a monoleaflet valve model with a stiff leaflet was visualized by means of Particle Image Velocimetry (PIV). The inflow data sets were applied to a CFD simulation including blood-leaflet interaction. The measurement section with a fixed leaflet was enclosed into a standard mock loop in series with a Harvard Apparatus Pulsatile Blood Pump, a compliance chamber and a reservoir. Standard 2D PIV measurements were made at a frequency of 60 bpm. Average velocity magnitude results of 36 phase-locked measurements were evaluated at every 10 degrees of the pump cycle. For the CFD flow simulation, a commercially available package from Fluent Inc. was used in combination with in-house developed FSI code based on the Arbitrary Lagrangian-Eulerian (ALE) method. The CFD code was then applied to the leaflet to quantify the shear stress on it. Generally, the CFD results are in agreement with the PIV evaluated data in major flow regions, thereby validating the FSI simulation of a monoleaflet valve with a flexible leaflet. The applicability of the new CFD code for quantifying the shear stress on a flexible leaflet is thus demonstrated.

  1. Modeling simulation and experimental validation for mold filling process

    Institute of Scientific and Technical Information of China (English)

    HOU Hua; JU Dong-ying; MAO Hong-kui; D. SAITO

    2006-01-01

    Based on the continuity equation and the momentum and energy conservation equations, a numerical model of turbulent flow during mold filling was introduced, and the 3-D free-surface VOF method was improved. To establish whether the numerical simulation results are reasonable, corresponding experimental validation is needed. General experimental techniques for casting fluid flow include the thermocouple tracking location method, the hydraulic simulation method, the heat-resistant glass window method, and X-ray observation, etc. A hydraulic analogue experiment with the DPIV technique was arranged to visually validate the fluid-flow program for low-pressure casting at filling pressures of 0.1×10^5 Pa and 0.6×10^5 Pa. By comparing the flow head, liquid surface and flow velocity, it is found that the filling pressure strongly influences the flow state: with increasing filling pressure, the fluid flow becomes unstable, the flow head becomes higher, and the filling time is reduced. The simulated results agree approximately with the observed results, further supporting the validity of the numerical program for the filling process.

  2. Validation of ice loads predicted from meteorological models

    Energy Technology Data Exchange (ETDEWEB)

    Veal, A.; Skea, A. [UK Met Office, Exeter, England (United Kingdom)]; Wareing, B. [Brian Wareing Tech Ltd., England (United Kingdom)]

    2005-07-01

    Results of a field trial conducted on two Gerber PVM-100 instruments at the Deadwater Fell test site in the United Kingdom were presented. The trials were conducted to assess whether the instruments were capable of measuring the liquid water content of the air, as well as to validate an ice model in terms of accretion rates on different sized conductors. Ambient air temperature, wind speed and direction were monitored at the Deadwater Fell weather station along with load cell values. Time lapse video recorders and a web camera system were used to view the performance of the conductors in varying weather conditions. All data was collected and stored at the site. It was anticipated that output from the instruments could be related to the conditions under which overhead line conductors suffer from ice loads, and help to revise weather maps which have proved to be incompatible with utility experience and the lifetimes achieved by overhead line designs. The data provided from the Deadwater work included logged data from the Gerbers, weather data and load data from a 10 mm diameter aluminium alloy conductor. When the combination of temperature, wind direction and Gerber output indicated icing conditions, they were confirmed by the conductor's load cell data. The tests confirmed the validity of the Gerber instruments to predict the occurrence of icing conditions, when combined with other meteorological data. It was concluded that the instruments may aid in optimized prediction methods for ice loads and icing events. 2 refs., 4 figs.

  3. Validation of elastic cross section models for space radiation applications

    Science.gov (United States)

    Werneth, C. M.; Xu, X.; Norman, R. B.; Ford, W. P.; Maung, K. M.

    2017-02-01

    The space radiation field is composed of energetic particles that pose both acute and long-term risks for astronauts in low earth orbit and beyond. In order to estimate radiation risk to crew members, the fluence of particles and biological response to the radiation must be known at tissue sites. Given that the spectral fluence at the boundary of the shielding material is characterized, radiation transport algorithms may be used to find the fluence of particles inside the shield and body, and the radio-biological response is estimated from experiments and models. The fidelity of the radiation spectrum inside the shield and body depends on radiation transport algorithms and the accuracy of the nuclear cross sections. In a recent study, self-consistent nuclear models based on multiple scattering theory that include the option to study relativistic kinematics were developed for the prediction of nuclear cross sections for space radiation applications. The aim of the current work is to use uncertainty quantification to ascertain the validity of the models as compared to a nuclear reaction database and to identify components of the models that can be improved in future efforts.

  4. Development and validation of a railgun hydrogen pellet injector model

    Energy Technology Data Exchange (ETDEWEB)

    King, T.L. [Univ. of Houston, TX (United States). Dept. of Electrical and Computer Engineering]; Zhang, J.; Kim, K. [Univ. of Illinois, Urbana, IL (United States). Dept. of Electrical and Computer Engineering]

    1995-12-31

    A railgun hydrogen pellet injector model is presented and its predictions are compared with the experimental data. High-speed hydrogenic ice injection is the dominant refueling method for magnetically confined plasmas used in controlled thermonuclear fusion research. As experimental devices approach the scale of power-producing fusion reactors, the fueling requirements become increasingly more difficult to meet since, due to the large size and the high electron densities and temperatures of the plasma, hypervelocity pellets of a substantial size will need to be injected into the plasma continuously and at high repetition rates. Advanced technologies, such as the railgun pellet injector, are being developed to address this demand. Despite the apparent potential of electromagnetic launchers to produce hypervelocity projectiles, physical effects that were neither anticipated nor well understood have made it difficult to realize this potential. Therefore, it is essential to understand not only the theory behind railgun operation, but the primary loss mechanisms, as well. Analytic tools have been used by many researchers to design and optimize railguns and analyze their performance. This has led to a greater understanding of railgun behavior and opened the door for further improvement. A railgun hydrogen pellet injector model has been developed. The model is based upon a pellet equation of motion that accounts for the dominant loss mechanisms, inertial and viscous drag. The model has been validated using railgun pellet injectors developed by the Fusion Technology Research Laboratory at the University of Illinois at Urbana-Champaign.
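
    A minimal sketch of such an equation of motion, integrating the standard railgun accelerating force L'I^2/2 against inertial (quadratic) and viscous (linear) drag terms; the force-law details and all parameter values below are hypothetical placeholders, not the validated injector model.

```python
from scipy.integrate import solve_ivp

# Hypothetical placeholders; actual injector parameters differ.
M = 3e-6     # pellet mass (kg)
LP = 0.4e-6  # rail inductance gradient L' (H/m)
I0 = 20e3    # drive current (A), held constant here
C_I = 1e-4   # inertial drag coefficient (kg/m)
C_V = 2e-3   # viscous drag coefficient (kg/s)

def rhs(t, y):
    x, v = y
    f_em = 0.5 * LP * I0**2        # accelerating force L' I^2 / 2
    f_drag = C_I * v**2 + C_V * v  # inertial + viscous losses
    return [v, (f_em - f_drag) / M]

sol = solve_ivp(rhs, (0.0, 2e-3), [0.0, 0.0], max_step=1e-5)
print(f"velocity after 2 ms: {sol.y[1, -1]:.0f} m/s")
```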

  5. Using of Structural Equation Modeling Techniques in Cognitive Levels Validation

    Directory of Open Access Journals (Sweden)

    Natalija Curkovic

    2012-10-01

    Full Text Available When constructing knowledge tests, cognitive level is usually one of the dimensions comprising the test specifications, with each item assigned to measure a particular level. Recently used taxonomies of cognitive levels most often represent some modification of the original Bloom's taxonomy. There are many concerns in the current literature about the existence of predefined cognitive levels. The aim of this article is to investigate whether structural equation modeling techniques can confirm the existence of different cognitive levels. For the purpose of the research, a Croatian final high-school Mathematics exam was used (N = 9626). Confirmatory factor analysis and structural regression modeling were used to test three different models. Structural equation modeling techniques did not support the existence of different cognitive levels in this case. There is more than one possible explanation for that finding. Some other techniques that take into account nonlinear behaviour of the items, as well as qualitative techniques, might be more useful for the purpose of cognitive levels validation. Furthermore, it seems that cognitive levels were not efficient descriptors of the items, and so improvements are needed in describing the cognitive skills measured by items.

  6. Development, validation and application of numerical space environment models

    Science.gov (United States)

    Honkonen, Ilja

    2013-10-01

    Currently the majority of space-based assets are located inside the Earth's magnetosphere where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of the near-Earth space, similarly to ordinary weather which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed memory hardware is also becoming crucial. This thesis consists of an introduction, four peer reviewed articles and describes the process of developing numerical space environment/weather models and the use of such models to study the near-Earth space. A complete model development chain is presented starting from initial planning and design to distributed memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the

  7. Worksheet of Exogenous Variables that Impact the Success of Validation Stage of Product Delivery of a Project

    Directory of Open Access Journals (Sweden)

    Altino José Mentzingen de Moraes

    2015-02-01

    Full Text Available In the theory presented by the PMBoK© - Project Management Body of Knowledge - Project Management [1] features, among others, the Disciplines of Scope and Quality. The Discipline of Scope, to become effective, must provide for the execution of a hypothetical step (not defined in its text but perceived by the present author of this paper) that can be termed the Planning Stage of Project Result. The Discipline of Quality, to become effective, must provide for the execution of an equally hypothetical step (also not defined in its text but also perceived by the same author) that can be termed the Validation Stage of Product Delivery. The importance of both Stages is crucial to

  8. Bayesian-based Project Monitoring: Framework Development and Model Testing

    Directory of Open Access Journals (Sweden)

    Budi Hartono

    2015-12-01

    Full Text Available During project implementation, risk becomes an integral part of project monitoring. Therefore, a tool that can dynamically include elements of risk in project progress monitoring is needed. The objective of this study is to develop a general framework that addresses this concern. The developed framework consists of three interrelated major building blocks, namely the Risk Register (RR), a Bayesian Network (BN), and the Project Time Network (PTN), for dynamic project monitoring. The RR is used to list and categorize identified project risks. The PTN is utilized for modeling the relationships between project activities. The BN is used to reflect the interdependence among risk factors and to bridge the RR and PTN. A residential development project is chosen as a working example, and the result shows that the proposed framework can be successfully applied. The specific model of the development project is also successfully developed and used to monitor the project progress. It is shown in this study that the proposed BN-based model provides superior performance in terms of forecast accuracy compared to the extant models.
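
    To illustrate the bridge between the Risk Register and monitoring, here is a minimal two-node Bayesian update (risk factor -> delay evidence) with hypothetical probabilities; the paper's BN is far richer, but the inference step is the same Bayes' rule.

```python
# Hypothetical prior and conditional probability table.
P_RISK = 0.3                              # prior P(risk factor active)
P_DELAY_GIVEN = {True: 0.8, False: 0.2}   # P(activity delay | risk state)

def posterior_risk(evidence_delay: bool) -> float:
    """P(risk | observed delay evidence) via Bayes' rule."""
    p_e_risk = P_DELAY_GIVEN[True] if evidence_delay else 1 - P_DELAY_GIVEN[True]
    p_e_norisk = P_DELAY_GIVEN[False] if evidence_delay else 1 - P_DELAY_GIVEN[False]
    num = p_e_risk * P_RISK
    return num / (num + p_e_norisk * (1 - P_RISK))

print(posterior_risk(True))   # belief in the risk after a delayed activity
print(posterior_risk(False))  # belief in the risk after an on-time activity
```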

  9. Final Technical Report: Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Grasman

    2011-12-31

    This report summarizes the work conducted under U.S. Department of Energy (DOE) contract DE-FC36-04GO14285 by Mercedes-Benz Research & Development North America (MBRDNA), Chrysler, Daimler, Mercedes-Benz USA (MBUSA), BP, DTE Energy and NextEnergy to validate fuel cell technologies for infrastructure and transportation, as well as to assess technology and commercial readiness for the market. The Mercedes team, together with its partners, tested the technology by operating and fueling hydrogen fuel cell vehicles under real-world conditions in varying climate, terrain and driving conditions. Vehicle and infrastructure data were collected to monitor progress toward the hydrogen vehicle and infrastructure performance targets of $2.00 to $3.00/gge hydrogen production cost and 2,000-hour fuel cell durability. Finally, to prepare the public for a hydrogen economy, outreach activities were designed to promote awareness and acceptance of hydrogen technology. DTE, BP and NextEnergy established hydrogen filling stations using multiple technologies for on-site hydrogen generation, storage and dispensing. DTE established a hydrogen station in Southfield, Michigan, while NextEnergy and BP worked together to construct one hydrogen station in Detroit. BP constructed another fueling station in Burbank, California, provided a full-time hydrogen trailer in San Francisco, California, and operated a hydrogen station located at Los Angeles International Airport in Southern California. Stations were operated between 2005 and 2011. The team deployed 30 Gen I fuel cell vehicles (FCVs) at the beginning of the project. While 28 Gen I F-CELLs used the A-Class platform, the remaining 2 were Sprinter delivery vans. Fuel cell vehicles were operated by external customers for real-world operations in various regions (ecosystems) to capture various driving patterns and climate conditions (hot, moderate and cold). External operators consisted of F-CELL partner organizations in California and Michigan

  10. MASADA: A Modeling and Simulation Automated Data Analysis framework for continuous data-intensive validation of simulation models

    CERN Document Server

    Foguelman, Daniel Jacob; The ATLAS collaboration

    2016-01-01

    Complex networked computer systems are usually subjected to upgrades and enhancements on a continuous basis. Modeling and simulation of such systems helps with guiding their engineering processes, in particular when testing candidate design alternatives directly on the real system is not an option. Models are built and simulation exercises are run guided by specific research and/or design questions. A vast amount of operational conditions for the real system need to be assumed in order to focus on the relevant questions at hand. A typical boundary condition for computer systems is the exogenously imposed workload. Meanwhile, in typical projects huge amounts of monitoring information are logged and stored with the purpose of studying the system’s performance in search for improvements. Also, research questions change as systems’ operational conditions vary throughout their lifetimes. This context poses many challenges to determine the validity of simulation models. As the behavioral empirical base of the sys...

  12. Use of Synchronized Phasor Measurements for Model Validation in ERCOT

    Science.gov (United States)

    Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill

    2013-05-01

    This paper discusses experiences in the use of synchronized phasor measurement technology in the Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMUs) have been installed in the ERCOT grid in recent years, phasor data at a resolution of 30 samples per second are being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. The real-time monitoring software RTDMS® enables ERCOT to analyze small-signal stability conditions by monitoring phase angles and oscillations. The recorded phasor data enable ERCOT to validate the existing dynamic models of conventional and wind generators.
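    A minimal sketch of the kind of oscillation monitoring described, assuming only the 30 samples-per-second reporting rate from the abstract; the signal parameters are invented and this is not the RTDMS software:

```python
# A minimal sketch (not RTDMS): estimate the dominant oscillation frequency in
# a synthesized phase-angle record sampled at the 30 samples/s PMU rate.
import numpy as np

fs = 30.0                                  # PMU reporting rate, samples/s
t = np.arange(0, 20, 1 / fs)               # 20-second window
# Hypothetical lightly damped 0.7 Hz inter-area mode plus measurement noise
angle = 5.0 * np.exp(-0.05 * t) * np.sin(2 * np.pi * 0.7 * t) \
        + 0.1 * np.random.default_rng(0).standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(angle - angle.mean()))
freqs = np.fft.rfftfreq(angle.size, d=1 / fs)
print(f"dominant oscillation ~ {freqs[spectrum.argmax()]:.2f} Hz")  # expect ~0.7
```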

  13. Psychometric instrumentation: reliability and validity of instruments used for clinical practice, evidence-based practice projects and research studies.

    Science.gov (United States)

    Mayo, Ann M

    2015-01-01

    It is important for CNSs and other APNs to consider the reliability and validity of instruments chosen for clinical practice, evidence-based practice projects, or research studies. Psychometric testing uses specific research methods to evaluate the amount of error associated with any particular instrument. Reliability estimates explain more about how well the instrument is designed, whereas validity estimates explain more about scores that are produced by the instrument. An instrument may be architecturally sound overall (reliable), but the same instrument may not be valid. For example, if a specific group does not understand certain well-constructed items, then the instrument does not produce valid scores when used with that group. Many instrument developers may conduct reliability testing only once, yet continue validity testing in different populations over many years. All CNSs should be advocating for the use of reliable instruments that produce valid results. Clinical nurse specialists may find themselves in situations where reliability and validity estimates for some instruments that are being utilized are unknown. In such cases, CNSs should engage key stakeholders to sponsor nursing researchers to pursue this most important work.
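    For illustration, one widely used reliability estimate is Cronbach's alpha; the abstract does not prescribe this statistic, so the sketch below is a generic example with fabricated item scores:

```python
# A minimal sketch of one common internal-consistency reliability estimate
# (Cronbach's alpha). The respondent-by-item scores below are fabricated.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

scores = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```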

  14. From deep TLS validation to ensembles of atomic models built from elemental motions

    Energy Technology Data Exchange (ETDEWEB)

    Urzhumtsev, Alexandre, E-mail: sacha@igbmc.fr [Centre for Integrative Biology, Institut de Génétique et de Biologie Moléculaire et Cellulaire, CNRS–INSERM–UdS, 1 Rue Laurent Fries, BP 10142, 67404 Illkirch (France); Université de Lorraine, BP 239, 54506 Vandoeuvre-les-Nancy (France); Afonine, Pavel V. [Lawrence Berkeley National Laboratory, Berkeley, California (United States); Van Benschoten, Andrew H.; Fraser, James S. [University of California, San Francisco, San Francisco, CA 94158 (United States); Adams, Paul D. [Lawrence Berkeley National Laboratory, Berkeley, California (United States); University of California Berkeley, Berkeley, CA 94720 (United States); Centre for Integrative Biology, Institut de Génétique et de Biologie Moléculaire et Cellulaire, CNRS–INSERM–UdS, 1 Rue Laurent Fries, BP 10142, 67404 Illkirch (France)

    2015-07-28

    Procedures are described for extracting the vibration and libration parameters corresponding to a given set of TLS matrices and their simultaneous validation. Knowledge of these parameters allows the generation of structural ensembles corresponding to these matrices. The translation–libration–screw model first introduced by Cruickshank, Schomaker and Trueblood describes the concerted motions of atomic groups. Using TLS models can improve the agreement between calculated and experimental diffraction data. Because the T, L and S matrices describe a combination of atomic vibrations and librations, TLS models can also potentially shed light on molecular mechanisms involving correlated motions. However, this use of TLS models in mechanistic studies is hampered by the difficulties in translating the results of refinement into molecular movement or a structural ensemble. To convert the matrices into a constituent molecular movement, the matrix elements must satisfy several conditions. Refining the T, L and S matrix elements as independent parameters without taking these conditions into account may result in matrices that do not represent concerted molecular movements. Here, a mathematical framework and the computational tools to analyze TLS matrices, resulting in either explicit decomposition into descriptions of the underlying motions or a report of broken conditions, are described. The description of valid underlying motions can then be output as a structural ensemble. All methods are implemented as part of the PHENIX project.
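    For reference, the commonly cited relation between a group's TLS matrices and an individual atom's anisotropic displacement matrix can be written as follows; notation and sign conventions vary between papers, so treat this as a sketch rather than the authors' exact formulation:

```latex
% Sketch: relation between group TLS matrices and the anisotropic
% displacement matrix U of an atom at r = (x, y, z) relative to the TLS origin.
\[
  U_{\mathrm{TLS}} \;=\; T + A\,L\,A^{\mathsf{T}} + A\,S + S^{\mathsf{T}}A^{\mathsf{T}},
  \qquad
  A =
  \begin{pmatrix}
    0 & z & -y \\
    -z & 0 & x \\
    y & -x & 0
  \end{pmatrix},
\]
% so a physically meaningful decomposition requires, e.g., a positive
% semi-definite L; refining T, L, S element-wise can break such conditions.
```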

  15. Black liquor combustion validated recovery boiler modeling: Final year report. Volume 4 (Appendix IV)

    Energy Technology Data Exchange (ETDEWEB)

    Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

    1998-08-01

    This project was initiated in October 1990, with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The key tasks to be accomplished were as follows: (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes. (2) Validate the enhanced furnace models, so that users can have confidence in the predicted results. (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler. (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the US kraft pulp industry. Volume 4 contains the following appendix sections: Radiative heat transfer properties for black liquor combustion -- Facilities and techniques, and Spectral absorbance and emittance data; and Radiative heat transfer determination of the optical constants of ash samples from kraft recovery boilers -- Calculation procedure; Computation program; Density determination; Particle diameter determination; Optical constant data; and Uncertainty analysis.

  16. Mitigation of turbidity currents in reservoirs with passive retention systems: validation of CFD modeling

    Science.gov (United States)

    Ferreira, E.; Alves, E.; Ferreira, R. M. L.

    2012-04-01

    Sediment deposition by continuous turbidity currents may affect eco-environmental river dynamics in natural reservoirs and hinder the maneuverability of bottom discharge gates in dam reservoirs. In recent years, innovative techniques have been proposed to induce the deposition of sediment further upstream in the reservoir (and away from the dam), namely the use of solid and permeable obstacles such as water jet screens and geotextile screens. The main objective of this study is to validate a computational fluid dynamics (CFD) code applied to the simulation of the interaction between a turbidity current and a passive retention system designed to induce sediment deposition. To accomplish the proposed objective, laboratory tests were conducted in which a simple obstacle configuration was subjected to the passage of currents with different initial sediment concentrations. The experimental data were used to build benchmark cases to validate the 3D CFD software ANSYS-CFX. Sensitivity tests of mesh design, turbulence models and discretization requirements were performed. The validation consisted of comparing experimental and numerical results, involving instantaneous and time-averaged sediment concentrations and velocities. In general, good agreement between the numerical and the experimental values is achieved when: i) realistic outlet conditions are specified, ii) channel roughness is properly calibrated, iii) two-equation k-ε models are employed, and iv) a fine mesh is employed near the bottom boundary. Acknowledgements: This study was funded by the Portuguese Foundation for Science and Technology through the project PTDC/ECM/099485/2008. The first author thanks Professor Moitinho de Almeida from ICIST for his assistance, and all members of the project and of the Fluvial Hydraulics group of CEHIDRO.

  17. THM Model Validation: Integrated Assessment of Measured and Predicted Behavior

    Energy Technology Data Exchange (ETDEWEB)

    Blair, S C; Carlson, S R; Wagoner, J; Wagner, R; Vogt, T

    2001-10-10

    This paper presents results of coupled thermal-hydrological-mechanical (THM) simulations of two field-scale tests that are part of the thermal testing program being conducted by the Yucca Mountain Site Characterization Project. The two tests analyzed are the Drift-Scale Test (DST) which is sited in an alcove of the Exploratory Studies Facility at Yucca Mountain, Nevada, and the Large Block Test (LBT) which is sited at Fran Ridge, near Yucca Mountain, Nevada. Both of these tests were designed to investigate coupled thermal-mechanical-hydrological-chemical (TMHC) behavior in a fractured, densely welded ash-flow tuff. The geomechanical response of the rock mass forming the DST and the LBT is analyzed using a coupled THM model. A coupled model for analysis of the DST and LBT has been formulated by linking the 3DEC distinct element code for thermal-mechanical analysis and the NUFT finite element code for thermal-hydrologic analysis. The TH model (NUFT) computes temperatures at preselected times using a model that extends from the surface to the water table. The temperatures computed by NUFT are input to 3DEC, which then computes stresses and deformations. The distinct element method was chosen to permit the inclusion of discrete fractures and explicit modeling of fracture deformations. Shear deformations and normal mode opening of fractures are expected to increase fracture permeability and thereby alter thermal hydrologic behavior in these tests. We have collected fracture data for both the DST and the LBT and have used these data in the formulation of the model of the test. This paper presents a brief discussion of the model formulation, along with comparison of simulated and observed deformations at selected locations within the tests.

  18. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  19. Validation of the dermal exposure model in ECETOC TRA.

    Science.gov (United States)

    Marquart, Hans; Franken, Remy; Goede, Henk; Fransman, Wouter; Schinkel, Jody

    2017-08-01

    The ECETOC TRA model (presently version 3.1) is often used to estimate worker inhalation and dermal exposure in regulatory risk assessment. The dermal model in ECETOC TRA has not yet been validated by comparison with independent measured exposure levels. This was the goal of the present study. Measured exposure levels and relevant contextual information were gathered via literature search, websites of relevant occupational health institutes and direct requests for data to industry. Exposure data were clustered in so-called exposure cases, which are sets of data from one data source that are expected to have the same values for input parameters in the ECETOC TRA dermal exposure model. For each exposure case, the 75th percentile of measured values was calculated, because the model intends to estimate these values. The input values for the parameters in ECETOC TRA were assigned by an expert elicitation and consensus building process, based on descriptions of relevant contextual information. From more than 35 data sources, 106 useful exposure cases were derived, that were used for direct comparison with the model estimates. The exposure cases covered a large part of the ECETOC TRA dermal exposure model. The model explained 37% of the variance in the 75th percentiles of measured values. In around 80% of the exposure cases, the model estimate was higher than the 75th percentile of measured values. In the remaining exposure cases, the model estimate may not be sufficiently conservative. The model was shown to have a clear bias towards (severe) overestimation of dermal exposure at low measured exposure values, while all cases of apparent underestimation by the ECETOC TRA dermal exposure model occurred at high measured exposure values. This can be partly explained by a built-in bias in the effect of concentration of substance in product used, duration of exposure and the use of protective gloves in the model. The effect of protective gloves was calculated to be on average a
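    The comparison logic can be illustrated with a short sketch; the exposure cases and model estimates below are fabricated, not the study's data:

```python
# A minimal sketch (fabricated numbers, not the study's data) of the paper's
# comparison logic: per exposure case, compare the model estimate against the
# 75th percentile of measured values and flag non-conservative cases.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical exposure cases: arrays of measured dermal exposure values
cases = [rng.lognormal(mean=mu, sigma=0.8, size=12) for mu in (-3.0, -1.5, 0.0)]
model_estimates = [0.12, 0.35, 0.90]   # invented TRA-style model outputs

for est, measured in zip(model_estimates, cases):
    p75 = np.percentile(measured, 75)  # the value the model intends to bound
    verdict = "conservative" if est >= p75 else "possibly not conservative"
    print(f"model={est:.3f}  measured P75={p75:.3f}  -> {verdict}")
```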

  20. BUSINESS PROCESS MODELLING FOR PROJECTS COSTS MANAGEMENT IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    PĂTRAŞCU AURELIA

    2014-05-01

    Full Text Available Using information technologies in organizations represents evident progress for a company, saving money and time and generating value for the organization. In this paper the author proposes to model the business processes for an organization that manages project costs, because modelling is an important part of any software development process. Using software for project cost management is essential because it allows all operations to be managed according to established parameters, as well as the management of project groups and of projects and subprojects at different complexity levels.

  1. Radiative transfer model for contaminated slabs : experimental validations

    CERN Document Server

    Andrieu, François; Schmitt, Bernard; Douté, Sylvain; Brissaud, Olivier

    2015-01-01

    This article presents a set of spectro-goniometric measurements of different water ice samples and the comparison with an approximated radiative transfer model. The experiments were done using the spectro-radiogoniometer described in Brissaud et al. (2004). The radiative transfer model assumes an isotropization of the flux after the second interface and is fully described in Andrieu et al. (2015). Two kinds of experiments were conducted. First, the specular spot was closely investigated, at high angular resolution, at the wavelength of 1.5 μm, where ice behaves as a very absorbing medium. Second, the bidirectional reflectance was sampled at various geometries, including low phase angles, on 61 wavelengths ranging from 0.8 μm to 2.0 μm. In order to validate the model, we made a qualitative test to demonstrate the relative isotropization of the flux. We also conducted quantitative assessments by using a Bayesian inversion method in order to estimate the parameters (e.g. sampl...

  2. Neuroinflammatory targets and treatments for epilepsy validated in experimental models.

    Science.gov (United States)

    Aronica, Eleonora; Bauer, Sebastian; Bozzi, Yuri; Caleo, Matteo; Dingledine, Raymond; Gorter, Jan A; Henshall, David C; Kaufer, Daniela; Koh, Sookyong; Löscher, Wolfgang; Louboutin, Jean-Pierre; Mishto, Michele; Norwood, Braxton A; Palma, Eleonora; Poulter, Michael O; Terrone, Gaetano; Vezzani, Annamaria; Kaminski, Rafal M

    2017-07-01

    A large body of evidence that has accumulated over the past decade strongly supports the role of inflammation in the pathophysiology of human epilepsy. Specific inflammatory molecules and pathways have been identified that influence various pathologic outcomes in different experimental models of epilepsy. Most importantly, the same inflammatory pathways have also been found in surgically resected brain tissue from patients with treatment-resistant epilepsy. New antiseizure therapies may be derived from these novel potential targets. An essential and crucial question is whether targeting these molecules and pathways may result in anti-ictogenesis, antiepileptogenesis, and/or disease-modification effects. Therefore, preclinical testing in models mimicking relevant aspects of epileptogenesis is needed to guide integrated experimental and clinical trial designs. We discuss the most recent preclinical proof-of-concept studies validating a number of therapeutic approaches against inflammatory mechanisms in animal models that could represent novel avenues for drug development in epilepsy. Finally, we suggest future directions to accelerate preclinical to clinical translation of these recent discoveries. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  3. [Analysis of tobacco style features using near-infrared spectroscopy and projection model].

    Science.gov (United States)

    Shu, Ru-Xin; Cai, Jia-Yue; Yang, Zheng-Yu; Yang, Kai; Zhao, Long-Lian; Zhang, Lu-Da; Zhang, Ye-Hui; Li, Ye-Hui

    2014-10-01

    In the present paper, a total of 4,733 flue-cured tobacco samples collected from 2003 to 2012 in 17 provincial origins and 5 ecological areas were tested by near-infrared spectroscopy, including 1,580 cartons of the NONG (Luzhou) flavor, 2,004 cartons of the QING (Fen) flavor and 1,149 cartons of the Intermediate flavor. Using a projection model based on principal components and the Fisher criterion (PPF), projection analysis models of tobacco ecological regions and style characteristics were established. The reasonableness of the style flavor division is illustrated by the model results for the tobacco ecological areas. Using the Euclidean distance between a predicted sample's projection value and the mean projection value of each class in the style characteristics model, a description is given for the prediction samples that quantifies the extent of their style features and identifies their first and second closest categories. Using the dispersion of projected values in the model and a given threshold value, prediction results can be refined into typical NONG, NONG to Intermediate, Intermediate to NONG, typical Intermediate, Intermediate to QING, QING to Intermediate, typical QING, QING to NONG, NONG to QING, or beyond the model range. The model was validated with 35 tobacco samples obtained from the re-drying process in 2012 with different origins and parts. This kind of analysis not only achieves discriminant analysis, but also yields richer feature attribute information, providing guidance for raw tobacco processing and formulation.
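    The nearest-class-mean classification step can be sketched as follows; the PPF method's PCA-plus-Fisher construction is approximated here by sklearn's linear discriminant analysis, and all spectra are synthetic:

```python
# A minimal sketch of the classification step described: project spectra onto
# discriminant axes and assign by Euclidean distance to class-mean projections.
# PPF's PCA+Fisher construction is approximated by sklearn's LDA; data are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Synthetic "NIR spectra" for three style classes
X = np.vstack([rng.normal(loc=mu, scale=0.5, size=(30, 20)) for mu in (0.0, 1.0, 2.0)])
y = np.repeat(["NONG", "Intermediate", "QING"], 30)

lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
Z = lda.transform(X)
means = {c: Z[y == c].mean(axis=0) for c in np.unique(y)}

def classify(z):
    """Return first and second closest class means in the projected space."""
    order = sorted(means, key=lambda c: np.linalg.norm(z - means[c]))
    return order[0], order[1]

first, second = classify(lda.transform(X[:1])[0])
print(f"closest style: {first}, second closest: {second}")
```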

  4. Robust Proactive Project Scheduling Model for the Stochastic Discrete Time/Cost Trade-Off Problem

    Directory of Open Access Journals (Sweden)

    Hongbo Li

    2015-01-01

    Full Text Available We study the project budget version of the stochastic discrete time/cost trade-off problem (SDTCTP-B) from the viewpoint of robustness in scheduling. Given the project budget and a set of activity execution modes, each with uncertain activity time and cost, the objective of the SDTCTP-B is to minimize the expected project makespan by determining each activity’s mode and starting time. By modeling the activity time and cost using interval numbers, we propose a proactive project scheduling model for the SDTCTP-B based on robust optimization theory. Our model can generate robust baseline schedules that enable a freely adjustable level of robustness. We convert our model into its robust counterpart in the form of a mixed-integer programming model. Extensive experiments are performed on a large number of randomly generated networks to validate our model. Moreover, simulation is used to investigate the trade-off between the advantages and the disadvantages of our robust proactive project scheduling model.
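    A brute-force toy version of the robust idea (not the paper's mixed-integer counterpart) can be sketched as follows, with invented interval numbers and a serial network for brevity:

```python
# A toy brute-force sketch of the robust idea, not the paper's MIP counterpart:
# pick one execution mode per activity so that worst-case cost stays within
# budget while worst-case makespan is minimized. Interval values are invented.
from itertools import product

# activity -> list of modes, each mode = ((time_lo, time_hi), (cost_lo, cost_hi))
modes = {
    "A": [((2, 3), (10, 12)), ((1, 2), (18, 20))],
    "B": [((4, 6), (8, 10)),  ((2, 4), (15, 18))],
}
BUDGET = 32  # budget must hold even in the worst cost realization

best = None
for choice in product(*(range(len(m)) for m in modes.values())):
    picked = [m[i] for m, i in zip(modes.values(), choice)]
    worst_cost = sum(c_hi for (_, (_, c_hi)) in picked)
    if worst_cost > BUDGET:
        continue  # infeasible under the worst-case cost
    worst_time = sum(t_hi for ((_, t_hi), _) in picked)  # serial network for brevity
    if best is None or worst_time < best[0]:
        best = (worst_time, choice, worst_cost)

print(f"robust makespan={best[0]}, modes={best[1]}, worst-case cost={best[2]}")
```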

  5. A Computational Model with Experimental Validation for DNA Flow in Microchannels

    Energy Technology Data Exchange (ETDEWEB)

    Nonaka, A; Gulati, S; Trebotich, D; Miller, G H; Muller, S J; Liepmann, D

    2005-02-02

    The authors compare a computational model to experimental data for DNA-laden flow in microchannels. The purpose of this work in progress is to validate a new numerical algorithm for viscoelastic flow using the Oldroyd-B model. The numerical approach is a stable and convergent polymeric stress-splitting scheme for viscoelasticity. They treat the hyperbolic part of the equations of motion with an embedded boundary method for solving hyperbolic conservation laws in irregular domains. They enforce incompressibility and evolve velocity and pressure with a projection method. The experiments are performed using epifluorescent microscopy and digital particle image velocimetry to measure velocity fields and track the conformation of biological macromolecules. They present results comparing velocity fields and the observations of computed fluid stress on molecular conformation in various microchannels.
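    For reference, the Oldroyd-B constitutive law for the polymeric stress, which the stress-splitting scheme advances, can be written as below; sign and transpose conventions for the velocity gradient vary between authors, so this is a sketch rather than the paper's exact formulation:

```latex
% Sketch of the Oldroyd-B constitutive law for the polymeric stress \tau_p,
% with relaxation time \lambda and polymeric viscosity \eta_p (conventions
% for \nabla u vary between authors).
\[
  \tau_p + \lambda\, \overset{\nabla}{\tau_p}
    = \eta_p\left(\nabla u + (\nabla u)^{\mathsf{T}}\right),
  \qquad
  \overset{\nabla}{\tau_p}
    = \frac{\partial \tau_p}{\partial t} + (u\cdot\nabla)\tau_p
      - (\nabla u)\,\tau_p - \tau_p\,(\nabla u)^{\mathsf{T}},
\]
% where the upper-convected derivative couples the stress evolution to the
% flow; the momentum equation is advanced separately with a projection step.
```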

  6. A six-factor model of brand personality and its predictive validity

    Directory of Open Access Journals (Sweden)

    Živanović Marko

    2017-01-01

    Full Text Available The study examines applicability and usefulness of HEXACO-based model in the description of brand personality. Following contemporary theoretical developments in human personality research, Study 1 explored the latent personality structure of 120 brands using descriptors of six personality traits as defined in HEXACO model: Honesty-Humility, Emotionality, Extraversion, Agreeableness, Conscientiousness, and Openness. The results of exploratory factor analyses have supported HEXACO personality six-factor structure to a large extent. In Study 2 we addressed the question of predictive validity of HEXACO-based brand personality. Brand personality traits, but predominantly Honesty-Humility, accounted for substantial amount of variance in prediction of important aspects of consumer-brand relationship: attitude toward brand, perceived quality of a brand, and brand loyalty. The implications of applying HEXACO-based brand personality in marketing research are discussed. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. 179018 and Grant no. 175012

  7. Unit testing, model validation, and biological simulation [version 1; referees: 2 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Gopal P. Sarma

    2016-08-01

    Full Text Available The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software that expresses scientific models.
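    The distinction between an ordinary unit test and a model validation test can be sketched as follows; the toy model, tolerance range, and values are hypothetical, not OpenWorm code:

```python
# A minimal sketch of the distinction the paper draws: a unit test checks the
# code's arithmetic, while a model validation test checks a scientific property
# of the model's output. Model and ranges are hypothetical; run with pytest.
import math

def membrane_time_constant(r_m_ohm_cm2: float, c_m_uf_cm2: float) -> float:
    """Toy model: tau = R_m * C_m, returned in milliseconds."""
    return r_m_ohm_cm2 * (c_m_uf_cm2 * 1e-6) * 1e3

def test_unit_tau_formula():
    # Unit test: exact arithmetic of the implementation.
    assert math.isclose(membrane_time_constant(10_000, 1.0), 10.0)

def test_validation_tau_in_physiological_range():
    # Model validation test: output must fall in an empirically plausible
    # range (bounds here are assumptions for illustration).
    tau = membrane_time_constant(30_000, 1.0)
    assert 5.0 <= tau <= 100.0
```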

  8. Long-term durum wheat monoculture: modelling and future projection

    Directory of Open Access Journals (Sweden)

    Ettore Bernardoni

    2012-03-01

    Full Text Available The potential effects of future climate change on grain production of a winter durum wheat cropping system were investigated. Based on future climate change projections, derived from a statistical downscaling process applied to the HadCM3 general circulation model and referred to two IPCC scenarios (A2 and B1), the response of yield and aboveground biomass (AGB) and the variation in total organic carbon (TOC) were explored. The software used in this work is a hybrid dynamic simulation model able to simulate, under different pedoclimatic conditions, the processes involved in the cropping system, such as crop growth and development and the water and nitrogen balances. It implements different approaches in order to ensure accurate simulation of the main processes related to the soil-crop-atmosphere continuum. The model was calibrated using soil data, crop yield, AGB and phenology coming from a long-term experiment located in the Apulia region. The calibration was performed using data collected in the period 1978–1990; validation was carried out on the 1991–2009 data. The phenology simulation was sufficiently accurate, showing some limitation only in predicting physiological maturity. Yields and AGBs were predicted with acceptable accuracy during both calibration and validation. CRM was always close to its optimum value, EF was positive in every case, and the r2 index was good, although in some cases values lower than 0.6 were calculated. The slope of the linear regression equation between measured and simulated values was always close to 1, indicating an overall good performance of the model. Both future climate scenarios led to a general increase in yields but a slight decrease in AGB values. The data showed variations in total production and yield among the different periods due to climate variation. The TOC evolution suggests that the combination of temperature and precipitation is the main factor affecting TOC variation under future scenarios.
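    The goodness-of-fit statistics named in the abstract (CRM, EF, and r2) can be computed as in the sketch below, which follows their common crop-model definitions and uses fabricated yields:

```python
# A minimal sketch of the validation statistics named in the abstract, using
# their common crop-model definitions; observed/simulated yields are fabricated.
import numpy as np

obs = np.array([4.1, 3.6, 5.0, 4.4, 3.9])   # measured yields, t/ha
sim = np.array([4.0, 3.8, 4.7, 4.6, 3.7])   # simulated yields, t/ha

crm = (obs.sum() - sim.sum()) / obs.sum()   # 0 is optimal; >0 means underestimation
ef = 1 - ((obs - sim) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()  # 1 is optimal
r2 = np.corrcoef(obs, sim)[0, 1] ** 2       # squared correlation

print(f"CRM={crm:+.3f}  EF={ef:.3f}  r2={r2:.3f}")
```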

  9. Calibration and validation of a spar-type floating offshore wind turbine model using the FAST dynamic simulation tool

    Science.gov (United States)

    Browning, J. R.; Jonkman, J.; Robertson, A.; Goupee, A. J.

    2014-12-01

    High-quality computer simulations are required when designing floating wind turbines because of the complex dynamic responses that are inherent with a high number of degrees of freedom and variable metocean conditions. In 2007, the FAST wind turbine simulation tool, developed and maintained by the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory (NREL), was expanded to include capabilities suitable for modeling floating offshore wind turbines. In an effort to validate FAST and other offshore wind energy modeling tools, DOE funded the DeepCwind project, which tested three prototype floating wind turbines at 1/50th scale in a wave basin: a semisubmersible, a tension-leg platform, and a spar buoy. This paper describes the use of the results of the spar wave basin tests to calibrate and validate the FAST offshore floating simulation tool, and presents some initial results of simulated dynamic responses of the spar to several combinations of wind and sea states. Wave basin tests with the spar attached to a scale model of the NREL 5-megawatt reference wind turbine were performed at the Maritime Research Institute Netherlands under the DeepCwind project. These included free-decay tests, tests with steady or turbulent wind and still water, wave-only tests (both periodic and irregular waves with no wind), and combined wind/wave tests. The resulting data from the 1/50th-scale model were scaled to full size using Froude scaling and used to calibrate and validate a full-size simulated model in FAST. Results of the model calibration and validation include successes, subtleties, and limitations of both wave basin testing and FAST modeling capabilities.
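    The Froude scaling used to convert wave basin measurements to full scale follows standard similitude relations; the sketch below applies them to invented sample measurements:

```python
# A minimal sketch of standard Froude similitude relations for converting
# 1/50th-scale wave basin measurements to full scale; the sample measurements
# are invented, and the same fluid density is assumed at both scales.
LAMBDA = 50.0  # geometric scale factor (full scale / model scale)

froude = {
    "length": LAMBDA,            # m
    "time": LAMBDA ** 0.5,       # s (and 1/frequency)
    "velocity": LAMBDA ** 0.5,   # m/s
    "force": LAMBDA ** 3,        # N
    "moment": LAMBDA ** 4,       # N*m
}

# e.g. a 0.10 m model surge amplitude and a 2.0 s model wave period:
print(f"full-scale surge:  {0.10 * froude['length']:.1f} m")
print(f"full-scale period: {2.0 * froude['time']:.1f} s")
```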

  10. A dashboard for measuring capability when designing, implementing and validating business continuity and disaster recovery projects.

    Science.gov (United States)

    Sheth, Sandesh; McHugh, Joseph; Jones, Freyae

    2008-04-01

    This paper proposes an approach for designing a business continuity management system (BCMS) dashboard constructed using resiliency capability levels. The dashboard ensures the ability to track and baseline the present capability level and to focus on the activities that would help leapfrog into the higher capability levels; it also provides guidance for governance. The model is based on the building blocks of a comprehensive BCMS and SMART maturity levels (i.e. specific, measurable, achievable, realistic and time-bound). In terms of principles of maturity, the dashboard draws its inspiration from the Software Engineering Institute's capability maturity model. However, the commonalities end there, as the components of a BCMS are different from those of software development. This customer-centric paper addresses the needs of professionals who are entrusted with developing, implementing and validating business continuity and disaster recovery plans. The paper does not address the technologies used for backing up and recovering systems, platforms, databases and applications, or the contents of a business impact analysis survey, risk assessment, vital records plan, emergency response procedures, disaster recovery plans, etc.

  11. Validation and comparison of aerodynamic modelling approaches for wind turbines

    Science.gov (United States)

    Blondel, F.; Boisard, R.; Milekovic, M.; Ferrer, G.; Lienard, C.; Teixeira, D.

    2016-09-01

    The development of large-capacity Floating Offshore Wind Turbines (FOWT) is an interdisciplinary challenge for the design solvers, requiring accurate modelling of hydrodynamics, elasticity, servodynamics and aerodynamics all together. Floating platforms will induce low-frequency unsteadiness, and for large-capacity turbines, blade-induced vibrations will lead to high-frequency unsteadiness. While yawed inflow conditions are still a challenge for commonly used aerodynamic methods such as the Blade Element Momentum method (BEM), the new sources of unsteadiness introduced by large turbine scales and floater motions have to be tackled accurately, keeping the computational cost small enough to be compatible with design and certification purposes. In light of this, this paper focuses on the comparison of three aerodynamic solvers based on BEM and vortex methods, under standard, yawed and unsteady inflow conditions. We focus here on up-to-date wind tunnel experiments, such as the Unsteady Aerodynamics Experiment (UAE) database and the MexNext international project.

  12. Food for thought: Overconfidence in model projections

    DEFF Research Database (Denmark)

    Brander, Keith; Neuheimer, Anna; Andersen, Ken Haste

    2013-01-01

    There is considerable public and political interest in the state of marine ecosystems and fisheries, but the reliability of some recent projections has been called into question. New information about declining fish stocks, loss of biodiversity, climate impacts, and management failure is frequently...

  13. Rapid Energy Modeling Workflow Demonstration Project

    Science.gov (United States)

    2014-01-01

    Project contacts: John Sullivan, Autodesk, Inc., 111 McInnis Parkway, San Rafael, CA 94903; Mark Frost, Autodesk, Inc., 111 McInnis Parkway, San Rafael, CA 94903 (Collaborator).

  14. Validation model for Raman based skin carotenoid detection.

    Science.gov (United States)

    Ermakov, Igor V; Gellermann, Werner

    2010-12-01

    measurement, and which can be easily removed for subsequent biochemical measurements. Excellent correlation (coefficient R=0.95) is obtained for this tissue site, which could serve as a model site for scaled-up future validation studies of large populations. The obtained results provide proof that resonance Raman spectroscopy is a valid non-invasive objective methodology for the quantitative assessment of carotenoid antioxidants in human skin in vivo.

  15. An inventory control project in a major Danish company using compound renewal demand models

    DEFF Research Database (Denmark)

    Larsen, Christian; Seiding, Claus Hoe; Teller, Christian

    operation is highly automated. However, the procedures for estimating demands and the policies for the inventory control system that were in use at the beginning of the project did not fully match the sophisticated technological standard of the physical system. During the initial phase of the project...... inventory control variables based on the fitted demand distributions and a service level requirement stated in terms of an order fill rate. Finally, we validated the results of our models against the procedures that had been in use in the company. It was concluded that the new procedures were considerably...

  16. Validation of conducting wall models using magnetic measurements

    Science.gov (United States)

    Hanson, J. M.; Bialek, J.; Turco, F.; King, J.; Navratil, G. A.; Strait, E. J.; Turnbull, A.

    2016-10-01

    The impact of conducting wall eddy currents on perturbed magnetic field measurements is a key issue for understanding the measurement and control of long-wavelength MHD stability in tokamak devices. As plasma response models have grown in sophistication, the need to understand and resolve small changes in these measurements has become more important, motivating increased fidelity in simulations of externally applied fields and the wall eddy current response. In this manuscript, we describe thorough validation studies of the wall models in the mars-f and valen stability codes, using coil-sensor vacuum coupling measurements from the DIII-D tokamak (Luxon et al 2005 Fusion Sci. Technol. 48 807). The valen formulation treats conducting structures with arbitrary three-dimensional geometries, while mars-f uses an axisymmetric wall model and a spectral decomposition of the problem geometry with a fixed toroidal harmonic n. The vacuum coupling measurements have a strong sensitivity to wall eddy currents induced by time-changing coil currents, owing to the close proximities of both the sensors and coils to the wall. Measurements from individual coil and sensor channels are directly compared with valen predictions. It is found that straightforward improvements to the valen model, such as refining the wall mesh and simulating the vertical extent of the DIII-D poloidal field sensors, lead to good agreement with the experimental measurements. In addition, couplings to multi-coil, n = 1 toroidal mode perturbations are calculated from the measurements and compared with predictions from both codes. The toroidal mode comparisons favor the fully three-dimensional simulation approach, likely because this approach naturally treats n > 1 sidebands generated by the coils and wall eddy currents, as well as the n = 1 fundamental.

  17. Structure Modeling and Validation applied to Source Physics Experiments (SPEs)

    Science.gov (United States)

    Larmat, C. S.; Rowe, C. A.; Patton, H. J.

    2012-12-01

    The U.S. Department of Energy's Source Physics Experiments (SPEs) comprise a series of small chemical explosions used to develop a better understanding of seismic energy generation and wave propagation for low-yield explosions. In particular, we anticipate improved understanding of the processes through which shear waves are generated by the explosion source. Three tests, of 100, 1000 and 1000 kg yield respectively, were detonated in the same emplacement hole and recorded on the same networks of ground motion sensors in the granites of Climax Stock at the Nevada National Security Site. We present results for the analysis and modeling of seismic waveforms recorded close-in on five linear geophone lines extending radially from ground zero, having offsets from 100 to 2000 m and station spacing of 100 m. These records exhibit azimuthal variations of P-wave arrival times, and phase velocity, spreading and attenuation properties of high-frequency Rg waves. We construct a 1D seismic body-wave model starting from a refraction analysis of P-waves and adjusting to address time-domain and frequency-domain dispersion measurements of Rg waves between 2 and 9 Hz. We address the shallowest part of the structure using the arrival times recorded by near-field accelerometers residing within 200 m of the shot hole. We additionally perform a 2D modeling study with the Spectral Element Method (SEM) to investigate which structural features are most responsible for the observed variations, in particular anomalously weak amplitude decay in some directions of this topographically complicated locality. We find that a near-surface, thin, weathered layer of varying thickness and low wave speeds plays a major role in the observed waveforms. We anticipate performing full 3D modeling of the seismic near-field through analysis and validation of waveforms on the 5 radial receiver arrays.

  18. Multilevel modelling of mechanical properties of textile composites: ITOOL Project

    NARCIS (Netherlands)

    Van Den Broucke, Bjorn; Drechsler, Klaus; Hanisch, Vera; Hartung, Daniel; Ivanov, Dimitry S.; Koissin, Vitaly E.; Lomov, Stepan V.; Middendorf, Peter

    2007-01-01

    The paper presents an overview of the multi-level modelling of textile composites in the ITOOL project, focusing on the models of textile reinforcements, which serve as a basis for micromechanical models of textile composites on the unit cell level. The modelling is performed using finite element an

  19. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  20. Parental modelling of eating behaviours: observational validation of the Parental Modelling of Eating Behaviours scale (PARM).

    Science.gov (United States)

    Palfreyman, Zoe; Haycraft, Emma; Meyer, Caroline

    2015-03-01

    Parents are important role models for their children's eating behaviours. This study aimed to further validate the recently developed Parental Modelling of Eating Behaviours Scale (PARM) by examining the relationships between maternal self-reports on the PARM with the modelling practices exhibited by these mothers during three family mealtime observations. Relationships between observed maternal modelling and maternal reports of children's eating behaviours were also explored. Seventeen mothers with children aged between 2 and 6 years were video recorded at home on three separate occasions whilst eating a meal with their child. Mothers also completed the PARM, the Children's Eating Behaviour Questionnaire and provided demographic information about themselves and their child. Findings provided validation for all three PARM subscales, which were positively associated with their observed counterparts on the observational coding scheme (PARM-O). The results also indicate that habituation to observations did not change the feeding behaviours displayed by mothers. In addition, observed maternal modelling was significantly related to children's food responsiveness (i.e., their interest in and desire for foods), enjoyment of food, and food fussiness. This study makes three important contributions to the literature. It provides construct validation for the PARM measure and provides further observational support for maternal modelling being related to lower levels of food fussiness and higher levels of food enjoyment in their children. These findings also suggest that maternal feeding behaviours remain consistent across repeated observations of family mealtimes, providing validation for previous research which has used single observations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Application and Validation of a GIS Model for Local Tsunami Vulnerability and Mortality Risk Analysis

    Science.gov (United States)

    Harbitz, C. B.; Frauenfelder, R.; Kaiser, G.; Glimsdal, S.; Sverdrup-thygeson, K.; Løvholt, F.; Gruenburg, L.; Mc Adoo, B. G.

    2015-12-01

    The 2011 Tōhoku tsunami caused a high number of fatalities and massive destruction. Data collected after the event allow for retrospective analyses. Since 2009, NGI has developed a generic GIS model for local analyses of tsunami vulnerability and mortality risk. The mortality risk convolves the hazard, exposure, and vulnerability. The hazard is represented by the maximum tsunami flow depth (with a corresponding likelihood), the exposure is described by the population density in time and space, while the vulnerability is expressed by the probability of being killed as a function of flow depth and building class. The analysis is further based on high-resolution DEMs. Normally a certain tsunami scenario with a corresponding return period is applied for vulnerability and mortality risk analysis. Hence, the model was first employed for a tsunami forecast scenario affecting Bridgetown, Barbados, and further developed in a forecast study for the city of Batangas in the Philippines. Subsequently, the model was tested by hindcasting the 2009 South Pacific tsunami in American Samoa. This hindcast was based on post-tsunami information. The GIS model was adapted for optimal use of the available data and successfully estimated the degree of mortality. For further validation and development, the model was recently applied in the RAPSODI project for hindcasting the 2011 Tōhoku tsunami in Sendai and Ishinomaki. With reasonable choices of building vulnerability, the estimated expected number of fatalities agrees well with the reported death toll. The results of the mortality hindcast for the 2011 Tōhoku tsunami substantiate that the GIS model can help to identify high tsunami mortality risk areas, as well as identify the main risk drivers. The research leading to these results has received funding from CONCERT-Japan Joint Call on Efficient Energy Storage and Distribution/Resilience against Disasters (http://www.concertjapan.eu; project RAPSODI - Risk Assessment and design of
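    The hazard-exposure-vulnerability convolution can be illustrated with a small grid sketch; the fragility curve and all values are invented, not the NGI model's parameters:

```python
# A minimal grid sketch of the risk convolution described (hazard x exposure x
# vulnerability): expected fatalities per cell from flow depth, a hypothetical
# depth-dependent fatality curve, and population counts. All values invented.
import numpy as np

depth = np.array([[0.0, 0.5, 2.0], [1.0, 3.5, 6.0]])     # max flow depth, m
population = np.array([[120, 80, 40], [200, 150, 10]])   # people per cell

def p_fatality(d: np.ndarray, d50: float = 2.0, k: float = 2.0) -> np.ndarray:
    """Hypothetical logistic fragility curve: P(death | flow depth d)."""
    return 1.0 / (1.0 + np.exp(-k * (d - d50)))

# Dry cells (depth 0) contribute no mortality risk.
expected_deaths = population * p_fatality(depth) * (depth > 0)
print(expected_deaths.round(1))
print(f"total expected fatalities: {expected_deaths.sum():.0f}")
```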

  2. Checklist for the qualitative evaluation of clinical studies with particular focus on external validity and model validity

    Directory of Open Access Journals (Sweden)

    Vollmar Horst C

    2006-12-01

    Full Text Available Abstract Background It is often stated that external validity is not sufficiently considered in the assessment of clinical studies. Although tools for its evaluation have been established, there is a lack of awareness of their significance and application. In this article, a comprehensive checklist is presented addressing these relevant criteria. Methods The checklist was developed by listing the most commonly used assessment criteria for clinical studies. Additionally, specific lists for individual applications were included. The categories of biases of internal validity (selection, performance, attrition and detection bias) correspond to structural, treatment-related and observational differences between the test and control groups. Analogously, we have extended these categories to address external validity and model validity, regarding similarity between the study population/conditions and the general population/conditions related to structure, treatment and observation. Results A checklist is presented, in which the evaluation criteria concerning external validity and model validity are systemised and transformed into a questionnaire format. Conclusion The checklist presented in this article can be applied to both the planning and the evaluation of clinical studies. We encourage the prospective user to modify the checklists according to the respective application and research question. The higher expenditure needed for the evaluation of clinical studies in systematic reviews is justified, particularly in the light of the influential nature of their conclusions on therapeutic decisions and the creation of clinical guidelines.

  3. Prognostic models for locally advanced cervical cancer: external validation of the published models.

    Science.gov (United States)

    Lora, David; Gómez de la Cámara, Agustín; Fernández, Sara Pedraza; Enríquez de Salamanca, Rafael; Gómez, José Fermín Pérez Regadera

    2017-09-01

    To externally validate the prognostic models for predicting time-dependent outcomes in patients with locally advanced cervical cancer (LACC) treated with concurrent chemoradiotherapy in an independent cohort. A historical cohort of 297 women with LACC who were treated with radical concurrent chemoradiotherapy from 1999 to 2014 at the 12 de Octubre University Hospital (H12O), Madrid, Spain. The external validity of the prognostic models was quantified regarding discrimination, calibration, measures of overall performance, and decision curve analyses. The review identified 8 studies containing 13 prognostic models. Cohorts differing from the validation cohort (5-year overall survival [OS]=70%; 5-year disease-free survival [DFS]=64%; average age of 50; and over 79% squamous cell) in International Federation of Gynecology and Obstetrics [FIGO] stages, parametrium involvement, hydronephrosis, location of positive nodes, and race, but otherwise related, were evaluated. The following models exhibited good external validity in terms of discrimination and calibration but limited clinical utility: the OS model at 3 years from Kidd et al.'s study (area under the receiver operating characteristic curve [AUROC]=0.69; threshold of clinical utility [TCU] between 36% and 50%), the models of DFS at 1 year from Kidd et al.'s study (AUROC=0.64; TCU between 24% and 32%) and at 2 years from Rose et al.'s study (AUROC=0.70; TCU between 19% and 58%), and the distant recurrence model at 5 years from Kang et al.'s study (AUROC=0.67; TCU between 12% and 36%). The external validation revealed the statistical and clinical usefulness of 4 prognostic models published in the literature.
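    The two families of validation measures reported, discrimination (AUROC) and clinical utility (decision-curve net benefit), can be sketched as follows with fabricated outcomes and predicted risks:

```python
# A minimal sketch of the two external-validation measures reported: AUROC for
# discrimination and decision-curve net benefit for clinical utility. The
# outcome labels and predicted risks below are fabricated.
import numpy as np
from sklearn.metrics import roc_auc_score

y = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 1])                  # event occurred
risk = np.array([.1, .3, .6, .2, .7, .4, .25, .8, .5, .65])   # model predictions

print(f"AUROC = {roc_auc_score(y, risk):.2f}")

def net_benefit(y_true, p, threshold):
    """Decision-curve net benefit of treating patients with p >= threshold."""
    treat = p >= threshold
    tp = np.sum(treat & (y_true == 1)) / y_true.size
    fp = np.sum(treat & (y_true == 0)) / y_true.size
    return tp - fp * threshold / (1 - threshold)

for t in (0.2, 0.4, 0.6):
    print(f"net benefit @ pt={t:.1f}: {net_benefit(y, risk, t):+.3f}")
```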

  4. Validation of population-based disease simulation models: a review of concepts and methods

    Directory of Open Access Journals (Sweden)

    Sharif Behnam

    2010-11-01

    Full Text Available Abstract Background Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results Evidence of model credibility derives from examining: (1) the process of model development, (2) the performance of a model, and (3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility.

  5. DNDC Model Calibration, Validation and Quantification of Structural Uncertainty to Support Rice Methane Offset Protocols

    Science.gov (United States)

    Salas, W.; Ducey, M. J.; Li, C.

    2014-12-01

    Agriculture represents an important near-term option for GHG offsets. Currently, the most widely accepted low-cost approaches to quantify N2O and CH4 emissions are based on emission factors. Given that N2O and CH4 emissions from agricultural practices exhibit high spatial and temporal variability, emission factors are not sensitive enough to capture this variability in emissions at the farm level, even when the emission factors are regional. It is clear that if agricultural offset projects are going to include N2O and CH4 reductions, then process-based biogeochemical models are potentially important tools to quantify emission reductions within offset protocols. The question remains how good a model's performance is with respect to emission reductions. As process-based models are integrated into protocols for agricultural GHG offsets, comprehensive and systematic validation is needed to statistically quantify uncertainties in model-based estimates of GHG emission reductions, obtained by a standardized approach to parameterization and calibration that can be applied across a whole region. The DNDC model was validated against 88 datasets of rice methane emissions. Data were collected at sites in California and the Mid-South. In addition to examining the magnitude of the measured versus modeled emissions, we analyzed model performance for estimating the changes in emissions associated with a change in management practices (e.g. dry- versus wet-seeded rice, different fertilizer rates, etc.). We analyzed 100 pairs of modeled and measured emission reductions. DNDC model performance and uncertainty were quantified using a suite of statistical measures. First, we examined how well the modeled emission differences match the field-measured differences on a case-by-case basis and also on average, using a combination of Monte Carlo approaches and equivalence testing. Although modeled emissions for individual fields show a slight bias, emission reductions for baseline:treatment pairs fall close
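    Equivalence testing for paired modeled and measured reductions is often done with two one-sided tests (TOST); the sketch below is a generic version with an invented margin and data, not the study's procedure:

```python
# A minimal generic sketch of equivalence testing (TOST) for paired modeled
# vs. measured emission reductions; the margin and data are invented, and this
# is not the study's exact procedure.
import numpy as np
from scipy import stats

modeled = np.array([12.0, 8.5, 15.2, 9.8, 11.1])    # reduction, modeled
measured = np.array([11.2, 9.0, 14.1, 10.5, 10.2])  # reduction, measured
margin = 2.0                                         # +/- equivalence bound

d = modeled - measured
n, mean, se = d.size, d.mean(), d.std(ddof=1) / np.sqrt(d.size)
t_lower = (mean + margin) / se   # test H0: mean difference <= -margin
t_upper = (mean - margin) / se   # test H0: mean difference >= +margin
p = max(1 - stats.t.cdf(t_lower, n - 1), stats.t.cdf(t_upper, n - 1))
print(f"mean difference={mean:.2f}, TOST p={p:.3f} (equivalent if p < 0.05)")
```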

  6. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    Full Text Available In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality and with minimal strain on resources. A key driver in conducting these processes is the automation plant’s control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, the validation of the control software often starts late in the engineering process, in many cases once the automation plant is almost completely constructed. However, as widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects is, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on the example of an actual plant project from the automation industry and present its technical implementation

  7. Development and Validation of a Needs Assessment Model Using Stakeholders Involved in a University Program.

    Science.gov (United States)

    Labrecque, Monique

    1999-01-01

    Developed a needs-assessment model and validated the model with five groups of stakeholders connected with an undergraduate university nursing program in Canada. Used focus groups, questionnaires, a hermeneutic approach, and the magnitude-estimation scaling model to validate the model. Results show that respondents must define need to clarify the…

  8. Building Context with Tumor Growth Modeling Projects in Differential Equations

    Science.gov (United States)

    Beier, Julie C.; Gevertz, Jana L.; Howard, Keith E.

    2015-01-01

    The use of modeling projects serves to integrate, reinforce, and extend student knowledge. Here we present two projects related to tumor growth appropriate for a first course in differential equations. They illustrate the use of problem-based learning to reinforce and extend course content via a writing or research experience. Here we discuss…
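    The abstract does not name specific growth laws, but the logistic equation dV/dt = rV(1 - V/K) is a standard first tumor-growth model in an introductory differential equations course. The Python sketch below, with invented parameter values, shows how students might integrate it numerically.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative only: assumed growth rate r (1/day) and
        # carrying capacity K (mm^3) for a logistic tumor model.
        r, K = 0.3, 1000.0

        def logistic(t, V):
            return r * V * (1.0 - V / K)

        sol = solve_ivp(logistic, t_span=(0.0, 60.0), y0=[10.0],
                        t_eval=np.linspace(0.0, 60.0, 7))
        for t, V in zip(sol.t, sol.y[0]):
            print(f"day {t:4.0f}: volume {V:7.1f} mm^3")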

  9. Improving Project Management Using Formal Models and Architectures

    Science.gov (United States)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages that formal modeling and architecture bring to project management. These emerging technologies offer both great potential and challenges for improving the information available for decision-making. The presentation covers standards, tools, and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  10. Hypothesis-driven and field-validated method to prioritize fragmentation mitigation efforts in road projects.

    Science.gov (United States)

    Vanthomme, Hadrien; Kolowski, Joseph; Nzamba, Brave S; Alonso, Alfonso

    2015-10-01

    The active field of connectivity conservation has provided numerous methods to identify wildlife corridors with the aim of reducing the ecological effects of fragmentation. Nevertheless, these methods often rely on untested hypotheses of animal movement, usually fail to generate fine-scale predictions of road-crossing sites, and do not allow managers to prioritize crossing sites when implementing road fragmentation mitigation measures. We propose a new method that addresses these limitations and illustrate it with data from southwestern Gabon (central Africa). We used stratified random transect surveys conducted in two seasons to model the distribution of African forest elephant (Loxodonta cyclotis), forest buffalo (Syncerus caffer nanus), and sitatunga (Tragelaphus spekii) in a mosaic landscape along a 38.5 km unpaved road scheduled for paving. Using a validation data set of recorded crossing locations, we evaluated the performance of three types of models (local suitability, local least-cost movement, and regional least-cost movement) in predicting actual road crossings for each species, and we developed a unique and flexible scoring method for prioritizing road sections for the implementation of road fragmentation mitigation measures. The method was able to identify seasonal changes in animal movements for buffalo and sitatunga, which shift from local exploitation of the site in the wet season to movements through the study site in the dry season, whereas elephants use the entire study area in both seasons. These differences among the three species highlight the need for species- and season-specific modeling of movement. From these movement models, the method ranked road sections by their suitability for implementing fragmentation mitigation efforts, allowing managers to adjust priority thresholds based on budgets and management goals. The method relies on data that can be obtained in a period compatible with environmental impact assessment…
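    The paper's scoring method is its own; as a generic illustration of the ranking idea only, the Python sketch below combines normalized per-section predictions from the three model types into a weighted score. All weights, values, and names are invented.

        import numpy as np

        def rank_sections(predictions, weights):
            """predictions: dict model_name -> per-section scores (higher =
            more predicted crossings). Each array is min-max normalized
            before weighting so the models are comparable."""
            total = np.zeros_like(next(iter(predictions.values())), dtype=float)
            for name, w in weights.items():
                p = np.asarray(predictions[name], dtype=float)
                p = (p - p.min()) / max(p.max() - p.min(), 1e-12)  # to [0, 1]
                total += w * p
            return np.argsort(total)[::-1], total  # best sections first

        # Hypothetical predictions for 5 road sections from the three
        # model types; a manager can cut the ranked list at any
        # budget-driven threshold.
        preds = {
            "local_suitability":   [0.2, 0.9, 0.4, 0.7, 0.1],
            "local_least_cost":    [0.3, 0.8, 0.2, 0.9, 0.2],
            "regional_least_cost": [0.1, 0.7, 0.3, 0.8, 0.4],
        }
        order, scores = rank_sections(preds, weights={
            "local_suitability": 0.3, "local_least_cost": 0.4,
            "regional_least_cost": 0.3})
        print("priority order of road sections:", order)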

  11. Performance of Landslide-HySEA tsunami model for NTHMP benchmarking validation process

    Science.gov (United States)

    Macias, Jorge

    2017-04-01

    In its FY2009 Strategic Plan, the NTHMP required that all numerical tsunami inundation models be verified as accurate and consistent through a model benchmarking process. This was completed in 2011, but only for seismic tsunami sources and in a limited manner for idealized solid underwater landslides. Recent work by various NTHMP states, however, has shown that landslide tsunami hazard may be dominant along significant parts of the US coastline, as compared to hazards from other tsunamigenic sources. To perform the above-mentioned validation process, a set of candidate benchmarks was proposed. These benchmarks are based on a subset of available laboratory data sets for solid slide experiments and deformable slide experiments, and include both submarine and subaerial slides. A benchmark based on a historic field event (Valdez, AK, 1964) closes the list of proposed benchmarks. The Landslide-HySEA model participated in the workshop organized at Texas A&M University - Galveston, on January 9-11, 2017. The aim of this presentation is to show some of the numerical results obtained by Landslide-HySEA in the framework of this benchmarking validation/verification effort. Acknowledgements. This research has been partially supported by the Junta de Andalucía research project TESELA (P11-RNM7069), the Spanish Government research project SIMURISK (MTM2015-70490-C02-01-R) and Universidad de Málaga, Campus de Excelencia Internacional Andalucía Tech. The GPU computations were performed at the Unit of Numerical Methods (University of Malaga).

  12. Validation of transport models using additive flux minimization technique

    Energy Technology Data Exchange (ETDEWEB)

    Pankin, A. Y.; Kruger, S. E. [Tech-X Corporation, 5621 Arapahoe Ave., Boulder, Colorado 80303 (United States); Groebner, R. J. [General Atomics, San Diego, California 92121 (United States); Hakim, A. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States); Kritz, A. H.; Rafiq, T. [Department of Physics, Lehigh University, Bethlehem, Pennsylvania 18015 (United States)

    2013-10-15

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time-dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profiles. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.
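    To make the idea concrete, the Python sketch below varies an additive diffusivity in a toy one-dimensional diffusion solve and minimizes the mismatch to a synthetic "experimental" profile. It merely stands in for the paper's FACETS::Core/DAKOTA workflow; every number, profile, and boundary condition is illustrative.

        import numpy as np
        from scipy.optimize import minimize_scalar

        x = np.linspace(0.0, 1.0, 26)
        dx = x[1] - x[0]
        source = np.exp(-((x - 0.3) / 0.1) ** 2)   # assumed particle source
        n_exp = 1.0 - 0.8 * x ** 2                 # assumed measured profile

        def steady_profile(D_total, steps=10000, dt=2e-4):
            """Relax a 1D density profile to steady state by explicit
            time stepping with total diffusivity D_total."""
            n = np.ones_like(x)
            for _ in range(steps):
                lap = np.zeros_like(n)
                lap[1:-1] = (n[2:] - 2.0 * n[1:-1] + n[:-2]) / dx ** 2
                n += dt * (D_total * lap + source)
                n[0], n[-1] = n[1], 0.2            # no-flux core, fixed edge
            return n

        def mismatch(D_add, D_model=0.5):
            """Squared error between predicted and 'measured' profiles
            for a given additive diffusivity on top of the model's."""
            return np.sum((steady_profile(D_model + D_add) - n_exp) ** 2)

        res = minimize_scalar(mismatch, bounds=(0.0, 2.0), method="bounded")
        print(f"optimal additive diffusivity: {res.x:.3f}")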

  13. Validating global hydrological models by ground and space gravimetry

    Institute of Scientific and Technical Information of China (English)

    ZHOU JiangCun; SUN HePing; XU JianQiao

    2009-01-01

    The long-term continuous gravity observations obtained by superconducting gravimeters (SG) at seven globally distributed stations are comprehensively analyzed. After removing the signals related to the Earth's tides and variations in the Earth's rotation, the gravity residuals are used to describe the seasonal fluctuations in the gravity field. Meanwhile, the gravity changes due to air pressure loading are theoretically modeled from measurements of the local air pressure, and those due to land water and nontidal ocean loading are calculated according to the corresponding numerical models. The numerical results show that the gravity changes due to both air pressure and land water loading are as large as 100×10⁻⁹ m s⁻² in magnitude, while those due to nontidal ocean loading in coastal areas are about 10×10⁻⁹ m s⁻². On the other hand, the monthly averaged gravity variations over the areas surrounding the stations are derived from the spherical harmonic coefficients of the GRACE-recovered gravity fields, using a Gaussian smoothing technique with the radius set to 600 km. Comparing the land-water-induced gravity variations, the SG observations (after removal of tides, polar motion effects, and air pressure and nontidal ocean loading effects), and the GRACE-derived gravity variations with one another, it is inferred that both ground- and space-based gravity observations can effectively detect the seasonal gravity variations, with a magnitude of 100×10⁻⁹ m s⁻², induced by land water loading. This implies that high-precision gravimetry is an effective technique for validating the reliability of hydrological models.
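    Gaussian smoothing of GRACE spherical harmonic coefficients is commonly implemented with Jekeli's per-degree weight recursion; the Python sketch below computes those weights. The 600 km radius matches the abstract, but the truncation degree and the recursion variant are assumptions (the forward recursion can lose accuracy at high degrees and small radii).

        import numpy as np

        def gaussian_weights(nmax, radius_km, earth_radius_km=6371.0):
            """Per-degree weights W_n for a Gaussian smoothing cap
            (Jekeli-style recursion)."""
            b = np.log(2.0) / (1.0 - np.cos(radius_km / earth_radius_km))
            W = np.zeros(nmax + 1)
            W[0] = 1.0
            W[1] = (1.0 + np.exp(-2.0 * b)) / (1.0 - np.exp(-2.0 * b)) - 1.0 / b
            for n in range(1, nmax):
                W[n + 1] = -(2.0 * n + 1.0) / b * W[n] + W[n - 1]
            return W

        W = gaussian_weights(nmax=60, radius_km=600.0)
        # Smoothing then simply scales each degree-n coefficient pair
        # (C_nm, S_nm) by W[n] before synthesizing the gravity field.
        print(W[:6])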

  14. Active control strategy on a catenary-pantograph validated model

    Science.gov (United States)

    Sanchez-Rebollo, C.; Jimenez-Octavio, J. R.; Carnicero, A.

    2013-04-01

    Dynamic simulation methods have become essential in the design process and control of the catenary-pantograph system, especially as high-speed trains and interoperability criteria become increasingly important. This paper presents an original hardware-in-the-loop (HIL) strategy aimed at integrating a multicriteria active control within the catenary-pantograph dynamic interaction. The relevance of HIL control systems applied to the pantograph is undoubtedly increasing due to the recent and more demanding requirements for high-speed railway systems. Since loss of contact between the catenary and the pantograph leads to arcing and electrical wear, and excessive contact forces cause mechanical wear of both the catenary wires and the strips of the pantograph, not only prescribed but also economic and performance criteria justify this relevance. Different configurations of the proportional-integral-derivative (PID) controller are proposed and applied to two different plant systems. Since this paper is mainly focused on the control strategy, both plant systems are simulation models, though the methodology is suitable for a laboratory bench. The control strategy involves a multicriteria optimisation of the contact force and of the energy supplied by the control force; a genetic algorithm has been applied for this purpose. Thus, the PID controller is fitted according to these conflicting objectives and tested within a nonlinear lumped model and a nonlinear finite element model, the latter validated against the European Standard EN 50318. Finally, certain tests have been carried out to analyse the robustness of the control strategy. In particular, the relevance of the plant simulation, the running speed and the instrumentation time delay are studied in this paper.
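    As a hedged sketch of the multicriteria fitting, the Python snippet below scores PID gains on a crude first-order contact-force plant using a weighted sum of force fluctuation and control energy, with a plain random search standing in for the paper's genetic algorithm. The plant, disturbance, weights, and gain ranges are all invented.

        import numpy as np

        def cost(gains, T=10.0, dt=0.01, tau=0.2, f_ref=120.0):
            """Score one PID gain set on a first-order contact-force plant
            with a span-periodic disturbance; lower is better."""
            kp, ki, kd = gains
            f, integ, prev_err = 0.0, 0.0, 0.0
            J_force, J_energy = 0.0, 0.0
            for i in range(int(T / dt)):
                d = 30.0 * np.sin(2.0 * np.pi * i * dt)  # stiffness-variation proxy
                err = f_ref - f
                integ += err * dt
                u = kp * err + ki * integ + kd * (err - prev_err) / dt
                prev_err = err
                f += dt / tau * (u + d - f)              # first-order plant response
                J_force += (f - f_ref) ** 2 * dt         # contact-force criterion
                J_energy += u ** 2 * dt                  # control-energy criterion
            return J_force + 1e-4 * J_energy             # weighted multicriteria cost

        # Plain random search stands in for the genetic algorithm.
        rng = np.random.default_rng(1)
        candidates = np.column_stack([rng.uniform(0.0, 3.0, 50),   # kp
                                      rng.uniform(0.0, 3.0, 50),   # ki
                                      rng.uniform(0.0, 0.1, 50)])  # kd
        best = min(candidates, key=cost)
        print("best gains (kp, ki, kd):", best)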

  15. Validity of thermal comfort models; Gueltigkeit thermischer Behaglichkeitsmodelle

    Energy Technology Data Exchange (ETDEWEB)

    Hellwig, Runa Tabea [Fraunhofer-Institut fuer Bauphysik (IBP) Stuttgart (Germany); Bischof, Wolfgang [Institut fuer Arbeits-, Sozial-, Umweltmedizin und Hygiene des Klinikums der Friedrich-Schiller-Universitaet Jena, Bachstr. 18, 07743 Jena (Germany)

    2006-04-15

    The lack of regulations for room climate has caused uncertainty in the planning of freely ventilated office buildings in Germany. An interdisciplinary research study was carried out using user surveys as well as measured data in order to determine whether there are differences in thermal comfort between freely ventilated and mechanically ventilated buildings. The data come from 14 office buildings of the ProKlimA project. The relevant literature describes four main methods for assessing and predicting thermal comfort: on the one hand, the PMV model by Fanger and a modification of this model by Mayer; on the other hand, a Netherlands guideline and an approach presented in an ASHRAE study. In contrast to the PMV model, the latter two approaches define the optimal room temperature as a function of an averaged ambient temperature. The four methods were investigated with regard to their applicability for assessing the thermal comfort of buildings with free and mechanical ventilation. In the case of mechanical ventilation, the best results were achieved using Mayer's method; in the case of free ventilation, with the ASHRAE method. The results show that in both cases new models for thermal comfort planning are required that are not yet included in German standards. (orig.)
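    A widely cited example of an approach that sets the comfort temperature as a linear function of an averaged outdoor temperature is the adaptive model from the ASHRAE RP-884 study; the exact coefficients used in this study may differ. A minimal Python sketch:

        def adaptive_comfort_temp(t_out_mean):
            """Adaptive comfort temperature (deg C) as a linear function of
            the prevailing mean outdoor temperature, after the ASHRAE
            adaptive model; the 80%-acceptability band is roughly +/- 3.5 K."""
            return 0.31 * t_out_mean + 17.8

        for t_out in (10.0, 20.0, 30.0):
            print(f"outdoor mean {t_out:4.1f} C -> "
                  f"comfort {adaptive_comfort_temp(t_out):4.1f} C")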

  16. Validation of Superelement Modelling of Complex Offshore Support Structures

    DEFF Research Database (Denmark)

    Wang, Shaofeng; Larsen, Torben J.; Hansen, Anders Melchior

    2016-01-01

    Modern large MW wind turbines are today installed at larger water depths than applicable for the traditional monopile substructure. It appears that foundation types such as jacket and tripod are gaining more popularity for these locations. For certification purposes, a full set of design load calculations consisting of up to thousands of design load cases needs to be evaluated. However, even the simplest aero-elastic model of such structures has many more DOFs than a monopile, resulting in an excessive computational burden. In order to deal with this problem, the superelement method has been introduced for modelling such structures. One superelement method has proven very promising in the previous Wave Loads project [1], and a fundamental question in such DOF-reduction methods is which modes are essential and which modes can be neglected. For the jacket structure, the introduction of a gravity…
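    The abstract does not state which superelement formulation was used; the classic fixed-interface method is Craig-Bampton reduction, sketched below in Python with numpy/scipy. The matrix partitioning is standard, but the toy spring-chain example and all sizes are illustrative only.

        import numpy as np
        from scipy.linalg import eigh

        def craig_bampton(K, M, b_idx, n_modes):
            """Reduce (K, M) to boundary DOFs plus n_modes fixed-interface
            modal DOFs (Craig-Bampton superelement)."""
            n = K.shape[0]
            i_idx = np.setdiff1d(np.arange(n), b_idx)
            Kii = K[np.ix_(i_idx, i_idx)]
            Kib = K[np.ix_(i_idx, b_idx)]
            Mii = M[np.ix_(i_idx, i_idx)]
            # Constraint modes: static interior response to unit boundary
            # displacements.
            Psi = -np.linalg.solve(Kii, Kib)
            # Fixed-interface normal modes (boundary clamped); keep n_modes.
            w2, Phi = eigh(Kii, Mii)
            Phi = Phi[:, :n_modes]
            # Transformation T: full DOFs <- [boundary, modal] coordinates.
            nb = len(b_idx)
            T = np.zeros((n, nb + n_modes))
            T[b_idx, :nb] = np.eye(nb)
            T[np.ix_(i_idx, range(nb))] = Psi
            T[np.ix_(i_idx, range(nb, nb + n_modes))] = Phi
            return T.T @ K @ T, T.T @ M @ T  # reduced stiffness and mass

        # Toy 5-DOF spring chain with boundary DOFs at the two ends:
        K = np.diag([2.0] * 5) - np.diag([1.0] * 4, 1) - np.diag([1.0] * 4, -1)
        M = np.eye(5)
        Kr, Mr = craig_bampton(K, M, b_idx=np.array([0, 4]), n_modes=2)
        print(Kr.shape)  # (4, 4): 2 boundary DOFs + 2 modal DOFs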

  17. Polarisers in the focal domain: Theoretical model and experimental validation

    Science.gov (United States)

    Martínez-Herrero, Rosario; Maluenda, David; Juvells, Ignasi; Carnicer, Artur

    2017-02-01

    Polarisers are one of the most widely used devices in optical set-ups. They are commonly used with paraxial beams that propagate in the normal direction of the polariser plane. Nevertheless, the conventional projection character of these devices may change when the beam impinges on the polariser at a certain angle of incidence. This effect is more noticeable if polarisers are used in optical systems with a high numerical aperture, because multiple angles of incidence have to be taken into account. Moreover, the non-transverse character of highly focused beams makes the problem more complex and, strictly speaking, Malus's law does not apply. In this paper we develop a theoretical framework to explain how ideal polarisers affect the behaviour of highly focused fields. In this model, the polarisers are considered as birefringent plates, and the vector behaviour of focused fields is described using the plane-wave angular spectrum approach. Experiments involving focused fields were conducted to verify the theoretical model, and a satisfactory agreement between theoretical and experimental results was found.
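    For reference, the paraxial baseline that the paper shows breaking down is Malus's law, I = I0 cos²θ. The trivial Python snippet below evaluates it for a normally incident plane wave; it is exactly this projection rule that no longer holds component-by-component for the high-NA, non-transverse fields treated in the paper.

        import numpy as np

        # Paraxial baseline only: Malus's law for a normally incident
        # plane wave; each angular-spectrum component of a focused field
        # meets the polariser at its own angle, so this rule fails there.
        theta = np.deg2rad(np.array([0.0, 30.0, 45.0, 60.0, 90.0]))
        I0 = 1.0
        print(I0 * np.cos(theta) ** 2)  # [1.0, 0.75, 0.5, 0.25, 0.0]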

  18. Projections of annual rainfall and surface temperature from CMIP5 models over the BIMSTEC countries

    Science.gov (United States)

    Pattnayak, K. C.; Kar, S. C.; Dalal, Mamta; Pattnayak, R. K.

    2017-05-01

    The Bay of Bengal Initiative for Multi-Sectoral Technical and Economic Cooperation (BIMSTEC), comprising Bangladesh, Bhutan, India, Myanmar, Nepal, Sri Lanka and Thailand, brings together 21% of the world's population. The impact of climate change in this region is thus a major concern for all. To study climate change, models from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) have been used to project the climate of the 21st century under Representative Concentration Pathways (RCPs) 4.5 and 8.5 over the BIMSTEC countries for the period 1901 to 2100 (the initial 105 years form the historical period and the later 95 years the projected period). Climate change in the projected period has been examined with respect to the historical period. In order to validate the models, the mean annual rainfall has been compared with observations from multiple sources, and temperature with data from the Climatic Research Unit (CRU), during the historical period. The comparison reveals that the ensemble mean of the models represents the observed spatial distribution of rainfall and temperature over the BIMSTEC countries well; data from these models may therefore be used to study future changes in the 21st century. Four out of six models show a decreasing rainfall trend over India, Thailand and Myanmar, and an increasing trend over Bangladesh, Bhutan, Nepal and Sri Lanka, in both RCP scenarios. In the case of temperature, all models show an increasing trend over all the BIMSTEC countries in both scenarios; however, the rate of increase is relatively smaller over Sri Lanka than over the other countries. The rates of change in rainfall and temperature are relatively larger under RCP8.5 than under RCP4.5 over all these countries. Inter-model comparison shows that there are uncertainties within the CMIP5 model projections, and further studies are needed to better understand these uncertainties in climate projections over this region.
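    A hedged sketch of the kind of trend analysis implied here: fit a linear trend to each model's annual rainfall series over the scenario period and count sign agreement across models. The series below are synthetic stand-ins, not CMIP5 output, and the model names and trend values are invented.

        import numpy as np

        years = np.arange(2006, 2101)
        rng = np.random.default_rng(42)
        # Six synthetic "models" with assumed trends (mm/year) plus noise.
        models = {f"model_{i}": 1200.0 + trend * (years - 2006) +
                  rng.normal(0.0, 30.0, years.size)
                  for i, trend in enumerate([-0.8, -0.5, 0.4, -1.1, 0.2, -0.6])}

        # Slope of a degree-1 least-squares fit = linear trend per model.
        trends = {name: np.polyfit(years, series, 1)[0]
                  for name, series in models.items()}
        decreasing = sum(slope < 0 for slope in trends.values())
        print(f"{decreasing} of {len(trends)} models show a decreasing trend")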

  19. QMU in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ACTA and Sandia National Laboratories propose to quantify and propagate substructure modeling uncertainty for reduced-order substructure models to higher levels of...
