Sample records for validated all-QSAR models

  1. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur


In the philosophy of science, interest in computational models and simulations has increased considerably during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety… of models has been somewhat narrow-minded, reducing the notion of validation to the establishment of truth. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation…

  2. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin


    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  3. Perpetual Model Validation (United States)


Perpetual Model Validation. Final technical report, March 2017, covering the period 2014 – September 2016; Air Force Research Laboratory in-house effort. Topics: trustworthy architectures for systems of systems; modeling, assessment, and vulnerability analysis; assessment and measurement for end-to-end system analysis. Approved for public release; distribution unlimited.

  4. Characteristic Time Model Validation (United States)


Characteristic Time Model Validation (unclassified). Final technical report. Tallio; Prior, R.C., Jr.; and Mellor, A.M. U.S. Army Research Office contract, Research Triangle Park, NC 27709-2211. Keywords: two-dimensional confined shear layers; two-dimensional prefilming airblast atomizers; characteristic time model; finite…

  5. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan


Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions requires developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence-building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process, not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein, and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). A hierarchical approach is proposed to make this determination. This approach is based on computing five measures or metrics and following a decision tree to determine whether a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study, assuming field data either consistent with the model or significantly different from the model results. In both cases it is shown how the two measures lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures, with the results indicating that they are appropriate measures for evaluating model realizations. The use of validation…
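The acceptance test for stochastic realizations described above can be sketched with hypothetical numbers; RMSE and the fixed threshold here are illustrative stand-ins, not Hassan's actual five metrics or decision tree:

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between one realization and the field data."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def acceptable_fraction(realizations, obs, threshold):
    """Fraction of stochastic realizations whose RMSE against the
    validation data falls below an acceptance threshold."""
    accepted = [r for r in realizations if rmse(r, obs) <= threshold]
    return len(accepted) / len(realizations)

# Toy example: three realizations of heads at four validation wells.
obs = [10.0, 12.0, 9.5, 11.0]
realizations = [
    [10.2, 11.8, 9.6, 11.1],   # close to the data
    [10.5, 12.4, 9.1, 10.6],   # moderately close
    [14.0, 16.0, 13.0, 15.0],  # far from the data
]
frac = acceptable_fraction(realizations, obs, threshold=0.5)
```

A decision rule would then compare `frac` against a required proportion of acceptable realizations before declaring the stochastic model adequate.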

  6. Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben


…model structure suggested by the University of Lund, the WP4 leader. This particular model structure has the advantage that it fits better into the control-design framework used by WP3-4 than the model structures previously developed in WP2. The different model structures are first summarised… Then issues of optimal experimental design are considered. Finally, the parameters are estimated in the chosen static and dynamic models and a validation is performed. Two of the static models, one of them the additive model, explain the data well. In the case of dynamic models, the suggested additive… model turns out not to be useful for prediction of the flow. Moreover, standard Box-Jenkins model structures and multiple-output autoregressive models prove superior, as they can give useful predictions of the flow…

  7. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova


In this paper, I respond to the challenge raised against contemporary experimental neurobiology, according to which the field is in a state of crisis because the multiple experimental protocols employed in different laboratories presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis in experimental neurobiology.

  8. Testing and validating environmental models (United States)

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.


    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series
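The call for explicit benchmarks can be made concrete: a minimal sketch of scoring a model against the trivial "predict the observed mean" alternative, with invented data (this is a generic skill score, not the authors' prescribed test):

```python
def rmse(pred, obs):
    """Root-mean-square error of predictions against observations."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

def skill_vs_benchmark(model_pred, obs):
    """Explicit benchmark comparison: 1.0 is a perfect model, 0.0 is no
    better than always predicting the observed mean, negative is worse."""
    mean = sum(obs) / len(obs)
    return 1.0 - rmse(model_pred, obs) / rmse([mean] * len(obs), obs)

obs = [1.0, 2.0, 3.0, 4.0, 5.0]
model_a = [1.1, 2.1, 2.9, 4.2, 4.8]   # close to the data
model_b = [3.0, 3.0, 3.0, 3.0, 3.0]   # the trivial benchmark itself
```

Reporting a score like this against a named alternative is one way to replace the vague claim of "acceptable agreement" that the abstract criticizes.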

  9. Base Flow Model Validation Project (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  10. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta


    Full Text Available The primary objective of the study was to quantitatively test the DART model, which despite being one of the most popular representations of co-creation concept was so far studied almost solely with qualitative methods. To this end, the researchers developed a multiple measurement scale and employed it in interviewing managers. The statistical evidence for adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much of conceptual overlap to be an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model where co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  11. Uncertainty Modeling Via Frequency Domain Model Validation (United States)

    Waszak, Martin R.; Andrisani, Dominick, II


The majority of literature on robust control assumes that a design model is available and that the uncertainty model bounds the actual variations about the nominal model. However, methods for generating accurate design models have not received as much attention in the literature. The influence of the level of accuracy of the uncertainty model on closed loop performance has received even less attention. The research reported herein is an initial step in applying and extending the concept of model validation to the problem of obtaining practical uncertainty models for robust control analysis and design applications. An extension of model validation called 'sequential validation' is presented and applied to a simple spring-mass-damper system to establish the feasibility of the approach and demonstrate the benefits of the new developments.
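A minimal sketch of the frequency-domain validation idea for a spring-mass-damper: check whether measured frequency-response points stay within a relative uncertainty bound of the nominal model. The bound and the perturbed "measurement" are illustrative assumptions, not the paper's sequential-validation procedure:

```python
def frf(m, c, k, w):
    """Frequency response of a spring-mass-damper: H(jw) = 1/(k - m w^2 + j c w)."""
    return 1.0 / complex(k - m * w * w, c * w)

def model_validates(measured, m, c, k, freqs, rel_bound):
    """The model is not invalidated if every measured FRF point lies within
    a relative uncertainty bound of the nominal model's response."""
    for h_meas, w in zip(measured, freqs):
        h_nom = frf(m, c, k, w)
        if abs(h_meas - h_nom) > rel_bound * abs(h_nom):
            return False
    return True

freqs = [0.5, 1.0, 2.0]
# "Measured" data from a slightly perturbed stiffness (k = 1.05 vs. nominal 1.0).
measured = [frf(1.0, 0.1, 1.05, w) for w in freqs]
ok = model_validates(measured, m=1.0, c=0.1, k=1.0, freqs=freqs, rel_bound=0.5)
```

Note that even a 5% stiffness error produces a large FRF deviation near resonance, which is why the bound must be generous there; a sequential procedure would refine such bounds frequency by frequency.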

  12. Turbulence Modeling Verification and Validation (United States)

    Rumsey, Christopher L.


    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  13. (Validity of environmental transfer models)

    Energy Technology Data Exchange (ETDEWEB)

    Blaylock, B.G.; Hoffman, F.O.; Gardner, R.H.


    BIOMOVS (BIOspheric MOdel Validation Study) is an international cooperative study initiated in 1985 by the Swedish National Institute of Radiation Protection to test models designed to calculate the environmental transfer and bioaccumulation of radionuclides and other trace substances. The objective of the symposium and workshop was to synthesize results obtained during Phase 1 of BIOMOVS (the first five years of the study) and to suggest new directions that might be pursued during Phase 2 of BIOMOVS. The travelers were an instrumental part of the development of BIOMOVS. This symposium allowed the travelers to present a review of past efforts at model validation and a synthesis of current activities and to refine ideas concerning future development of models and data for assessing the fate, effect, and human risks of environmental contaminants. R. H. Gardner also visited the Free University, Amsterdam, and the National Institute of Public Health and Environmental Protection (RIVM) in Bilthoven to confer with scientists about current research in theoretical ecology and the use of models for estimating the transport and effect of environmental contaminants and to learn about the European efforts to map critical loads of acid deposition.

  14. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering]; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems]


In this paper, a simplified yet comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels and for spatial temperature gradients in the flow direction. These gradients are investigated by direct infrared imaging, which shows that they are present even at low-current operation and should therefore be captured by a PEMFC model, since large coolant flow rates are limited by the high pressure drops they induce in the cooling channels. The computed polarization and power curves are compared directly with the experimentally measured ones, with good qualitative and quantitative agreement. The combination of accuracy and low computational time allows the future use of the model as a reliable tool for PEMFC simulation, control, design, and optimization. (author)
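The polarization and power curves mentioned above can be illustrated with a generic textbook voltage-loss expression; the functional form and all coefficients below are illustrative assumptions, not the authors' model:

```python
import math

def cell_voltage(i, e0=1.0, b=0.05, r=0.2, i0=1e-4):
    """Generic polarization curve: open-circuit voltage minus activation
    (Tafel, b*ln(i/i0)) and ohmic (r*i) losses. Hypothetical parameters."""
    return e0 - b * math.log(i / i0) - r * i

currents = [0.1, 0.5, 1.0]                            # A/cm^2, hypothetical
voltages = [cell_voltage(i) for i in currents]        # polarization curve
powers = [i * v for i, v in zip(currents, voltages)]  # power curve
```

Validation then amounts to overlaying measured (i, V) pairs on the computed curve and quantifying the discrepancy, as the paper does with commercial cells.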

  15. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]


This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate and analyze variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
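The statistical sampling route for propagating variability and randomness can be sketched as a plain Monte Carlo loop over a toy response model; the distributions and parameters below are invented for illustration:

```python
import random
import statistics

def beam_deflection(load, stiffness):
    """Toy response model: deflection = load / stiffness."""
    return load / stiffness

random.seed(0)
# Aleatoric uncertainty: load and stiffness vary randomly between nominally
# identical specimens; sample both and propagate through the model.
samples = [
    beam_deflection(random.gauss(100.0, 5.0), random.gauss(50.0, 2.0))
    for _ in range(10_000)
]
mean = statistics.fmean(samples)
spread = statistics.stdev(samples)
```

The output statistics characterize the variability of the prediction; epistemic (model-form) uncertainty, by contrast, cannot be sampled away and must be assessed by comparison with experiments.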

  16. Real-world validation of SHAC models

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, L.


    A statistical approach is proposed to validation of SHAC models. It includes a definition of validation, an explanation of its purposes, and a description of the statistical aspects of experimental design. It proposes a study to validate design codes with statistical samples of real-world systems. Also included is a summary of present SHAC validation methodologies and studies as well as recommendations for future activity.

  17. On validation of multibody musculoskeletal models

    DEFF Research Database (Denmark)

    Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper


This paper reviews the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need… for justification that the models are adequate representations of the systems they simulate. The need for a consistent terminology and established standards is identified, and knowledge from fields with a more advanced state of the art in Verification and Validation is introduced. A number of practical steps… for improving the validation of multibody musculoskeletal models are pointed out, and directions for future research in the field are proposed. It is our hope that a more structured approach to model validation can help to improve the credibility of musculoskeletal models…

  18. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam University; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield


    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  19. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez


Model Driven Engineering (MDE) is an emerging approach in software engineering. MDE emphasizes the construction of models from which the implementation is derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling gives a syntactic structure to source and target models; however, semantic requirements also have to be imposed on them. A given transformation is sound when source and target models fulfill both the syntactic and the semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and postconditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.
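A minimal sketch of the pre/postcondition idea for the ER-to-RM example, in plain Python rather than the paper's logic-programming setting; the transformation rules (e.g. "the first attribute is the key") are illustrative assumptions:

```python
def er_to_relational(entities):
    """Map each ER entity to a relation; attributes become columns and the
    first attribute is (by assumption) the primary key."""
    return {name: {"columns": list(attrs), "pk": attrs[0]}
            for name, attrs in entities.items()}

def pre(entities):
    """Precondition on the source model: every entity has at least one attribute."""
    return all(len(attrs) >= 1 for attrs in entities.values())

def post(entities, schema):
    """Postcondition on the target model: one relation per entity, and every
    primary key appears among that relation's columns."""
    return (set(schema) == set(entities) and
            all(rel["pk"] in rel["columns"] for rel in schema.values()))

entities = {"Student": ["student_id", "name"], "Course": ["course_id", "title"]}
schema = er_to_relational(entities) if pre(entities) else None
```

A transformation is sound, in the abstract's sense, exactly when `pre` on the source implies `post` on the produced target.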

  20. Validating EHR clinical models using ontology patterns. (United States)

    Martínez-Costa, Catalina; Schulz, Stefan


Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation.
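The closed-world "Shape" idea can be illustrated with a hand-rolled analogue in Python; this is not SHACL or the CIMI patterns, just a sketch of cardinality and datatype checking under the Closed World Assumption, with invented property names:

```python
# A "shape" lists expected properties with cardinalities and datatypes;
# a record conforms only if every constraint is satisfied (closed world:
# absent values count against min_count rather than being unknown).
shape = {
    "systolic_pressure": {"datatype": float, "min_count": 1, "max_count": 1},
    "unit":              {"datatype": str,   "min_count": 1, "max_count": 1},
}

def conforms(record, shape):
    """Validate a record (property -> list of values) against a shape."""
    for prop, c in shape.items():
        values = record.get(prop, [])
        if not (c["min_count"] <= len(values) <= c["max_count"]):
            return False
        if not all(isinstance(v, c["datatype"]) for v in values):
            return False
    return True

good = {"systolic_pressure": [120.0], "unit": ["mmHg"]}
bad = {"systolic_pressure": [120.0, 130.0]}  # cardinality violation, no unit
```

Real SHACL expresses the same constraints as RDF shapes (`sh:path`, `sh:datatype`, `sh:minCount`, `sh:maxCount`) evaluated over RDF graphs.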

  1. Base Flow Model Validation Project (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  2. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

In this paper, a review is presented of the various methods which are available for performing a systematic comparison and correlation between two sets of vibration data. In the present case, the application of interest is in conducting this correlation process as a prelude to model correlation or updating activity…

  3. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    In this paper, a review is presented of the various methods which are available for the purpose of performing a systematic comparison and correlation between two sets of vibration data. In the present case, the application of interest is in conducting this correlation process as a prelude to model correlation or updating activity ...
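A standard measure used for exactly this kind of correlation between two sets of vibration data is the Modal Assurance Criterion (MAC); a minimal sketch with made-up mode shapes (the review covers this and other methods, so this is one representative example, not its full scope):

```python
def mac(phi_a, phi_e):
    """Modal Assurance Criterion between an analytical and an experimental
    mode shape: 1.0 means perfectly correlated, 0.0 means orthogonal."""
    num = sum(a * e for a, e in zip(phi_a, phi_e)) ** 2
    den = sum(a * a for a in phi_a) * sum(e * e for e in phi_e)
    return num / den

phi_analytical = [1.0, 0.8, 0.3]     # hypothetical FE mode shape
phi_experimental = [0.98, 0.83, 0.27]  # hypothetical test mode shape
m = mac(phi_analytical, phi_experimental)
```

A MAC matrix over all mode pairs is the usual prelude to model updating: well-correlated pairs (MAC near 1 on the diagonal) are matched before any parameter correction is attempted.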

  4. Numerical model representation and validation strategies

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.


This paper describes model representation and validation strategies for use in numerical tools that define models in terms of topology, geometry, or topography. Examples of such tools include Computer-Assisted Engineering (CAE), Computer-Assisted Manufacturing (CAM), Finite Element Analysis (FEA), and Virtual Environment Simulation (VES) tools. These tools represent either physical objects or conceptual ideas using numerical models for the purpose of posing a question, performing a task, or generating information. Dependence on these numerical representations requires that models be precise, consistent across different applications, and verifiable. This paper describes a strategy for ensuring precise, consistent, and verifiable numerical model representations in a topographic framework. The main assertion put forth is that topographic model descriptions are more appropriate for numerical applications than topological or geometrical descriptions. A topographic model verification and validation methodology is presented.

  5. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro


In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
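Two common quantitative performance measures for a logistic regression model, accuracy and AUC, can be computed on held-out predictions; the sketch below uses invented data, and these are generic measures rather than necessarily the ones the paper defines:

```python
def auc(scores, labels):
    """Area under the ROC curve via pairwise comparison: the probability
    that a randomly chosen positive outscores a randomly chosen negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Held-out predicted probabilities and true outcomes (hypothetical data).
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0]
accuracy = sum((s >= 0.5) == bool(y) for s, y in zip(scores, labels)) / len(labels)
```

Computing such measures on data not used for fitting is the essence of a validation procedure: in-sample figures alone overstate performance.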

  6. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.


Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  7. Global precipitation measurements for validating climate models (United States)

    Tapiador, F. J.; Navarro, A.; Levizzani, V.; García-Ortega, E.; Huffman, G. J.; Kidd, C.; Kucera, P. A.; Kummerow, C. D.; Masunaga, H.; Petersen, W. A.; Roca, R.; Sánchez, J.-L.; Tao, W.-K.; Turk, F. J.


The advent of global precipitation data sets with increasing temporal span has made it possible to use them for validating climate models. In order to fulfill the requirement of global coverage, existing products integrate satellite-derived retrievals from many sensors with direct ground observations (gauges, disdrometers, radars), which are used as reference for the satellites. While the resulting product can be deemed the best-available source of quality validation data, awareness of the limitations of such data sets is important to avoid drawing wrong or unsubstantiated conclusions when assessing climate model abilities. This paper provides guidance on the use of precipitation data sets for climate research, including model validation and verification for improving physical parameterizations. The strengths and limitations of the data sets for climate modeling applications are presented, and a protocol for quality assurance of both observational databases and models is discussed. The paper elaborates on the recent IPCC AR5 acknowledgment of large observational uncertainties in precipitation observations for climate model validation.
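Typical validation statistics for comparing a modelled precipitation series with a gauge-based reference (bias, RMSE, correlation) can be sketched as follows, with hypothetical monthly values; the paper's protocol involves far more than these three numbers:

```python
import math

def validation_stats(model, obs):
    """Bias, RMSE and Pearson correlation between modelled and observed series."""
    n = len(obs)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    var_m = sum((m - mm) ** 2 for m in model)
    var_o = sum((o - mo) ** 2 for o in obs)
    return bias, rmse, cov / math.sqrt(var_m * var_o)

# Monthly precipitation (mm): hypothetical model output vs. gauge reference.
model = [80.0, 120.0, 60.0, 30.0]
obs = [75.0, 110.0, 65.0, 25.0]
bias, rmse, r = validation_stats(model, obs)
```

The caution in the abstract applies directly: since `obs` itself carries observational uncertainty, a nonzero bias or RMSE cannot automatically be attributed to model error.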

  8. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran


The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen Metro became operational in autumn 2002. We observed that forecasts from the demand sub-models agree well with the data from the 2000 national travel survey, with the mode choice forecasts in particular being a good match with the observed modal split. The results of the 2000 car assignment model… the model's base trip matrices. Second, a dialogue between researchers and the Ministry of Transport has been initiated to discuss the need to upgrade the Copenhagen model, e.g. switching to an activity-based paradigm and improving assignment procedures…

  9. Using Model Checking to Validate AI Planner Domain Models (United States)

    Penix, John; Pecheur, Charles; Havelund, Klaus


    This report describes an investigation into using model checking to assist validation of domain models for the HSTS planner. The planner models are specified using a qualitative temporal interval logic with quantitative duration constraints. We conducted several experiments to translate the domain modeling language into the SMV, Spin and Murphi model checkers. This allowed a direct comparison of how the different systems would support specific types of validation tasks. The preliminary results indicate that model checking is useful for finding faults in models that may not be easily identified by generating test plans.
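The core of the model-checking approach, exhaustive search of the reachable states for an invariant violation, can be sketched in a few lines; the toy "domain" is invented, and SMV, Spin and Murphi of course implement vastly more (temporal logics, symbolic representations, partial-order reduction):

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explicit-state checking of a safety property: breadth-first search of
    the reachable states, returning the first state that violates the
    invariant, or None if the model is safe."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return state  # a counterexample state exposing a modeling fault
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None

# Toy planner domain: a counter that two actions increment by 1 or 3;
# the (faulty) domain model claims the counter never reaches 7.
succ = lambda s: [s + 1, s + 3] if s < 10 else []
violation = check_invariant(0, succ, lambda s: s != 7)
```

The counterexample returned here plays the same role as the faults the report describes: a state (or trace) demonstrating that the domain model violates a property, which generating test plans might never have exposed.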

  10. Validation of Hadronic Models in GEANT4

    Energy Technology Data Exchange (ETDEWEB)

Koi, Tatsumi; Wright, Dennis H. (SLAC); Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai (CERN); Heikkinen, Aatos (Helsinki Inst. of Phys.); Truscott; Lei, Fan (QinetiQ); Wellisch, Hans-Peter


Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models covering the range from thermal neutron interactions to ultrarelativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is found between Monte Carlo predictions and experimental data; however, several problems have been detected which require some improvement in the models.

  11. Modeling the construct validity of the Berlin Intelligence Structure Model


    Süß, Heinz-Martin; Beauducel, André


    The Berlin Intelligence Structure Model is a hierarchical, faceted model originally based on an almost representative sample of tasks found in the literature. It is therefore an integrative model with a high degree of generality. The present paper investigates the construct validity of this model by using different confirmatory factor analysis models. The results show that the model assumptions are supported only in part by the data. Moreover,...

  12. Validation of Space Weather Models at Community Coordinated Modeling Center (United States)

    Kuznetsova, M. M.; Hesse, M.; Pulkkinen, A.; Maddox, M.; Rastaetter, L.; Berrios, D.; Zheng, Y.; MacNeice, P. J.; Shim, J.; Taktakishvili, A.; hide


    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership to support the research and developmental work necessary to substantially increase space weather modeling capabilities and to facilitate advanced models deployment in forecasting operations. Space weather models and coupled model chains hosted at the CCMC range from the solar corona to the Earth's upper atmosphere. CCMC has developed a number of real-time modeling systems, as well as a large number of modeling and data products tailored to address the space weather needs of NASA's robotic missions. The CCMC conducts unbiased model testing and validation and evaluates model readiness for the operational environment. CCMC has been leading recent comprehensive modeling challenges under the GEM, CEDAR and SHINE programs. The presentation will focus on experience in carrying out comprehensive and systematic validation of large sets of space weather models.

  13. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)


    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for an extended period of time. This report describes the HCM model, its functional requirements, the model structure and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.

  14. Predictive Validation of an Influenza Spread Model (United States)

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian


    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, which depended on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability.
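
    The perturb-and-forecast logic described above (fit to one season, modify a parameter to reflect a real-world perturbation, compare the resulting curve with later observations) can be illustrated with a deliberately simplified sketch. The SIR dynamics, the vaccination-like reduction of initial susceptibles, and the summary metrics below are illustrative assumptions, not the paper's individual-based model:

```python
import numpy as np

def sir_epidemic(beta, gamma, n_days=120, s0=0.999, i0=0.001):
    """Daily new-infection curve from a discrete SIR model (population fractions)."""
    s, i = s0, i0
    new_cases = []
    for _ in range(n_days):
        inf = beta * s * i   # new infections this day
        rec = gamma * i      # recoveries this day
        s, i = s - inf, i + inf - rec
        new_cases.append(inf)
    return np.array(new_cases)

# "fitted" season (assumed parameters), then a perturbation analogous to
# higher vaccination coverage: fewer initially susceptible individuals
fitted    = sir_epidemic(beta=0.35, gamma=0.15)
perturbed = sir_epidemic(beta=0.35, gamma=0.15, s0=0.90)

peak_shift = int(np.argmax(perturbed)) - int(np.argmax(fitted))
intensity_ratio = perturbed.max() / fitted.max()
print(f"peak shifted by {peak_shift} days, peak intensity ratio {intensity_ratio:.2f}")
```

    Comparing such perturbed curves against subsequently observed seasons is what yields the deviation/error estimates the abstract refers to.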

  15. Seclazone Reactor Modeling And Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Osinga, T. [ETH-Zuerich (Switzerland); Olalde, G. [CNRS Odeillo (France); Steinfeld, A. [PSI and ETHZ (Switzerland)


    A numerical model is formulated for the SOLZINC solar chemical reactor for the production of Zn by carbothermal reduction of ZnO. The model involves solving, by the finite-volume technique, a 1D unsteady state energy equation that couples heat transfer to the chemical kinetics for a shrinking packed bed exposed to thermal radiation. Validation is accomplished by comparison with experimentally measured temperature profiles and Zn production rates as a function of time, obtained for a 5-kW solar reactor tested at PSI's solar furnace. (author)

  16. Validation of Calibrated Energy Models: Common Errors

    Directory of Open Access Journals (Sweden)

    Germán Ramos Ruiz


    Nowadays, there is growing interest in all the smart technologies that provide us with information and knowledge about the human environment. In the energy field, thanks to the amount of data received from smart meters and devices and the progress made in both energy software and computers, the quality of energy models is gradually improving and, hence, also the suitability of Energy Conservation Measures (ECMs). For this reason, the measurement of the accuracy of building energy models is an important task, because once the model is validated through a calibration procedure, it can be used, for example, to apply and study different strategies for reducing its energy consumption while maintaining human comfort. Several agencies have developed guidelines and methodologies to establish a measure of the accuracy of these models, the most widely recognized being ASHRAE Guideline 14-2014, the International Performance Measurement and Verification Protocol (IPMVP) and the Federal Energy Management Program (FEMP). This article intends to shed light on these validation measurements (uncertainty indices) by focusing on the typical mistakes made, as these errors could produce a false belief that the models used are calibrated.
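
    The uncertainty indices mentioned above can be computed directly. The sketch below implements the two most commonly cited ones, NMBE and CV(RMSE); the monthly data and the exact normalization (dividing by n - 1) are illustrative assumptions, so the guidelines themselves remain the binding reference for definitions and thresholds:

```python
import numpy as np

def nmbe(measured, simulated):
    """Normalized Mean Bias Error (%): positive when the model under-predicts."""
    m, s = np.asarray(measured, float), np.asarray(simulated, float)
    return 100.0 * np.sum(m - s) / ((len(m) - 1) * np.mean(m))

def cv_rmse(measured, simulated):
    """Coefficient of Variation of the RMSE (%): scatter of the errors."""
    m, s = np.asarray(measured, float), np.asarray(simulated, float)
    rmse = np.sqrt(np.sum((m - s) ** 2) / (len(m) - 1))
    return 100.0 * rmse / np.mean(m)

# hypothetical monthly energy use (kWh): measured vs. model output
measured  = [120, 110, 100, 90, 80, 75, 78, 82, 95, 105, 115, 125]
simulated = [118, 112, 98, 93, 79, 76, 80, 80, 97, 103, 118, 122]

print(f"NMBE     = {nmbe(measured, simulated):+.2f} %")
print(f"CV(RMSE) = {cv_rmse(measured, simulated):.2f} %")
# ASHRAE Guideline 14 calibration limits for monthly data are commonly
# cited as |NMBE| <= 5 % and CV(RMSE) <= 15 %.
```

    A typical mistake the article warns about is reporting only one of the two indices: a model can have near-zero NMBE (errors that cancel) while its CV(RMSE) reveals large scatter.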

  17. Towards policy relevant environmental modeling: contextual validity and pragmatic models (United States)

    Miles, Scott B.


    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  18. Bayes factor of model selection validates FLMP. (United States)

    Massaro, D W; Cohen, M M; Campbell, C S; Rodriguez, T


    The fuzzy logical model of perception (FLMP; Massaro, 1998) has been extremely successful at describing performance across a wide range of ecological domains as well as for a broad spectrum of individuals. An important issue is whether this descriptive ability is theoretically informative or whether it simply reflects the model's ability to describe a wider range of possible outcomes. Previous tests and contrasts of this model with others have been adjudicated on the basis of both a root mean square deviation (RMSD) for goodness-of-fit and an observed RMSD relative to a benchmark RMSD expected if the model were indeed correct. We extend the model evaluation by another technique called Bayes factor (Kass & Raftery, 1995; Myung & Pitt, 1997). The FLMP maintains its significant descriptive advantage with this new criterion. In a series of simulations, the RMSD also accurately recovered the correct model under actual experimental conditions. When additional variability was added to the results, the models continued to be recoverable. In addition to its descriptive accuracy, RMSD should not be ignored in model testing because it can be justified theoretically and provides a direct and meaningful index of goodness-of-fit. We also make the case for the necessity of free parameters in model testing. Finally, using Newton's law of universal gravitation as an analogy, we argue that it might not be valid to expect a model's fit to be invariant across the whole range of possible parameter values for the model. We advocate that model selection should be analogous to perceptual judgment, which is characterized by the optimal use of multiple sources of information (e.g., the FLMP). Conclusions about models should be based on several selection criteria.
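
    Since the abstract contrasts RMSD-based and Bayes-factor-based selection, a minimal sketch may help. Below, two hypothetical candidate fits to the same data are compared by RMSD and by a BIC-approximated Bayes factor; the Gaussian error model, the noise level, the parameter counts and the data are all invented for illustration and are not the FLMP itself:

```python
import numpy as np

def rmsd(obs, pred):
    """Root mean square deviation between observations and model predictions."""
    return np.sqrt(np.mean((np.asarray(obs) - np.asarray(pred)) ** 2))

def bic(obs, pred, k, sigma):
    """Gaussian-likelihood BIC for a model with k free parameters."""
    obs, pred = np.asarray(obs), np.asarray(pred)
    n = len(obs)
    loglik = (-0.5 * n * np.log(2 * np.pi * sigma ** 2)
              - np.sum((obs - pred) ** 2) / (2 * sigma ** 2))
    return -2 * loglik + k * np.log(n)

# hypothetical identification probabilities in a small factorial design
obs   = np.array([0.05, 0.40, 0.60, 0.95])
fit_a = np.array([0.06, 0.41, 0.58, 0.94])   # candidate model A (2 parameters)
fit_b = np.array([0.10, 0.30, 0.70, 0.90])   # candidate model B (1 parameter)

# Bayes factor in favor of A, approximated via the BIC difference
bf = np.exp(0.5 * (bic(obs, fit_b, 1, 0.05) - bic(obs, fit_a, 2, 0.05)))
print(rmsd(obs, fit_a), rmsd(obs, fit_b), bf)
```

    The BIC approximation penalizes the extra parameter of model A, which is exactly the concern the abstract raises: a flexible model must earn its descriptive advantage rather than merely fit more outcomes.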

  19. [Catalonia's primary healthcare accreditation model: a valid model]. (United States)

    Davins, Josep; Gens, Montserrat; Pareja, Clara; Guzmán, Ramón; Marquet, Roser; Vallès, Roser


    There are few experiences of accreditation models validated by primary care teams (EAP). The aim of this study was to detail the process of design, development, and subsequent validation of the consensus EAP accreditation model of Catalonia. An Operating Committee of the Health Department of Catalonia revised models proposed by the European Foundation for Quality Management, the Joint Commission International and the Institut Català de la Salut, and proposed 628 essential standards to the technical group (25 experts in primary care and quality of care) to establish consensus standards. The consensus document was piloted in 30 EAP for the purpose of validating the contents, testing standards and identifying evidence. Finally, we did a survey to assess acceptance and validation of the document. The Technical Group agreed on a total of 414 essential standards. The pilot selected a total of 379. Mean compliance with the standards of the final document in the 30 EAP was 70.4%. The results standards had the lowest fulfilment percentage. The survey showed that 83% of the EAP found it useful and 78% found the content of the accreditation manual suitable as a tool to assess the quality of the EAP and identify opportunities for improvement. On the downside, they highlighted its complexity and laboriousness. We have a model that fits the reality of the EAP and covers all relevant issues for the functioning of an excellent EAP. The model developed in Catalonia is easy to understand. Copyright © 2014. Published by Elsevier España.

  20. Model-Based Method for Sensor Validation (United States)

    Vatan, Farrokh


    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. However, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).
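
    The ARR idea can be sketched in a few lines: relations that should hold among sensor readings are evaluated, and a sensor is logically implicated when every relation it participates in is violated, with no failure probabilities involved. The sensors, relations and threshold below are invented for illustration and are not the algorithm of the report:

```python
# Minimal sketch of sensor validation via analytical redundant relations
# (ARRs). Three sensors nominally measure the same quantity, so each
# pairwise difference should be near zero. A faulty sensor violates
# exactly the ARRs it participates in, and the violation signature
# logically isolates it.

THRESH = 0.5  # residual threshold (assumed noise bound)

ARRS = {            # ARR name -> sensors involved
    "r_ab": ("a", "b"),
    "r_bc": ("b", "c"),
    "r_ac": ("a", "c"),
}

def diagnose(readings):
    violated = {name for name, (x, y) in ARRS.items()
                if abs(readings[x] - readings[y]) > THRESH}
    # a sensor is implicated if every ARR it appears in is violated
    return [s for s in readings
            if violated and all(name in violated
                                for name, pair in ARRS.items() if s in pair)]

print(diagnose({"a": 10.0, "b": 10.1, "c": 9.9}))   # all consistent
print(diagnose({"a": 10.0, "b": 13.0, "c": 9.9}))   # sensor b implicated
```

    Note that the diagnosis is a logical inference from the model and the readings, matching the abstract's contrast with probability-based approaches.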

  1. Assessment model validity document FARF31

    Energy Technology Data Exchange (ETDEWEB)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria [Kemakta Konsult AB, Stockholm (Sweden)


    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport including radioactive chain decay are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity
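
    The averaged advection-dispersion description along a stream tube can be illustrated with the textbook one-dimensional solution for a continuous step input (the leading term of the Ogata-Banks solution). This is only the classical formula with invented parameter values, not FARF31's actual implementation, which additionally handles matrix diffusion and chain decay:

```python
import math

def c_step(x, t, v, D):
    """Relative concentration C/C0 for 1D advection-dispersion with a
    continuous unit step injection at x = 0, keeping only the leading
    term of the Ogata-Banks solution:
        C/C0 ~= 0.5 * erfc((x - v*t) / (2 * sqrt(D*t)))
    """
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))

v, D, x = 10.0, 50.0, 100.0   # velocity [m/y], dispersion [m2/y], distance [m]
for t in (5.0, 10.0, 20.0):
    print(f"t = {t:5.1f} y  C/C0 = {c_step(x, t, v, D):.3f}")
```

    The breakthrough curve rises through C/C0 = 0.5 exactly when the advective front (x = v*t) passes the observation point, with the dispersion coefficient controlling how smeared the front is.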

  2. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.


    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid.

  3. A proposed best practice model validation framework for banks

    Directory of Open Access Journals (Sweden)

    Pieter J. (Riaan) de Jongh


    Background: With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. The credit crisis of 2008–2009 provoked added concern about the use of models in finance. Measuring and managing model risk has subsequently come under scrutiny from regulators, supervisors, banks and other financial institutions. Regulatory guidance indicates that meticulous monitoring of all phases of model development and implementation is required to mitigate this risk. Considerable resources must be mobilised for this purpose. The exercise must embrace model development, assembly, implementation, validation and effective governance. Setting: Model validation practices are generally patchy, disparate and sometimes contradictory, and although the Basel Accord and some regulatory authorities have attempted to establish guiding principles, no definite set of global standards exists. Aim: To assess the available literature for best validation practices. Methods: This comprehensive literature study provided a background to the complexities of effective model management and focussed on model validation as a component of model risk management. Results: We propose a coherent ‘best practice’ framework for model validation. Scorecard tools are also presented to evaluate whether the proposed best practice model validation framework has been adequately assembled and implemented. Conclusion: The proposed best practice model validation framework is designed to assist firms in the construction of an effective, robust and fully compliant model validation programme and comprises three principal elements: model validation governance, policy and process.

  4. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman


    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  5. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm


    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  6. Empirical data validation for model building (United States)

    Kazarian, Aram


    Optical Proximity Correction (OPC) has become an integral and critical part of process development for advanced technologies with challenging k1 requirements. OPC solutions in turn require stable, predictive models to be built that can project the behavior of all structures. These structures must comprehend all geometries that can occur in the layout in order to define the optimal corrections by feature, and thus enable a manufacturing process with acceptable margin. The model is built upon two main component blocks. First is knowledge of the process conditions, which includes the optical parameters (e.g. illumination source, wavelength, lens characteristics, etc.) as well as mask definition, resist parameters and process film stack information. Second is the empirical critical dimension (CD) data collected using this process on specific test features, the results of which are used to fit and validate the model and to project resist contours for all allowable feature layouts. The quality of the model therefore is highly dependent on the integrity of the process data collected for this purpose. Since the test pattern suite generally extends to below the resolution limit that the process can support with adequate latitude, the CD measurements collected can often be quite noisy with marginal signal-to-noise ratios. In order for the model to be reliable and a best representation of the process behavior, it is necessary to scrutinize empirical data to ensure that it is not dominated by measurement noise or flyer/outlier points. The primary approach for generating a clean, smooth and dependable empirical data set should be a replicated measurement sampling that can help to statistically reduce measurement noise by averaging. However, it can often be impractical to collect the amount of data needed to ensure a clean data set by this method. An alternate approach is studied in this paper to further smooth the measured data by means of curve fitting to identify remaining
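
    The replicated-measurement cleaning step described above can be sketched as follows; the robust z-score rule, the cutoff and the CD values are illustrative assumptions, not the author's procedure:

```python
import statistics

def clean_cd(replicates, z_cut=2.5):
    """Average replicated CD measurements after discarding flyers.

    Replicates farther than z_cut robust z-scores from the median
    (on the median-absolute-deviation scale) are treated as outliers.
    Returns the mean of the kept replicates and the rejection count.
    """
    med = statistics.median(replicates)
    mad = statistics.median(abs(r - med) for r in replicates) or 1e-12
    kept = [r for r in replicates if abs(r - med) / (1.4826 * mad) <= z_cut]
    return statistics.mean(kept), len(replicates) - len(kept)

# hypothetical CD replicates in nm; 55.0 is a flyer
cd, n_rejected = clean_cd([45.1, 44.8, 45.3, 44.9, 55.0, 45.0])
print(f"clean CD = {cd:.2f} nm  (rejected {n_rejected})")
```

    Averaging after robust outlier rejection is what lets the replicated sampling reduce measurement noise without letting a single flyer distort the fitted model.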

  7. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte


    Many statistical models in ecology follow the state space paradigm. For such models, the important step of model validation rarely receives as much attention as estimation or hypothesis testing, perhaps due to lack of available algorithms and software. Model validation is often based on a naive...... for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and whether observations...
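
    One-step prediction residuals of the kind advocated here can be computed for a simple linear Gaussian state-space model with a Kalman filter: for a correctly specified model, the standardized residuals should behave like independent N(0,1) draws. This toy AR(1)-plus-noise example is only an illustration of the idea, not the Template Model Builder workflow used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# simulate an AR(1) state with observation noise:
#   x_t = a * x_{t-1} + w_t,  y_t = x_t + v_t
a, q, r, n = 0.8, 0.5, 0.3, 2000
x = np.zeros(n); y = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = x[t] + rng.normal(0, np.sqrt(r))

# Kalman filter one-step predictions; standardized residuals ~ N(0,1)
m, P = 0.0, 1.0
resid = []
for t in range(1, n):
    m_pred, P_pred = a * m, a * a * P + q        # predict
    S = P_pred + r                               # innovation variance
    resid.append((y[t] - m_pred) / np.sqrt(S))   # standardized residual
    K = P_pred / S                               # update
    m = m_pred + K * (y[t] - m_pred)
    P = (1 - K) * P_pred

resid = np.array(resid)
print(f"mean = {resid.mean():+.3f}, std = {resid.std():.3f}")
```

    Departures of these residuals from zero mean, unit variance or independence indicate model misspecification, which is exactly the check that naive residuals fail to provide for state-space models.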

  8. Validation of the measure automobile emissions model : a statistical analysis (United States)


    The Mobile Emissions Assessment System for Urban and Regional Evaluation (MEASURE) model provides an external validation capability for the hot stabilized option; the model is one of several new modal emissions models designed to predict hot stabilized e...

  9. Dental models made with an intraoral scanner: A validation study.

    NARCIS (Netherlands)

    Cuperus, A.M.; Harms, M.C.; Rangel, F.A.; Bronkhorst, E.M.; Schols, J.G.J.H.; Breuning, K.H.


    INTRODUCTION: Our objectives were to determine the validity and reproducibility of measurements on stereolithographic models and 3-dimensional digital dental models made with an intraoral scanner. METHODS: Ten dry human skulls were scanned; from the scans, stereolithographic models and digital

  10. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)


    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
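
    The annualized prediction error used in the comparison reduces to a simple percentage difference of annual totals. The monthly figures below are invented for illustration and are not the report's field data:

```python
def annualized_error(modeled_kwh, measured_kwh):
    """Annualized prediction error (%): positive means the model
    over-predicts total annual generation."""
    return 100.0 * (sum(modeled_kwh) - sum(measured_kwh)) / sum(measured_kwh)

# hypothetical monthly generation (MWh) for a fixed-tilt system
measured = [95, 105, 130, 150, 165, 170, 175, 168, 140, 120, 98, 88]
modeled  = [93, 108, 128, 152, 162, 173, 172, 170, 138, 123, 100, 86]

err = annualized_error(modeled, measured)
print(f"annualized error = {err:+.2f} %")
```

    Because monthly errors of opposite sign cancel in the annual total, a small annualized error (as in the report's sub-3% fixed-tilt results) should be read alongside the seasonal biases the report discusses.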

  11. Design and Development Research: A Model Validation Case (United States)

    Tracey, Monica W.


    This is a report of one case of a design and development research study that aimed to validate an overlay instructional design model incorporating the theory of multiple intelligences into instructional systems design. After design and expert review model validation, The Multiple Intelligence (MI) Design Model, used with an Instructional Systems…

  12. Modal testing for model validation of structures with discrete nonlinearities

    National Research Council Canada - National Science Library

    Ewins, D J; Weekes, B; delli Carri, A


    Model validation using data from modal tests is now widely practiced in many industries for advanced structural dynamic design analysis, especially where structural integrity is a primary requirement...

  13. Validation of 2D flood models with insurance claims (United States)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika


    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation, flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model with insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data in a high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentations reporting flooded areas and a dataset of insurance claims. In three out of four test cases, the model fit relating to insurance claims is slightly lower than the model fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics using insurance claims can be compared to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
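
    Conventional validation metrics for binary inundation maps can be computed from a contingency table of hits, misses and false alarms. The abstract does not name the specific metrics used in the study, so the three common scores and the toy masks below are assumptions for illustration:

```python
import numpy as np

def validation_metrics(simulated, observed):
    """Binary contingency metrics comparing a simulated inundation mask
    against observed flooded cells (e.g. derived from event maps or
    geocoded insurance claims)."""
    sim, obs = np.asarray(simulated, bool), np.asarray(observed, bool)
    hits         = np.sum(sim & obs)
    misses       = np.sum(~sim & obs)
    false_alarms = np.sum(sim & ~obs)
    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    csi = hits / (hits + misses + false_alarms)   # critical success index
    return pod, far, csi

# toy cell-by-cell masks: 1 = flooded
sim = [1, 1, 1, 0, 0, 1, 0, 0]
obs = [1, 1, 0, 1, 0, 1, 0, 0]
pod, far, csi = validation_metrics(sim, obs)
print(f"POD={pod:.2f} FAR={far:.2f} CSI={csi:.2f}")
```

    Scoring the same simulated mask once against documented flooded areas and once against claim locations gives the two sets of metrics whose comparison the study reports.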

  14. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; van 't Veld, Aart A.; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)


    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
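    The workflow described in this record, a LASSO-type model assessed by double (nested) cross-validation plus a permutation test, can be sketched with scikit-learn. The data, the L1-penalized logistic model, and the hyper-parameter grid below are invented stand-ins, not the paper's xerostomia dataset:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (GridSearchCV, cross_val_score,
                                     permutation_test_score)
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 20))                               # e.g. dose-volume features
y = (X[:, 0] + 0.5 * rng.normal(size=80) > 0).astype(int)   # complication yes/no

# inner loop: select the L1 penalty strength (a LASSO-type NTCP model)
inner = GridSearchCV(
    make_pipeline(StandardScaler(),
                  LogisticRegression(penalty="l1", solver="liblinear")),
    {"logisticregression__C": [0.01, 0.1, 1.0]},
    cv=3, scoring="roc_auc")

# outer loop of the double cross-validation (repeat with different
# shuffles to see the instability the paper reports)
auc = cross_val_score(inner, X, y, cv=5, scoring="roc_auc")

# permutation test: is the AUC significantly better than chance?
score, perm_scores, pvalue = permutation_test_score(
    inner, X, y, cv=5, scoring="roc_auc",
    n_permutations=20, random_state=0)
```

    The spread of `auc` across outer folds exposes model instability; `pvalue` gives the significance of the performance, as recommended in the abstract.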

  15. Validity maintenance in semantic feature modeling

    NARCIS (Netherlands)

    Bidarra de Almeida, A.R.E.


    Feature modeling has the ability to associate functional and engineering information to shape information in a product model. Current feature modeling systems, however, are still rather tied to techniques of conventional geometric modeling systems, offering limited facilities for defining feature

  16. Prospects and problems for standardizing model validation in systems biology. (United States)

    Gross, Fridolin; MacLeod, Miles


    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary collaboration, and model exchange, and would be especially relevant for applications close to medical practice. However, even though the production of predictively valid models is considered a central goal, in practice modeling in systems biology employs a variety of model structures and model-building practices. These serve a variety of purposes, many of which are heuristic and do not seem to require strict validation criteria and may even be restricted by them. Moreover, given the current situation in systems biology, implementing a validation standard would face serious technical obstacles, mostly due to the quality of available empirical data. We advocate a cautious approach to standardization. Even though rigorous standardization seems premature at this point, raising the issue helps us develop better insights into the practices of systems biology and the technical problems modelers face in validating models. Further, it allows us to identify certain technical validation issues which hold regardless of modeling context and purpose. Informal guidelines could in fact play a role in the field by helping modelers handle these. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio


    with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow...

  18. Validation of mentorship model for newly qualified professional ...

    African Journals Online (AJOL)

    Newly qualified professional nurses (NQPNs) allocated to community health care services require the use of validated model to practice independently. Validation was done to adapt and assess if the model is understood and could be implemented by NQPNs and mentors employed in community health care services.

  19. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.


    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight in scalability and performance of future deployed networks. Because validated models of key Cisco equipment

  20. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to grant credits to students who came with experience from working life....

  1. Using virtual reality to validate system models

    Energy Technology Data Exchange (ETDEWEB)

    Winter, V.L.; Caudell, T.P.


    To date most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks), or informal in the case of code inspections. The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

  2. Alternative Models of Collegiate Business Education: Their Validity and Implications. (United States)

    Van Auken, Stuart; And Others


    Two models of management education are examined: the academic model, which treats the field of business as a science; and the professional model, which is responsive to the perceived needs of the business community. A study investigated the models' validity within the context of existing programs by surveying 268 program deans about their beliefs…

  3. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining .... obtained, program design and development procedures .... instrument. Cronbach's alpha was used to calculate the questionnaire's reliability and validity. For analyzing the present research's information, both descriptive ...

  4. Validation study of the mine fire simulation model

    Energy Technology Data Exchange (ETDEWEB)

    Wala, A.M.; Dziurzynski, W.; Tracz, J.; Wooton, D.


    The purpose of this paper is to present validation studies of the mine-fire simulator using data gathered from an actual mine fire which occurred in November 1991 at the Pattiki Mine. This study evaluates the suitability of the computer software package for modeling underground fires. The paper also discusses the importance of the on-line monitoring system for the validation process.

  5. Adolescent Personality: A Five-Factor Model Construct Validation (United States)

    Baker, Spencer T.; Victor, James B.; Chambers, Anthony L.; Halverson, Jr., Charles F.


    The purpose of this study was to investigate convergent and discriminant validity of the five-factor model of adolescent personality in a school setting using three different raters (methods): self-ratings, peer ratings, and teacher ratings. The authors investigated validity through a multitrait-multimethod matrix and a confirmatory factor…

  6. Experimental Validation of Flow Force Models for Fast Switching Valves

    DEFF Research Database (Denmark)

    Bender, Niels Christian; Pedersen, Henrik Clemmensen; Nørgård, Christian


    to compare and validate different models, where an effort is directed towards capturing the fluid squeeze effect just before material on material contact. The test data is compared with simulation data relying solely on analytic formulations. The general dynamics of the plunger is validated...

  7. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.


    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  8. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  9. Ion channel model development and validation (United States)

    Nelson, Peter Hugo


    The structure of the KcsA ion channel selectivity filter is used to develop three simple models of ion channel permeation. The quantitative predictions of the knock-on model are tested by comparison with experimental data from single-channel recordings of the KcsA channel. By comparison with experiment, students discover that the knock-on model can't explain saturation of ion channel current as the concentrations of the bathing solutions are increased. By inverting the energy diagram, students derive the association-dissociation model of ion channel permeation. This model predicts non-linear Michaelis-Menten saturating behavior that requires students to perform non-linear least-squares fits to the experimental data. This is done using Excel's solver feature. Students discover that this simple model does an excellent job of explaining the qualitative features of ion channel permeation but cannot account for changes in voltage sensitivity. The model is then extended to include an electrical dissociation distance. This rapid translocation model is then compared with experimental data from a wide variety of ion channels and students discover that this model also has its limitations. Support from NSF DUE 0836833 is gratefully acknowledged.
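    The Michaelis-Menten fit described in this record, done by students with Excel's solver, can equally be done with a non-linear least-squares routine. A minimal sketch with illustrative data (not the actual single-channel recordings):

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(c, i_max, k_m):
    """Saturating single-channel current: i = i_max * c / (K_m + c)."""
    return i_max * c / (k_m + c)

# invented concentration/current data standing in for KcsA recordings
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0])       # mM
current = michaelis_menten(conc, 20.0, 8.0) \
    + np.array([0.3, -0.2, 0.1, -0.4, 0.2, -0.1, 0.3])          # pA, "noise"

# non-linear least-squares fit, as the Excel solver step would do
(i_max, k_m), cov = curve_fit(michaelis_menten, conc, current,
                              p0=[10.0, 5.0])
```

    The fitted `i_max` and `k_m` recover the saturating behavior the knock-on model cannot reproduce.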

  10. A validated physical model of greenhouse climate.

    NARCIS (Netherlands)

    Bot, G.P.A.


    In the greenhouse model the instantaneous environmental crop growth factors are calculated as output, together with the physical behaviour of the crop. The boundary conditions for this model are the outside weather conditions; other inputs are the physical characteristics of the crop, of the

  11. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C


    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  12. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle


    Full Text Available This process describes how the model is exercised and evaluated for its intended use (12th INCOSE SA Systems Engineering Conference, ISBN 978-0-620-72719-8). Conceptual model validation is concerned with validating that the conceptual model... as input to the model and to create the Conditional Probability Tables (CPTs). These include the expert knowledge data and CmSim2 simulation data. Inferencing - when the model is exercised for the different use cases in a what-if analysis. The different...

  13. Model-driven description and validation of composite learning content


    Melia, Mark; Pahl, Claus


    Authoring of learning content for courseware systems is a complex activity requiring the combination of a range of design and validation techniques. We introduce the CAVIAr courseware models allowing for learning content description and validation. Model-based representation and analysis of different concerns such as the subject domain, learning context, resources and instructional design used are key contributors to this integrated solution. Personalised learning is particularly difficult to...

  14. HEDR model validation plan. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.


    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model.


    Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

  16. Development and validation of model for sand

    Directory of Open Access Journals (Sweden)

    Church P.


    Full Text Available There is a growing requirement within QinetiQ to develop models for assessments when there is very little experimental data. A theoretical approach to developing equations of state for geological materials has been developed using Quantitative Structure Property Modelling based on the Porter-Gould model approach. This has been applied to well-controlled sand with different moisture contents and particle shapes. The Porter-Gould model describes an elastic response and gives good agreement at high impact pressures with experiment, indicating that the response under these conditions is dominated by the molecular response. However at lower pressures the compaction behaviour is dominated by a micro-mechanical response which drives the need for additional theoretical tools and experiments to separate the volumetric and shear compaction behaviour. The constitutive response is fitted to existing triaxial cell data and Quasi-Static (QS) compaction data. This data is then used to construct a model in the hydrocode. The model shows great promise in predicting plate impact, Hopkinson bar, fragment penetration and residual velocity of fragments through a finite thickness of sand.

  17. Contact Modelling in Resistance Welding, Part II: Experimental Validation

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels


    Contact algorithms in resistance welding presented in the previous paper are experimentally validated in the present paper. In order to verify the mechanical contact algorithm, two types of experiments, i.e. sandwich upsetting of circular, cylindrical specimens and compression tests of discs...... with a solid ring projection towards a flat ring, are carried out at room temperature. The complete algorithm, involving not only the mechanical model but also the thermal and electrical models, is validated by projection welding experiments. The experimental results are in satisfactory agreement...... with the simulation prediction, showing the validity of the algorithm....

  18. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been

  19. Validation of Model Forecasts of the Ambient Solar Wind (United States)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.


    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD-based models will supplant semi-empirical potential-based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  20. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois


    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  1. Distinguishing Valid from Invalid Causal Indicator Models (United States)

    Cadogan, John W.; Lee, Nick


    In this commentary from Issue 14, n3, authors John Cadogan and Nick Lee applaud the paper by Aguirre-Urreta, Rönkkö, and Marakas "Measurement: Interdisciplinary Research and Perspectives", 14(3), 75-97 (2016), since their explanations and simulations work toward demystifying causal indicator models, which are often used by scholars…

  2. Experiments for foam model development and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.


    A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

  3. Finite element model validation of bridge based on structural health monitoring—Part II: Uncertainty propagation and model validation

    Directory of Open Access Journals (Sweden)

    Xiaosong Lin


    Full Text Available Because of uncertainties involved in modeling, construction, and measurement systems, the assessment of FE model validity must be conducted based on stochastic measurements to provide designers with confidence for further applications. In this study, based on the updated model using response surface methodology, a practical model validation methodology via uncertainty propagation is presented. Several criteria of testing/analysis correlation are introduced, and the sources of model and testing uncertainties are also discussed. After that, the Monte Carlo stochastic finite element (FE) method is employed to perform the uncertainty quantification and propagation. The proposed methodology is illustrated with the examination of the validity of a large-span prestressed concrete continuous rigid frame bridge monitored under operational conditions. It can be concluded that the calculated frequencies and vibration modes of the updated FE model of Xiabaishi Bridge are consistent with the measured ones. The relative errors of each frequency are all less than 3.7%. Meanwhile, the overlap ratio indexes of each frequency are all more than 75%, and the MAC values of each calculated vibration mode are all more than 90%. The model of Xiabaishi Bridge is valid in the whole operation space including the experimental design space, and its confidence level exceeds 95%. The validated FE model of Xiabaishi Bridge can reflect the current condition of Xiabaishi Bridge, and can also be used as a basis for bridge health monitoring, damage identification and safety assessment.
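    The MAC values reported in this record compare measured and FE-calculated mode shapes. A minimal sketch of the modal assurance criterion for real-valued mode shapes, with invented vectors (the bridge data are not public):

```python
import numpy as np

def mac(phi_a, phi_t):
    """Modal Assurance Criterion between two real-valued mode-shape
    vectors: 1 for perfectly correlated shapes, 0 for orthogonal ones."""
    num = np.abs(phi_a @ phi_t) ** 2
    return num / ((phi_a @ phi_a) * (phi_t @ phi_t))

# toy mode shapes: FE prediction is a scaled, slightly perturbed
# copy of the "measured" shape, so MAC should be close to 1
phi_measured = np.array([0.1, 0.4, 0.9, 1.0])
phi_fe = 1.05 * phi_measured + np.array([0.0, 0.01, -0.02, 0.01])
m = mac(phi_measured, phi_fe)
```

    MAC is insensitive to the scaling of either shape, which is why it is preferred over a direct vector comparison when test and FE normalizations differ.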

  4. Validation of an Efficient Outdoor Sound Propagation Model Using BEM

    DEFF Research Database (Denmark)

    Quirós-Alpera, S.; Henriquez, Vicente Cutanda; Jacobsen, Finn


    An approximate, simple and practical model for prediction of outdoor sound propagation exists based on ray theory, diffraction theory and Fresnel-zone considerations [1]. This model, which can predict sound propagation over non-flat terrain, has been validated for combinations of flat ground, hills...... and barriers, but it still needs to be validated for configurations that involve combinations of valleys and barriers. In order to do this a boundary element model has been implemented in MATLAB to serve as a reliable reference....

  5. Reverse electrodialysis : A validated process model for design and optimization

    NARCIS (Netherlands)

    Veerman, J.; Saakes, M.; Metz, S. J.; Harmsen, G. J.


    Reverse electrodialysis (RED) is a technology to generate electricity using the entropy of the mixing of sea and river water. A model is made of the RED process and validated experimentally. The model is used to design and optimize the RED process. It predicts very small differences between counter-

  6. Landslide Tsunami Generation Models: Validation and Case Studies (United States)

    Watts, P.; Grilli, S. T.; Kirby, J. T.; Fryer, G. J.; Tappin, D. R.


    There has been a proliferation of landslide tsunami generation and propagation models in recent time, spurred largely by the 1998 Papua New Guinea event. However, few of these models or techniques have been carefully validated. Moreover, few of these models have proven capable of integrating the best available geological data and interpretations into convincing case studies. The Tsunami Open and Progressive Initial Conditions System (TOPICS) rapidly provides approximate landslide tsunami sources for tsunami propagation models. We present 3D laboratory experiments and 3D Boundary Element Method simulations that validate the tsunami sources given by TOPICS. Geowave is a combination of TOPICS with the fully nonlinear and dispersive Boussinesq model FUNWAVE, which has been the subject of extensive testing and validation over the course of the last decade. Geowave is currently a tsunami community model made available to all tsunami researchers on the web site. We validate Geowave with case studies of the 1946 Unimak, Alaska, the 1994 Skagway, Alaska, and the 1998 Papua New Guinea events. The benefits of Boussinesq wave propagation over traditional shallow water wave models are very apparent for these relatively steep and nonlinear waves. For the first time, a tsunami community model appears sufficiently powerful to reproduce all observations and records with the first numerical simulation. This can only be accomplished by first assembling geological data and interpretations into a reasonable tsunami source.

  7. Validating Finite Element Models of Assembled Shell Structures (United States)

    Hoff, Claus


    The validation of finite element models of assembled shell elements is presented. The topics include: 1) Problems with membrane rotations in assembled shell models; 2) Penalty stiffness for membrane rotations; 3) Physical stiffness for membrane rotations using shell elements with 6 dof per node; and 4) Connections avoiding rotations.

  8. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.


    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  9. A Formal Approach to Empirical Dynamic Model Optimization and Validation (United States)

    Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.


    A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.
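    The estimation principle in this record, choosing the parameter realization that maximizes the smallest requirement-compliance margin, can be sketched generically. The toy model and the two requirements below are invented stand-ins, not the F-16 short-period example:

```python
import numpy as np
from scipy.optimize import minimize

# toy empirical model: y = p0 * exp(-p1 * t), with synthetic "data"
t = np.linspace(0.0, 2.0, 20)
y_obs = 2.0 * np.exp(-0.9 * t)

def margins(p):
    """Requirement-compliance margins; positive means the requirement holds."""
    err = np.abs(p[0] * np.exp(-p[1] * t) - y_obs)
    m_time = 0.05 - err.max()           # time-domain error within 0.05
    m_gain = 0.5 - abs(p[0] - 2.0)      # static gain within 2.0 +/- 0.5
    return np.array([m_time, m_gain])

# maximize the smallest margin  <=>  minimize -min_i g_i(p)
res = minimize(lambda p: -margins(p).min(),
               x0=np.array([1.0, 0.5]), method="Nelder-Mead")
p_best = res.x
```

    Any `p` with `margins(p).min() > 0` complies with all requirements; the set of such parameters is what the framework bounds to characterize estimation uncertainty.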

  10. On the development and validation of QSAR models. (United States)

    Gramatica, Paola


    The fundamental and more critical steps that are necessary for the development and validation of QSAR models are presented in this chapter as best practices in the field. These procedures are discussed in the context of predictive QSAR modelling that is focused on achieving models of the highest statistical quality and with external predictive power. The most important and most used statistical parameters needed to verify the real performances of QSAR models (of both linear regression and classification) are presented. Special emphasis is placed on the validation of models, both internally and externally, as well as on the need to define model applicability domains, which should be done when models are employed for the prediction of new external compounds.
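    Among the external validation parameters discussed in this line of work is an external Q². A minimal sketch of one common variant (often written Q²F1, with the sum of squares taken about the training-set mean); all numbers are illustrative:

```python
import numpy as np

def external_q2(y_train, y_ext, y_ext_pred):
    """Q2_ext = 1 - PRESS/SS, with SS computed about the training mean
    (the Q2_F1 convention); values near 1 indicate external predictivity."""
    press = np.sum((y_ext - y_ext_pred) ** 2)       # external prediction error
    ss = np.sum((y_ext - np.mean(y_train)) ** 2)    # spread about training mean
    return 1.0 - press / ss

# invented training responses and external-set predictions
y_train = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_ext = np.array([1.5, 3.5, 4.5])
y_ext_pred = np.array([1.4, 3.7, 4.3])
q2 = external_q2(y_train, y_ext, y_ext_pred)
```

    Several alternative formulations (Q²F2, Q²F3, the concordance correlation coefficient) exist and can rank models differently, which is one reason the chapter stresses reporting more than one parameter.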

  11. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)



    This workshop on turbulent viscosity models and on their experimental validation was organized by the 'convection' section of the French society of thermal engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machineries or in other energy-related applications, and have been selected for ETDE. (J.S.)

  12. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin


    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational-fluid-dynamics (CFD) spectral gas radiation models for air combustion are therefore out of their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review about fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models with high resolution spectral measurements in a gas cell with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example for a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and gives a ranking of the proposed models based on their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 µm of water vapor and carbon dioxide in the temperature range from 727 °C to 1500 °C and at different concentrations were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as a reference model for the


    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja


    Full Text Available An anomaly in legislation is absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of research, as presented in this paper, is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation enable to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for evaluation purposes of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of some feature for a specific model.

  14. Data for model validation summary report. A summary of data for validation and benchmarking of recovery boiler models

    Energy Technology Data Exchange (ETDEWEB)

    Grace, T.; Lien, S.; Schmidl, W.; Salcudean, M.; Abdullah, Z.


    One of the tasks in the project was to obtain data from operating recovery boilers for the purpose of model validation. Another task was to obtain water model data and computer output from University of British Columbia for purposes of benchmarking the UBC model against other codes. In the course of discussions on recovery boiler modeling over the course of this project, it became evident that there would be value in having some common cases for carrying out benchmarking exercises with different recovery boiler models. In order to facilitate such a benchmarking exercise, the data that was obtained on this project for validation and benchmarking purposes has been brought together in a single, separate report. The intent is to make this data available to anyone who may want to use it for model validation. The report contains data from three different cases. Case 1 is an ABBCE recovery boiler which was used for model validation. The data are for a single set of operating conditions. Case 2 is a Babcock & Wilcox recovery boiler that was modified by Tampella. In this data set, several different operating conditions were employed. The third case is water flow data supplied by UBC, along with computational output using the UBC code, for benchmarking purposes.

  15. Validation techniques of agent based modelling for geospatial simulations (United States)

    Darvishi, M.; Ahmadi, G.


    One of the most interesting aspects of modelling and simulation is describing real-world phenomena with specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviour. Studying such phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in many areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. A key challenge, however, is the difficulty of validating and verifying ABMS. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify such models with conventional validation methods. The attempt to find appropriate validation techniques for ABM therefore seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  16. Predicting third molar surgery operative time: a validated model. (United States)

    Susarla, Srinivas M; Dodson, Thomas B


    The purpose of the present study was to develop and validate a statistical model to predict third molar (M3) operative time. This was a prospective cohort study consisting of a sample of subjects presenting for M3 removal. The demographic, anatomic, and operative variables were recorded for each subject. Using an index sample of randomly selected subjects, a multiple linear regression model was generated to predict the operating time. A nonoverlapping group of randomly selected subjects (validation sample) was used to assess model accuracy. P≤.05 was considered significant. The sample was composed of 150 subjects (n) who had 450 (k) M3s removed. The index sample (n=100 subjects, k=313 M3s extracted) had a mean age of 25.4±10.0 years. The mean extraction time was 6.4±7.0 minutes. The multiple linear regression model included M3 location, Winter's classification, tooth morphology, number of teeth extracted, procedure type, and surgical experience (R2=0.58). No statistically significant differences were seen between the index sample and the validation sample (n=50, k=137) for any of the study variables. Compared with the index model, the β-coefficients of the validation model were similar in direction and magnitude for most variables. Compared with the observed extraction time for all teeth in the sample, the predicted extraction time was not significantly different (P=.16). Fair agreement was seen between the β-coefficients for our multiple models in the index and validation populations, with no significant difference in the predicted and observed operating times. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
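The modelling recipe in this entry (fit a multiple linear regression on an index sample, then check predictions on a non-overlapping validation sample) can be sketched as follows. The data and the two predictors below are invented stand-ins, not the study's variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 150 extractions with two illustrative
# predictors (teeth extracted, surgeon experience). The real study used
# M3 location, Winter's classification, tooth morphology, etc.
n = 150
X = np.column_stack([
    np.ones(n),                 # intercept
    rng.integers(1, 5, n),      # number of teeth extracted
    rng.uniform(0, 20, n),      # surgical experience, years
])
beta_true = np.array([4.0, 1.5, -0.1])
y = X @ beta_true + rng.normal(0, 1.0, n)  # operative time, minutes

# Index (training) sample vs. non-overlapping validation sample.
idx = rng.permutation(n)
train, valid = idx[:100], idx[100:]

# Fit by ordinary least squares on the index sample only.
beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)

# R^2 on the index sample, mean prediction error on the validation sample.
resid = y[train] - X[train] @ beta
r2 = 1 - np.sum(resid ** 2) / np.sum((y[train] - y[train].mean()) ** 2)
mean_err = float((X[valid] @ beta - y[valid]).mean())
```

Comparing the mean predicted and observed times on the held-out sample mirrors the paper's final check that the predicted extraction time did not differ significantly from the observed one.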

  17. Validation of the Revised WAsP Park Model

    DEFF Research Database (Denmark)

    Rathmann, Ole Steen; Hansen, Brian Ohrbeck; Leon, J.P. Murcia

    The DTU Wind Energy wind-resource model WAsP contains a wind farm wake model, Park (Park1). This Park model has been revised as Park2 to improve prediction accuracy in large wind farms, based on sound physical and mathematical principles: consistent wake modelling and perturbation theory for wake......-wake-interaction. Park2 has been validated and calibrated using a number of off-shore and on-shore wind farms. The calibration has resulted in recommended values for the wake-expansion coefficients of the Park2 model....

  18. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)


    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: • The paper discusses the validation of creep rupture models derived from statistical analysis. • It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. • The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. • The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  19. A practical guide for operational validation of discrete simulation models

    Directory of Open Access Journals (Sweden)

    Fabiano Leal


    As the number of simulation experiments increases, the need to validate and verify these models demands special attention from simulation practitioners. A review of the current scientific literature shows that the descriptions of operational validation presented in many papers do not agree on the importance assigned to this process or on the techniques applied, whether subjective or objective. To orient professionals, researchers and students in simulation, this article compiles statistical techniques into a practical guide for the operational validation of discrete simulation models. Finally, the guide's applicability was evaluated on two study objects representing two manufacturing cells, one from the automobile industry and the other from a Brazilian tech company. For each application the guide identified distinct steps, owing to the different aspects that characterize the analyzed distributions.
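As a concrete instance of the kind of objective technique such a guide compiles, here is a minimal sketch of Welch's t statistic comparing real-system output with simulation output; the throughput numbers below are hypothetical:

```python
import math
import statistics

def welch_t(real, sim):
    """Welch's t statistic for comparing the means of real-system
    observations and simulation-model output."""
    m1, m2 = statistics.mean(real), statistics.mean(sim)
    v1, v2 = statistics.variance(real), statistics.variance(sim)
    n1, n2 = len(real), len(sim)
    return (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)

# Hypothetical throughput observations (parts/hour) from a manufacturing
# cell and from its discrete-event simulation model.
real = [48.2, 51.0, 49.5, 50.3, 52.1, 47.8, 50.9, 49.4]
sim  = [49.0, 50.6, 48.7, 51.2, 50.1, 49.8, 47.9, 51.5]

t = welch_t(real, sim)
# |t| well below ~2 gives no evidence that the model's mean output
# differs from the real system's at conventional significance levels.
```

In practice one would compare |t| against the critical value for the Welch-Satterthwaite degrees of freedom rather than the rough threshold of 2 used in the comment.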

  20. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    CERN Document Server

    Apostolakis, J; Bagulya, A; Brown, J M C; Burkhardt, H; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Grichine, V; Guatelli, S; Incerti, S; Ivanchenko, V N; Jacquemier, J; Kadri, O; Maire, M; Pandola, L; Sawkey, D; Toshito, T; Urban, L; Yamashita, T


    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  1. Hydrologic and water quality models: Use, calibration, and validation (United States)

    This paper introduces a special collection of 22 research articles that present and discuss calibration and validation concepts in detail for hydrologic and water quality models by their developers and presents a broad framework for developing the American Society of Agricultural and Biological Engi...

  2. Hydrologic and water quality models: Key calibration and validation topics (United States)

    As a continuation of efforts to provide a common background and platform for accordant development of calibration and validation (C/V) engineering practices, ASABE members worked to determine critical topics related to model C/V, perform a synthesis of the Moriasi et al. (2012) special collection of...

  3. Model validation studies of solar systems, Phase III. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lantz, L.J.; Winn, C.B.


    Results obtained from a validation study of the TRNSYS, SIMSHAC, and SOLCOST solar system simulation and design programs are presented. Also included are comparisons between the FCHART and SOLCOST solar system design programs and some changes that were made to the SOLCOST program. Finally, results obtained from the analysis of several solar radiation models are presented. Separate abstracts were prepared for ten papers.

  4. Development and validation of stem volume models for Pinus kesiya ...

    African Journals Online (AJOL)

    80% of the data set) and validation (20% of the data set). The performance of the different models was evaluated using evaluation statistics: fit index (FI), root mean square error (RMSE), bias (E), absolute mean difference (AMD) and coefficient of ...
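The evaluation statistics named in this entry are straightforward to compute; below is a minimal sketch with invented observed and predicted stem volumes (the variable names and numbers are ours, not the paper's):

```python
import math

def fit_index(obs, pred):
    """Fit index (FI), analogous to R^2: 1 - SSE/SST."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - p) ** 2 for o, p in zip(obs, pred))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - sse / sst

def rmse(obs, pred):
    """Root mean square error."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def bias(obs, pred):
    """Mean signed difference (E), predicted minus observed."""
    return sum(p - o for o, p in zip(obs, pred)) / len(obs)

def amd(obs, pred):
    """Absolute mean difference."""
    return sum(abs(p - o) for o, p in zip(obs, pred)) / len(obs)

# Hypothetical observed vs. predicted stem volumes (m^3) for a 20% hold-out.
obs  = [0.12, 0.30, 0.45, 0.62, 0.80]
pred = [0.10, 0.33, 0.44, 0.65, 0.78]
```

A near-zero bias with a small RMSE and an FI close to 1 on the validation subset is what such a study would report as good performance.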

  5. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton


    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of

  6. Validation of Models : Statistical Techniques and Data Availability

    NARCIS (Netherlands)

    Kleijnen, J.P.C.


    This paper shows which statistical techniques can be used to validate simulation models, depending on which real-life data are available. Concerning this availability three situations are distinguished (i) no data, (ii) only output data, and (iii) both input and output data. In case (i) - no real

  7. Data Set for Empirical Validation of Double Skin Facade Model

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per


    During recent years, attention to the double skin facade (DSF) concept has greatly increased. Nevertheless, the application of the concept depends on whether a reliable model for simulating DSF performance can be developed or pointed out. This is, however, not possible to do, until...... the model is empirically validated and its limitations for DSF modeling are identified. Correspondingly, the existence and availability of experimental data is essential. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within...... of a double skin facade: 1. External air curtain mode: the naturally ventilated DSF cavity with the top and bottom openings open to the outdoor; 2. Thermal insulation mode: all of the DSF openings closed; 3. Preheating mode: the bottom DSF openings open to the outdoor and the top openings open...

  8. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction...... models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland shows a clear layering. The observed layers from the radar data can be used as an in-situ validation...... correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from...

  9. Development of a Conservative Model Validation Approach for Reliable Analysis (United States)



  10. Validation of Fatigue Modeling Predictions in Aviation Operations (United States)

    Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin


    Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet most models have not been rigorously evaluated and independently validated for the operations to which they are being applied, and many users are not fully aware of the limitations within which model results should be interpreted and applied.

  11. Validation of models with constant bias: an applied approach

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta


    Objective. This paper presents extensions to the statistical validation method based on the procedure of Freese for the case in which a model shows constant bias (CB) in its predictions, and illustrates the method with data from a new mechanistic model that predicts weight gain in cattle. Materials and methods. The extensions are hypothesis tests and the maximum anticipated error for the alternative approach, and a confidence interval for a quantile of the error distribution. Results. The model evaluated showed CB; once the CB is removed, with a confidence level of 95%, the magnitude of the error does not exceed 0.575 kg. The validated model can therefore be used to predict the daily weight gain of cattle, although its structure will require adjustment for the presence of CB to increase the accuracy of its forecasts. Conclusions. The confidence interval for the 1-α quantile of the error distribution after correcting the constant bias sets an upper limit on the magnitude of the prediction error, which can be used to evaluate the model's forecasting performance over time. The confidence-interval approach to validating a model is more informative than hypothesis tests for the same purpose.
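The bias-correction idea can be illustrated with a toy calculation. All numbers below are invented, and the normal-theory bound at the end is a deliberate simplification for illustration, not Freese's exact procedure:

```python
import statistics

# Hypothetical observed vs. predicted daily weight gains (kg).
observed  = [0.82, 0.91, 1.05, 0.77, 0.98, 1.10, 0.88, 0.95]
predicted = [0.95, 1.02, 1.18, 0.90, 1.08, 1.25, 1.00, 1.07]

# Prediction errors; their mean estimates the constant bias (CB).
errors = [p - o for p, o in zip(predicted, observed)]
cb = statistics.mean(errors)

# Errors after removing the constant bias.
corrected = [e - cb for e in errors]

# Simplified normal-theory 95% bound on the magnitude of the
# bias-corrected error (illustrative stand-in for the paper's
# confidence interval on an error quantile).
s = statistics.stdev(corrected)
bound = 1.96 * s
```

The point of the sketch is the structure of the argument: the systematic part of the error is absorbed into `cb`, and the residual scatter determines the maximum error one should anticipate after correction.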

  12. HELOKA-HP thermal-hydraulic model validation and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Xue Zhou; Ghidersa, Bradut-Eugen; Badea, Aurelian Florin


    Highlights: • The electrical heater in HELOKA-HP has been modeled with RELAP5-3D using experimental data as input. • The model has been validated using novel techniques for assimilating experimental data and the representative model parameters with BEST-EST. • The methodology is successfully used for reducing the model uncertainties and provides a quantitative measure of the consistency between the experimental data and the model. - Abstract: The Helium Loop Karlsruhe High Pressure (HELOKA-HP) is an experimental facility for testing various helium-cooled components at high temperature (500 °C) and high pressure (8 MPa) for nuclear fusion applications. For modeling the loop thermal dynamics, a thermal-hydraulic model has been created using the system code RELAP5-3D. Recently, new experimental data covering the behavior of the loop components under relevant operational conditions have become available, making it possible to validate and calibrate the existing models in order to reduce the uncertainties of the simulated responses. This paper presents an example in which this process has been applied to the HELOKA electrical heater model. Using novel techniques for assimilating experimental data, implemented in the computational module BEST-EST, the representative parameters of the model have been calibrated.

  13. FDA 2011 process validation guidance: lifecycle compliance model. (United States)

    Campbell, Cliff


    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  14. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns.

    Directory of Open Access Journals (Sweden)

    Guillaume Chérel

    Models of emergent phenomena are designed to explain global-scale phenomena in terms of local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must not only be based on corroboration, but also on attempting to falsify the model, i.e., making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city system dynamics to explore the model's predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights on potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic.

  15. Experimental validation of mathematical model for small air compressor

    Directory of Open Access Journals (Sweden)

    Tuhovčák Ján


    The development of reciprocating compressors can be simplified by using simulation tools. Modelling a compressor requires a trade-off between computational effort and the accuracy of the desired results. This paper presents an experimental validation of a simulation tool that can be used to predict compressor behaviour under different working conditions. The mathematical model provides fast results with very good accuracy; however, it must be calibrated for a given type of compressor. A small air compressor was used to validate the in-house simulation tool, which is based on mass and energy conservation in a control volume. The simulation tool calculates the pressure and temperature history inside the cylinder, valve characteristics, mass flow and heat losses during the compressor cycle. The test bench for the compressor consisted of pressure sensors on both the discharge and suction sides, a temperature sensor on the discharge side and a calorimetric-principle flow meter.
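A control-volume balance of the kind such a tool is built on can be sketched in a few lines. The ideal-gas assumption, the fixed cylinder volume (no piston motion, so no work term) and every number below are illustrative simplifications, not the paper's model:

```python
# Explicit-Euler integration of mass and energy balances for a cylinder
# control volume filling through an open suction valve (toy example).
R, cv = 287.0, 718.0            # air gas constant and cv, J/(kg K)
m, T = 1e-4, 300.0              # mass (kg) and temperature (K) in cylinder
V = 2e-5                        # cylinder volume, m^3 (held fixed here)
dt = 1e-6                       # time step, s

mdot_in = 5e-4                  # inflow rate, kg/s
h_in = 1005.0 * 320.0           # inflow specific enthalpy (cp * T_in), J/kg
q_loss = 0.5                    # heat loss rate through the wall, W

for _ in range(1000):
    U = m * cv * T              # internal energy of the gas in the volume
    m += mdot_in * dt           # mass balance
    U += (mdot_in * h_in - q_loss) * dt  # energy balance (no piston work)
    T = U / (m * cv)            # recover temperature from internal energy

p = m * R * T / V               # ideal-gas pressure from the updated state
```

A full compressor model would add the piston-work term from the moving boundary, valve flow models on both sides, and a wall heat-transfer correlation instead of a constant loss.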

  16. Validation of a parametric finite element human femur model. (United States)

    Klein, Katelyn F; Hu, Jingwen; Reed, Matthew P; Schneider, Lawrence W; Rupp, Jonathan D


    Finite element (FE) models with geometry and material properties that are parametric with subject descriptors, such as age and body shape/size, are being developed to incorporate population variability into crash simulations. However, the validation methods currently being used with these parametric models do not assess whether model predictions are reasonable in the space over which the model is intended to be used. This study presents a parametric model of the femur and applies a unique validation paradigm to this parametric femur model that characterizes whether model predictions reproduce experimentally observed trends. FE models of male and female femurs with geometries that are parametric with age, femur length, and body mass index (BMI) were developed based on existing statistical models that predict femur geometry. These parametric FE femur models were validated by comparing responses from combined loading tests of femoral shafts to simulation results from FE models of the corresponding femoral shafts whose geometry was predicted using the associated age, femur length, and BMI. The effects of subject variables on model responses were also compared with trends in the experimental data set by fitting similarly parameterized statistical models to both the results of the experimental data and the corresponding FE model results and then comparing fitted model coefficients for the experimental and predicted data sets. The average error in impact force at experimental failure for the parametric models was 5%. The coefficients of a statistical model fit to simulation data were within one standard error of the coefficients of a similarly parameterized model of the experimental data except for the age parameter, likely because material properties used in simulations were not varied with specimen age. 
In simulations to explore the effects of femur length, BMI, and age on impact response, only BMI significantly affected response for both men and women, with increasing

  17. Predicting the ungauged basin: model validation and realism assessment (United States)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert


    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). 
Predicting the ungauged basin: model validation and realism assessment

  18. Validating a Social Model Wargame: An Analysis of the Green Country Model (United States)


    validating them. One recent effort at validation comes from Marlin (2009), who used the Peace Support Operations Model (PSOM) as a starting point...

  19. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))


    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, taking advantage of recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and those from the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available, then a mixed hardening model should be used.

  20. Validation of Advanced EM Models for UXO Discrimination

    CERN Document Server

    Weichman, Peter B


    The work reported here details basic validation of our advanced physics-based EMI forward and inverse models against data collected by the NRL TEMTADS system. The data was collected under laboratory-type conditions using both artificial spheroidal targets and real UXO. The artificial target models are essentially exact, and enable detailed comparison of theory and data in support of measurement platform characterization and target identification. Real UXO targets cannot be treated exactly, but it is demonstrated that quantitative comparisons of the data with the spheroid models nevertheless aids in extracting key target discrimination information, such as target geometry and hollow target shell thickness.

  1. Verifying and Validating Proposed Models for FSW Process Optimization (United States)

    Schneider, Judith


    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  2. Dynamic Modeling of Wind Turbine Gearboxes and Experimental Validation

    DEFF Research Database (Denmark)

    Pedersen, Rune

    is presented. The model takes into account the effects of load and applied grinding corrections. The results are verified by comparing to simulated and experimental results reported in the existing literature. Using gear data loosely based on a 1 MW wind turbine gearbox, the gear mesh stiffness is expanded...... analysis in relation to gear dynamics. A multibody model of two complete 2.3 MW wind turbine gearboxes mounted back-to-back in a test rig is built. The mean values of the proposed gear mesh stiffnesses are included. The model is validated by comparing with calculated and measured eigenfrequencies and mode...

  3. Validation of recent geopotential models in Tierra Del Fuego (United States)

    Gomez, Maria Eugenia; Perdomo, Raul; Del Cogliano, Daniel


    This work presents a validation study of global geopotential models (GGM) in the region of Fagnano Lake, located in the southern Andes. This is an excellent area for this type of validation because it is surrounded by the Andes Mountains, and there is no terrestrial gravity or GNSS/levelling data. However, there are mean lake level (MLL) observations, and its surface is assumed to be almost equipotential. Furthermore, in this article, we propose improved geoid solutions through the Residual Terrain Modelling (RTM) approach. Using a global geopotential model, the results achieved allow us to conclude that it is possible to use this technique to extend an existing geoid model to those regions that lack any information (neither gravimetric nor GNSS/levelling observations). As GGMs have evolved, our results have improved progressively. While the validation of EGM2008 with MLL data shows a standard deviation of 35 cm, GOCO05C shows a deviation of 13 cm, similar to the results obtained on land.
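The reported standard deviations (35 cm for EGM2008, 13 cm for GOCO05C) summarize the residuals between model geoid heights and the quasi-equipotential lake surface. A minimal sketch of that statistic, using hypothetical residual values rather than the paper's data:

```python
import math

def residual_std(model_heights, observed_heights):
    """Sample standard deviation of geoid-model minus mean-lake-level
    residuals, the statistic used above to rank GGM solutions."""
    r = [m - o for m, o in zip(model_heights, observed_heights)]
    mean = sum(r) / len(r)
    return math.sqrt(sum((x - mean) ** 2 for x in r) / (len(r) - 1))
```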

  4. Finite Element Model and Validation of Nasal Tip Deformation. (United States)

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian J F


    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach to understanding the mechanics and nuances of nasal tip support and, eventually, nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded state. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow.
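The surface comparison metric used here, the Hausdorff distance between measured and simulated point clouds, can be sketched as follows (a brute-force nearest-neighbour version; real photogrammetry clouds would use a spatial index):

```python
def hausdorff_distances(A, B):
    """Directed nearest-neighbour distances from each 3-D point in cloud A
    to cloud B; their mean and max summarize surface agreement, as in the
    comparison of scanned vs simulated nose surfaces above."""
    def nearest(p, cloud):
        return min(((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 + (p[2] - q[2]) ** 2) ** 0.5
                   for q in cloud)
    d = [nearest(p, B) for p in A]
    return sum(d) / len(d), max(d)  # (average distance, directed Hausdorff)
```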

  5. A community diagnostic tool for chemistry climate model validation

    Directory of Open Access Journals (Sweden)

    A. Gettelman


    This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The initial construction and application is to coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifies model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimum coding.

  6. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per


    During recent years application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of the DSF performance and this in turn requires access to high quality experimental data. Three sets of accurate...... empirical data for validation of DSF modeling with building simulation software were produced within the International Energy Agency (IEA) SHCTask 34 / ECBCS Annex 43. This paper describes the full-scale outdoor experimental test facility, the experimental set-up and the measurements procedure....... The empirical data is composed for key-operating modes, i.e. external air curtain mode (summer cooling), thermal insulation mode (all openings are closed) and air pre-heating mode (heating season) and consist of boundary conditions and DSF performance parameters. The DSF performance parameters discussed...

  7. Validation of the Osteopenia Sheep Model for Orthopaedic Biomaterial Research

    DEFF Research Database (Denmark)

    Ding, Ming; Danielsen, C.C.; Cheng, L.


    Validation of the Osteopenia Sheep Model for Orthopaedic Biomaterial Research +1Ding, M; 2Danielsen, CC; 1Cheng, L; 3Bollen, P; 4Schwarz, P; 1Overgaard, S +1Dept of Orthopaedics O, Odense University Hospital, Denmark, 2Dept of Connective Tissue Biology, University of Aarhus, Denmark, 3Biomedicine...... Lab, University of Southern Denmark, 4Dept of Geriatrics, Glostrup University Hospital, Denmark   Introduction: Currently, the majority of orthopaedic prosthesis and biomaterial research has been based on investigations in normal animals. In most clinical situations, most...... resemble osteoporosis in humans. This study aimed to validate a glucocorticoid-induced osteopenia sheep model for orthopaedic implant and biomaterial research. We hypothesized that a 7-month GC treatment together with a restricted diet but without OVX would induce osteopenia. Materials and Methods: Eighteen......

  8. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian


    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases...... across the boundary layer near the trailing edge and to a lesser extent by a smaller boundary layer displacement thickness. ©2010 American Society of Mechanical Engineers...

  9. Iowa Model of Evidence-Based Practice: Revisions and Validation. (United States)

    Buckwalter, Kathleen C; Cullen, Laura; Hanrahan, Kirsten; Kleiber, Charmaine; McCarthy, Ann Marie; Rakel, Barbara; Steelman, Victoria; Tripp-Reimer, Toni; Tucker, Sharon


    The Iowa Model is a widely used framework for the implementation of evidence-based practice (EBP). Changes in health care (e.g., emergence of implementation science, emphasis on patient engagement) prompted the re-evaluation, revision, and validation of the model. A systematic multi-step process was used capturing information from the literature and user feedback via an electronic survey and live work groups. The Iowa Model Collaborative critically assessed and synthesized information and recommendations before revising the model. Survey participants (n = 431) had requested access to the Model between years 2001 and 2013. Eighty-eight percent (n = 379) of participants reported using the Iowa Model and identified the most problematic steps as: topic priority, critique, pilot, and institute change. Users provided 587 comments with rich contextual rationale and insightful suggestions. The revised model was then evaluated by participants (n = 299) of the 22nd National EBP Conference in 2015. They validated the model as a practical tool for the EBP process across diverse settings. Specific changes in the model are discussed. This user driven revision differs from other frameworks in that it links practice changes within the system. Major model changes are expansion of piloting, implementation, patient engagement, and sustaining change. The Iowa Model-Revised remains an application-oriented guide for the EBP process. Intended users are point of care clinicians who ask questions and seek a systematic, EBP approach to promote excellence in health care. © 2017 University of Iowa Hospitals and Clinics, Worldviews on Evidence-Based Nursing © 2017 Sigma Theta Tau International.

  10. Organic acid modeling and model validation: Workshop summary

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.J.; Eilers, J.M.


    A workshop was held in Corvallis, Oregon on April 9--10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally-recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically-selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  11. Organic acid modeling and model validation: Workshop summary. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.J.; Eilers, J.M.


    A workshop was held in Corvallis, Oregon on April 9--10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally-recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically-selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  12. A validated approach for modeling collapse of steel structures (United States)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during their lifetime, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state of the art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are
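The thesis's calibrated fracture models are not given in this abstract; a common semi-empirical form with the triaxiality dependence described above is a Johnson-Cook-type fracture locus. The sketch below uses illustrative coefficients, not calibrated values from the work:

```python
import math

def fracture_strain(triaxiality, d1=0.05, d2=3.44, d3=-2.12):
    """Johnson-Cook-type fracture locus: equivalent plastic strain at
    fracture as a decreasing function of stress triaxiality.
    Coefficients d1..d3 are illustrative placeholders."""
    return d1 + d2 * math.exp(d3 * triaxiality)

def should_delete(eq_plastic_strain, triaxiality):
    """Flag a finite element for deletion once its accumulated equivalent
    plastic strain exceeds the triaxiality-dependent fracture strain
    (softening between initiation and deletion is omitted here)."""
    return eq_plastic_strain >= fracture_strain(triaxiality)
```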

  13. Parameterization and validation of an ungulate-pasture model. (United States)

    Pekkarinen, Antti-Juhani; Kumpula, Jouko; Tahvonen, Olli


    Ungulate grazing and trampling strongly affect pastures and ecosystems throughout the world. Ecological population models are used for studying these systems and determining the guidelines for sustainable and economically viable management. However, the effect of trampling and other resource wastage is either not taken into account or not quantified with data in earlier models. Also, the ability of models to describe the herbivore impact on pastures is usually not validated. We used a detailed model and data to study the level of winter- and summertime lichen wastage by reindeer and the effects of wastage on population sizes and management. We also validated the model with respect to its ability to predict changes in lichen biomass, and compared the actual management in herding districts with model results. The modeling efficiency value (0.75) and visual comparison between the model predictions and data showed that the model was able to describe the changes in lichen pastures caused by reindeer grazing and trampling. At the current lichen biomass levels in northernmost Finland, the lichen wastage varied from 0 to 1 times the lichen intake during winter and from 6 to 10 times the intake during summer. With a higher value for wastage, reindeer numbers and net revenues were lower in the economically optimal solutions. Higher wastage also favored the use of supplementary feeding in the optimal steady state. Actual reindeer numbers in the districts were higher than in the optimal steady-state solutions for the model in 18 herding districts out of 20. Synthesis and applications: We show that a complex model can be used for analyzing ungulate-pasture dynamics and sustainable management if the model is parameterized and validated for the system. Wastage levels caused by trampling and other causes should be quantified with data, as they strongly affect the results and management recommendations. Summertime lichen wastage caused by reindeer is higher than expected, which
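The modeling efficiency value of 0.75 cited above is the standard Nash-Sutcliffe statistic, which compares the prediction error against the variance of the observations:

```python
def modeling_efficiency(observed, predicted):
    """Nash-Sutcliffe modeling efficiency: 1 - SSE/SST. A value of 1 is a
    perfect fit; 0 means the model is no better than the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst
```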

  14. Evaluation model and experimental validation of tritium in agricultural plant

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hee Suk; Keum, Dong Kwon; Lee, Han Soo; Jun, In; Choi, Yong Ho; Lee, Chang Woo [KAERI, Daejon (Korea, Republic of)


    This paper describes a compartment dynamic model for evaluating the contamination level of tritium in agricultural plants exposed to accidentally released tritium. The present model uses a time-dependent growth equation for the plant so that it can account for the growth stage of the plant during the exposure time. The model, which includes atmosphere, soil and plant compartments, is described by a set of nonlinear ordinary differential equations and is able to predict time-dependent concentrations of tritium in the compartments. To validate the model, a series of exposure experiments of HTO vapor on Chinese cabbage and radish was carried out at different growth stages of each plant. At the end of exposure, the tissue free water tritium (TFWT) and the organically bound tritium (OBT) were measured. The measured concentrations agreed well with model predictions.
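The KAERI model's actual equations and rate constants are not reproduced in this abstract; the compartment structure it describes (air HTO to tissue free water tritium to organically bound tritium) can be sketched as coupled ordinary differential equations integrated with forward Euler, using illustrative rate constants:

```python
def simulate_tritium(duration_h, dt=0.01, k_dep=0.3, k_loss=0.5, k_obt=0.02):
    """Minimal three-compartment sketch: constant air HTO drives uptake
    into tissue free water tritium (TFWT), which is lost to the air and
    slowly converted to organically bound tritium (OBT). Rate constants
    (1/h) are illustrative placeholders, not the paper's values."""
    air, tfwt, obt = 1.0, 0.0, 0.0   # normalized activities
    for _ in range(int(duration_h / dt)):
        d_tfwt = k_dep * air - (k_loss + k_obt) * tfwt  # uptake minus losses
        d_obt = k_obt * tfwt                            # slow OBT formation
        tfwt += dt * d_tfwt
        obt += dt * d_obt
    return tfwt, obt
```

Longer exposures accumulate proportionally more OBT while TFWT saturates, which is why the growth stage at exposure matters in the full model.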

  15. Methods for Geometric Data Validation of 3d City Models (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.


    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point is to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closeness of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
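Two of the polygon-level checks described above, ring closure and planarity, can be sketched as follows (the tolerance values are placeholders, not CityDoctor's actual settings):

```python
def ring_is_closed(ring, tol=1e-9):
    """Polygon-level check: the bounding linear ring of 3-D vertices must
    end at its start point, within tolerance."""
    return all(abs(a - b) <= tol for a, b in zip(ring[0], ring[-1]))

def ring_is_planar(ring, tol=1e-6):
    """Planarity check: every vertex must lie within `tol` of the plane
    spanned by the first three vertices (assumed non-collinear)."""
    (x0, y0, z0), p1, p2 = ring[0], ring[1], ring[2]
    u = (p1[0] - x0, p1[1] - y0, p1[2] - z0)
    v = (p2[0] - x0, p2[1] - y0, p2[2] - z0)
    # Plane normal n = u x v
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    norm = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
    return all(abs((p[0] - x0) * n[0] + (p[1] - y0) * n[1] + (p[2] - z0) * n[2]) / norm <= tol
               for p in ring)
```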

  16. Validation of the WATEQ4 geochemical model for uranium

    Energy Technology Data Exchange (ETDEWEB)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.


    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO2(OH)2·H2O), UO2(OH)2, and rutherfordine (UO2CO3) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.
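Speciation-solubility codes such as WATEQ4 identify solubility controls by comparing the ion activity product of each candidate solid against its solubility product. The saturation index that encodes this comparison is easily sketched (the numbers below are hypothetical, not WATEQ4 data-base values):

```python
import math

def saturation_index(ion_activity_product, log_ksp):
    """SI = log10(IAP) - log10(Ksp). SI near 0 means the water is in
    equilibrium with the solid, the criterion used to identify a
    solubility control such as schoepite; SI > 0 means supersaturation."""
    return math.log10(ion_activity_product) - log_ksp

# Hypothetical example: IAP computed from a speciated water analysis
si_equilibrium = saturation_index(1e-5, -5.0)
si_supersaturated = saturation_index(1e-3, -5.0)
```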

  17. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben


    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because a complete system knowledge from input voltage to output sound pressure level is required....... There are however many advantages that could be harvested from such knowledge like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures...

  18. Validation of a Business Model for Cultural Heritage Institutions

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA


    The paper proposes a business model for the efficiency optimization of the interaction between all actors involved in the cultural heritage sector, such as galleries, libraries, archives and museums (GLAM). The validation of the business model is the subject of analyses and implementations in a real environment made by different cultural institutions. The implementation of virtual exhibitions on mobile devices is described and analyzed as a key factor for increasing the visibility of cultural heritage. New perspectives on the development of virtual exhibitions for mobile devices are considered. A study on the number of visitors of cultural institutions is carried out and ways to increase the number of visitors are described.

  19. Validation of coupled atmosphere-fire behavior models

    Energy Technology Data Exchange (ETDEWEB)

    Bossert, J.E.; Reisner, J.M.; Linn, R.R.; Winterkamp, J.L. [Los Alamos National Lab., NM (United States); Schaub, R. [Dynamac Corp., Kennedy Space Center, FL (United States); Riggan, P.J. [Forest Service, Riverside, CA (United States)


    Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models require sophisticated methods for assuring their reliability in real world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation from chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

  20. Validating and Verifying Biomathematical Models of Human Fatigue (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin


    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony give rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's level of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately-sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and to circadian phase. We will calculate the sensitivity and specificity of each model prediction under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.
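The planned sensitivity/specificity calculation treats each scheduling window as a binary classification: did the model flag high fatigue, and did a PVT lapse actually occur. A minimal sketch:

```python
def sensitivity_specificity(predicted, actual):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP), where an
    'event' would be e.g. a PVT lapse and a 'prediction' a high-fatigue
    flag from a biomathematical model. Inputs are parallel 0/1 lists."""
    tp = sum(1 for p, a in zip(predicted, actual) if p and a)
    tn = sum(1 for p, a in zip(predicted, actual) if not p and not a)
    fp = sum(1 for p, a in zip(predicted, actual) if p and not a)
    fn = sum(1 for p, a in zip(predicted, actual) if not p and a)
    return tp / (tp + fn), tn / (tn + fp)
```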

  1. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.


    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validation continues to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
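As an illustration of the decomposition idea (using a direct 2-D discrete Fourier transform rather than the Zernike moments discussed in the paper), a full-field map can be reduced to a short vector of descriptor magnitudes:

```python
import cmath

def dft2_descriptors(field, keep=8):
    """Reduce a full-field map (list of rows of floats) to its `keep`
    largest 2-D DFT coefficient magnitudes, computed by direct summation
    (fine for small demonstration grids). Two fields can then be compared
    through ~10^1 descriptors instead of ~10^5 pixels."""
    rows, cols = len(field), len(field[0])
    mags = []
    for u in range(rows):
        for v in range(cols):
            c = sum(field[m][n] * cmath.exp(-2j * cmath.pi * (u * m / rows + v * n / cols))
                    for m in range(rows) for n in range(cols))
            mags.append(abs(c))
    return sorted(mags, reverse=True)[:keep]
```

A statistical comparison between experiment and model then operates on two short descriptor vectors rather than the raw pixel fields.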

  2. Validation of thermal models for a prototypical MEMS thermal actuator.

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary


    This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the
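In the continuum limit of subtask 1, the steady temperature profile of a uniformly Joule-heated microbeam anchored at both ends satisfies k T'' + q = 0, which a short finite-difference solve reproduces. Parameter values below are illustrative, not the milestone's measured geometry or properties:

```python
def beam_temperature_profile(n=21, q_vol=1e12, k=150.0, length=200e-6, T_amb=25.0):
    """Steady 1-D conduction in a Joule-heated microbeam with both ends
    anchored at ambient temperature: solve k*T'' + q_vol = 0 on n nodes
    by Gauss-Seidel sweeps. q_vol is volumetric heating (W/m^3)."""
    dx = length / (n - 1)
    T = [T_amb] * n   # boundary nodes stay at T_amb
    for _ in range(20000):
        for i in range(1, n - 1):
            T[i] = 0.5 * (T[i - 1] + T[i + 1] + q_vol * dx * dx / k)
    return T
```

Because the exact solution is the parabola T_amb + q x (L - x) / (2k), the finite-difference result matches it at the nodes, with peak temperature T_amb + q L^2 / (8k) at midspan; in the milestone's validation the continuum model is further corrected for noncontinuum gas- and solid-phase effects.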

  3. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model (United States)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung


    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency performance of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of this current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a highest efficiency of 7%.
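The efficiency figures above can be put in context with the standard maximum-efficiency expression for a thermoelectric generator: the Carnot factor times a device factor that grows with the figure of merit ZT (the temperatures and ZT below are illustrative, not values from the study):

```python
import math

def teg_max_efficiency(T_hot, T_cold, ZT):
    """Maximum thermoelectric generator efficiency:
    eta = (1 - Tc/Th) * (sqrt(1+ZT) - 1) / (sqrt(1+ZT) + Tc/Th).
    Temperatures in kelvin; ZT is the figure of merit at the mean
    device temperature."""
    carnot = (T_hot - T_cold) / T_hot
    m = math.sqrt(1.0 + ZT)
    return carnot * (m - 1.0) / (m + T_cold / T_hot)
```

Raising the absorber (hot-side) temperature with cover glasses and radiation shields increases both the Carnot factor and the device factor, which is the lever the optimal-design analysis exploits.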


  5. Thermal conductivity of microporous layers: Analytical modeling and experimental validation (United States)

    Andisheh-Tadbir, Mehdi; Kjeang, Erik; Bahrami, Majid


    A new compact relationship is developed for the thermal conductivity of the microporous layer (MPL) used in polymer electrolyte fuel cells as a function of pore size distribution, porosity, and compression pressure. The proposed model is successfully validated against experimental data obtained from a transient plane source thermal constants analyzer. The thermal conductivities of carbon paper samples with and without MPL were measured as a function of load (1-6 bars) and the MPL thermal conductivity was found between 0.13 and 0.17 W m-1 K-1. The proposed analytical model predicts the experimental thermal conductivities within 5%. A correlation generated from the analytical model was used in a multi objective genetic algorithm to predict the pore size distribution and porosity for an MPL with optimized thermal conductivity and mass diffusivity. The results suggest that an optimized MPL, in terms of heat and mass transfer coefficients, has an average pore size of 122 nm and 63% porosity.

  6. Experimental Validation of a Dynamic Model for Lightweight Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Gasparetto


    Nowadays, one of the main topics in robotics research is dynamic performance improvement by lightening the overall system structure. Effective motion and control of these lightweight robotic systems rely on suitable motion planning and control processes. To this end, model-based approaches can be adopted, exploiting accurate dynamic models that take into account the inertial and elastic terms that are usually neglected in a heavy rigid-link configuration. In this paper, an effective method for modelling spatial lightweight industrial robots, based on an Equivalent Rigid Link System approach, is considered from an experimental validation perspective. A dynamic simulator implementing the formulation is used and an experimental test-bench is set up. Experimental tests are carried out with a benchmark L-shape mechanism.

  7. Experimental validation of the multiphase extended Leblond's model (United States)

    Weisz-Patrault, Daniel


    Transformation induced plasticity is a crucial contribution to the simulation of several forming processes involving phase transitions under mechanical loads, resulting in large irreversible strains even though the applied stress is below the yield stress. One of the most elegant and widely used models, based on analytic homogenization procedures, was proposed by Leblond et al. [1-4]. Very recently, a simple extension of Leblond's model was developed by Weisz-Patrault [8]. Several product phases are taken into account and several assumptions are relaxed in order to extend the applicability of the model. The present contribution compares experimental tests with numerical computations in order to assess the validity of the developed theory. Thus, experimental results extracted from the existing literature are analyzed. Results show good agreement between measurements and theoretical computations.

  8. A validation study of a stochastic model of human interaction (United States)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. 
The model explained a high degree
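A model of this form, f(χ) = N ∫ φ(χ,τ) Ψ(τ) dτ, can be evaluated numerically once the kernel φ and the density Ψ are specified. The kernels below are illustrative stand-ins (a Gaussian interaction kernel and a logistic attitude density), not the dissertation's fitted functions:

```python
import numpy as np

def attitude_density(chi, tau_grid, phi, psi, n_agents=1.0):
    """Evaluate f(chi) = N * integral of phi(chi, tau) * Psi(tau) d tau
    by trapezoidal quadrature on a finite grid (the infinite limits are
    truncated where the integrand is negligible)."""
    y = phi(chi, tau_grid) * psi(tau_grid)
    return n_agents * float(np.sum((y[1:] + y[:-1]) * np.diff(tau_grid)) / 2.0)

# Illustrative stand-ins: Gaussian kernel, logistic probability density.
phi = lambda chi, tau: np.exp(-0.5 * (chi - tau) ** 2) / np.sqrt(2 * np.pi)
psi = lambda tau: np.exp(-tau) / (1.0 + np.exp(-tau)) ** 2

tau = np.linspace(-20.0, 20.0, 4001)
f0 = attitude_density(0.0, tau, phi, psi)   # density of attitudes at chi = 0
```

With N set to the group size, sweeping chi over a grid yields the predicted attitudinal distribution that the dissertation compares against the observed Semantic Differential data.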

  9. Modal testing for model validation of structures with discrete nonlinearities. (United States)

    Ewins, D J; Weekes, B; delli Carri, A


    Model validation using data from modal tests is now widely practiced in many industries for advanced structural dynamic design analysis, especially where structural integrity is a primary requirement. These industries tend to demand highly efficient designs for their critical structures which, as a result, are increasingly operating in regimes where traditional linearity assumptions are no longer adequate. In particular, many modern structures are found to contain localized areas, often around joints or boundaries, where the actual mechanical behaviour is far from linear. Such structures need to have appropriate representation of these nonlinear features incorporated into the otherwise largely linear models that are used for design and operation. This paper proposes an approach to this task which is an extension of existing linear techniques, especially in the testing phase, involving only just as much nonlinear analysis as is necessary to construct a model which is good enough, or 'valid': i.e. capable of predicting the nonlinear response behaviour of the structure under all in-service operating and test conditions with a prescribed accuracy. A short-list of methods described in the recent literature categorized using our framework is given, which identifies those areas in which further development is most urgently required. © 2015 The Authors.

  10. Volumetric Intraoperative Brain Deformation Compensation: Model Development and Phantom Validation (United States)

    DeLorenzo, Christine; Papademetris, Xenophon; Staib, Lawrence H.; Vives, Kenneth P.; Spencer, Dennis D.; Duncan, James S.


    During neurosurgery, nonrigid brain deformation may affect the reliability of tissue localization based on preoperative images. To provide accurate surgical guidance in these cases, preoperative images must be updated to reflect the intraoperative brain. This can be accomplished by warping these preoperative images using a biomechanical model. Due to the possible complexity of this deformation, intraoperative information is often required to guide the model solution. In this paper, a linear elastic model of the brain is developed to infer volumetric brain deformation associated with measured intraoperative cortical surface displacement. The developed model relies on known material properties of brain tissue, and does not require further knowledge about intraoperative conditions. To provide an initial estimation of volumetric model accuracy, as well as determine the model’s sensitivity to the specified material parameters and surface displacements, a realistic brain phantom was developed. Phantom results indicate that the linear elastic model significantly reduced localization error due to brain shift, from >16 mm to under 5 mm, on average. In addition, though in vivo quantitative validation is necessary, preliminary application of this approach to images acquired during neocortical epilepsy cases confirms the feasibility of applying the developed model to in vivo data. PMID:22562728

  11. LANL*V2.0: global modeling and validation

    Directory of Open Access Journals (Sweden)

    S. Zaharia


    We describe in this paper the new version of LANL*, an artificial neural network (ANN) for calculating the magnetic drift invariant L*. This quantity is used for modeling radiation belt dynamics and for space weather applications. We have implemented the following enhancements in the new version: (1) we have removed the limitation to geosynchronous orbit, and the model can now be used for a much larger region; (2) the new version is based on the improved magnetic field model by Tsyganenko and Sitnov (2005) (TS05) instead of the older model by Tsyganenko et al. (2003). We have validated the model and compared our results to L* calculations with the TS05 model based on ephemerides for CRRES, Polar, GPS, a LANL geosynchronous satellite, and a virtual RBSP-type orbit. We find that the neural network performs very well for all these orbits, with a typically small error ΔL*. The LANL* V2.0 artificial neural network is orders of magnitude faster than traditional numerical field-line integration techniques with the TS05 model. It has applications to real-time radiation belt forecasting, analysis of data sets involving decades of satellite observations, and other problems in space weather.

  12. A Report on the Validation of Beryllium Strength Models

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Derek Elswick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    This report discusses work on validating beryllium strength models with flyer plate and Taylor rod experimental data. Strength models are calibrated with Hopkinson bar and quasi-static data. The Hopkinson bar data for beryllium provides strain rates up to about 4000 per second. A limitation of the Hopkinson bar data for beryllium is that it only provides information on strain up to about 0.15. The lack of high strain data at high strain rates makes it difficult to distinguish between various strength model settings. The PTW model has been calibrated many different times over the last 12 years. The lack of high strain data for high strain rates has resulted in these calibrated PTW models for beryllium exhibiting significantly different behavior when extrapolated to high strain. For beryllium, the α parameter of PTW has recently been calibrated to high precision shear modulus data. In the past the α value for beryllium was set based on expert judgment. The new α value for beryllium was used in a calibration of the beryllium PTW model by Sky Sjue. The calibration by Sjue used EOS table information to model the temperature dependence of the heat capacity. Also, the calibration by Sjue used EOS table information to model the density changes of the beryllium sample during the Hopkinson bar and quasi-static experiments. In this paper, the calibrated PTW model by Sjue is compared against experimental data and other strength models. The other strength models being considered are a PTW model calibrated by Shuh-Rong Chen and a Steinberg-Guinan type model by John Pedicini. The three strength models are used in a comparison against flyer plate and Taylor rod data. The results show that the Chen PTW model provides better agreement with these data. The Chen PTW model settings have been previously adjusted to provide a better fit to flyer plate data, whereas the Sjue PTW model has not been changed based on flyer plate data. However, the Sjue model provides a reasonable fit to

  13. Validation of Symptom Validity Tests Using a "Child-model" of Adult Cognitive Impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P. E. J.; Schmand, B.


    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children's cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering


  15. Challenges in validating model results for first year ice (United States)

    Melsom, Arne; Eastwood, Steinar; Xie, Jiping; Aaboe, Signe; Bertino, Laurent


    In order to assess the quality of model results for the distribution of first year ice, a comparison with a product based on observations from satellite-borne instruments has been performed. Such a comparison is not straightforward due to the contrasting algorithms that are used in the model product and the remote sensing product. The implementation of the validation is discussed in light of the differences between this set of products, and validation results are presented. The model product is the daily updated 10-day forecast from the Arctic Monitoring and Forecasting Centre in CMEMS. The forecasts are produced with the assimilative ocean prediction system TOPAZ. Presently, observations of sea ice concentration and sea ice drift are introduced in the assimilation step, but data for sea ice thickness and ice age (or roughness) are not included. The model computes the age of the ice by recording and updating the time passed after ice formation as sea ice grows and deteriorates as it is advected inside the model domain. Ice that is younger than 365 days is classified as first year ice. The fraction of first-year ice is recorded as a tracer in each grid cell. The Ocean and Sea Ice Thematic Assembly Centre in CMEMS redistributes a daily product from the EUMETSAT OSI SAF of gridded sea ice conditions which include "ice type", a representation of the separation of regions between those infested by first year ice, and those infested by multi-year ice. The ice type is parameterized based on data for the gradient ratio GR(19,37) from SSMIS observations, and from the ASCAT backscatter parameter. This product also includes information on ambiguity in the processing of the remote sensing data, and the product's confidence level, which have a strong seasonal dependency.
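The age-tracer logic described above (time since ice formation accumulates per grid cell, and ice younger than 365 days is classified as first-year) can be sketched in a few lines. This is a simplified illustration with hypothetical field names; the operational TOPAZ system also advects the tracer with the simulated ice drift:

```python
import numpy as np

def update_ice_age(age_days, concentration, dt_days=1.0):
    """Advance a per-cell sea-ice age tracer: age accumulates where ice
    persists and resets to zero where the cell is ice-free, so newly
    formed ice restarts from age zero."""
    return np.where(concentration > 0.0, age_days + dt_days, 0.0)

def ice_type(age_days, concentration, threshold_days=365.0):
    """Classify cells: 0 = open water, 1 = first-year ice, 2 = multi-year ice."""
    ice = concentration > 0.0
    first_year = ice & (age_days < threshold_days)
    return np.where(first_year, 1, np.where(ice, 2, 0))
```

The validation challenge noted in the text is that this model-side classification is defined by elapsed time, whereas the OSI SAF ice type is inferred from microwave signatures, so the two products do not measure the same quantity directly.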

  16. Nonlinear ultrasound modelling and validation of fatigue damage (United States)

    Fierro, G. P. Malfense; Ciampa, F.; Ginzburg, D.; Onder, E.; Meo, M.


    Nonlinear ultrasound techniques have shown greater sensitivity to microcracks, and they can be used to detect structural damage at its early stages. However, there is still a lack of numerical models available in commercial finite element analysis (FEA) tools that are able to simulate the interaction of elastic waves with the material's nonlinear behaviour. In this study, a nonlinear constitutive material model was developed to predict the structural response under continuous harmonic excitation of a fatigued isotropic sample that showed anharmonic effects. Particularly, by means of Landau's theory and Kelvin tensorial representation, this model provided an understanding of elastic nonlinear phenomena such as second harmonic generation in three-dimensional solid media. The numerical scheme was implemented and evaluated using the commercially available FEA software LS-DYNA, and it showed a good numerical characterisation of the second harmonic amplitude generated by the damaged region known as the nonlinear response area (NRA). Since this process requires only the experimental second-order nonlinear parameter and a rough damage size estimation as input, it does not need any baseline testing with the undamaged structure or any dynamic modelling of the fatigue crack growth. To validate this numerical model, the second-order nonlinear parameter was experimentally evaluated at various points over the fatigue life of an aluminium (AA6082-T6) coupon and the crack propagation was measured using an optical microscope. A good correlation was achieved between the experimental set-up and the nonlinear constitutive model.
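A common way to quantify the second-order nonlinear parameter discussed above is the ratio A2/A1² of the second-harmonic amplitude to the squared fundamental amplitude. A minimal sketch on a synthetic signal (the absolute β also involves the wavenumber and propagation distance, which are omitted here):

```python
import numpy as np

def relative_nonlinearity(signal, fs, f0):
    """Estimate A2 / A1**2 from the fundamental and second-harmonic
    amplitudes of a steady-state response, a common relative proxy for
    the second-order nonlinear parameter."""
    n = len(signal)
    amp = np.abs(np.fft.rfft(signal)) * 2.0 / n   # single-sided amplitudes
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    a1 = amp[np.argmin(np.abs(freqs - f0))]
    a2 = amp[np.argmin(np.abs(freqs - 2.0 * f0))]
    return a2 / a1**2

# Synthetic "fatigued" response: unit fundamental plus a weak second harmonic.
fs, f0 = 100_000.0, 1000.0
t = np.arange(10_000) / fs        # an integer number of excitation cycles
u = np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)
beta_rel = relative_nonlinearity(u, fs, f0)
```

Tracking this ratio over fatigue cycles is what makes the technique sensitive to microcracks well before linear methods detect a change.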

  17. Validation of protein structure models using network similarity score. (United States)

    Ghosh, Sambit; Gadiyaram, Vasundhara; Vishveshwara, Saraswathi


    Accurate structural validation of proteins is of extreme importance in studies like protein structure prediction, analysis of molecular dynamics simulation trajectories, and finding subtle changes in very similar structures. The benchmarks for today's structure validation are scoring methods like global distance test-total structure (GDT-TS), TM-score, and root mean square deviation (RMSD). However, there is a lack of methods that look at both the protein backbone and side-chain structures at the global connectivity level and provide information about the differences in connectivity. To address this gap, a graph spectral based method (NSS, network similarity score), which has been recently developed to rigorously compare networks in diverse fields, is adopted to compare protein structures both at the backbone and at the side-chain noncovalent connectivity levels. In this study, we validate the performance of NSS by investigating protein structures from X-ray structures, modeling (including CASP models), and molecular dynamics simulations. Further, we systematically identify the local and the global regions of the structures contributing to the difference in NSS, through the components of the score, a feature unique to this spectral based scoring scheme. It is demonstrated that the method can quantify subtle differences in connectivity compared to a reference protein structure and can form a robust basis for protein structure comparison. Additionally, we have also introduced a network-based method to analyze fluctuations in side chain interactions (edge-weights) in an ensemble of structures, which can be a useful tool for the analysis of MD trajectories. © 2017 Wiley Periodicals, Inc.
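As a generic illustration of graph-spectral structure comparison (not the exact NSS definition, which has its own components and weighting), two weighted interaction networks can be compared through their Laplacian spectra:

```python
import numpy as np

def spectral_similarity(adj_a, adj_b):
    """Compare two weighted networks of the same size via the cosine
    similarity of their sorted Laplacian eigenvalue vectors; identical
    networks score 1.0."""
    def laplacian_spectrum(adj):
        lap = np.diag(adj.sum(axis=1)) - adj
        return np.sort(np.linalg.eigvalsh(lap))
    sa, sb = laplacian_spectrum(adj_a), laplacian_spectrum(adj_b)
    return float(sa @ sb / (np.linalg.norm(sa) * np.linalg.norm(sb)))
```

In a protein setting the adjacency matrix would encode noncovalent contact weights between residues, so a small change in side-chain packing perturbs the edge weights and lowers the score slightly, which is the kind of subtle connectivity difference the abstract describes.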

  18. Non-Linear Slosh Damping Model Development and Validation (United States)

    Yang, H. Q.; West, Jeff


    Propellant tank slosh dynamics are typically represented by a mechanical model of a spring-mass-damper. This mechanical model is then included in the equation of motion of the entire vehicle for Guidance, Navigation and Control (GN&C) analysis. For a partially-filled smooth wall propellant tank, the critical damping based on classical empirical correlation is as low as 0.05%. Due to this low value of damping, propellant slosh is a potential source of disturbance critical to the stability of launch and space vehicles. It is postulated that the commonly quoted slosh damping is valid only in the linear regime, where the slosh amplitude is small. With increasing slosh amplitude, the damping value should also increase. If this nonlinearity can be verified and validated, the slosh stability margin can be significantly improved, and the level of conservatism maintained in the GN&C analysis can be lessened. The purpose of this study is to explore and quantify the dependence of slosh damping on slosh amplitude. Accurately predicting the extremely low damping value of a smooth wall tank is very challenging for any Computational Fluid Dynamics (CFD) tool. One must resolve thin boundary layers near the wall and limit numerical damping to a minimum. This computational study demonstrates that with proper grid resolution, CFD can indeed accurately predict the low damping physics of smooth walls in the linear regime. Comparisons of extracted damping values with experimental data for different tank sizes show very good agreement. Numerical simulations confirm that slosh damping is indeed a function of slosh amplitude. When slosh amplitude is low, the damping ratio is essentially constant, which is consistent with the empirical correlation. Once the amplitude reaches a critical value, the damping ratio becomes a linearly increasing function of the slosh amplitude. A follow-on experiment validated the developed nonlinear damping relationship. 
This discovery can
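The amplitude dependence described above (a constant damping ratio below a critical slosh amplitude, then linear growth with amplitude) can be written as a simple piecewise model. All numerical values here are illustrative placeholders, not the study's fitted coefficients:

```python
def slosh_damping_ratio(amplitude, zeta0=0.0005, a_crit=0.1, slope=0.004):
    """Amplitude-dependent slosh damping ratio: constant at the low
    linear-regime value zeta0 below the critical amplitude a_crit,
    then increasing linearly with amplitude above it.

    amplitude and a_crit are nondimensional slosh amplitudes; zeta0 of
    0.0005 mirrors the ~0.05% critical damping quoted in the text."""
    if amplitude <= a_crit:
        return zeta0
    return zeta0 + slope * (amplitude - a_crit)
```

In a GN&C stability analysis, using the amplitude-dependent value above the critical amplitude (instead of the constant linear-regime value everywhere) is what recovers the extra stability margin the study describes.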

  19. Use of EARLINET climatology for validation of vertical model profiles (United States)

    Mortier, Augustin; Schulz, Michael


    For over a decade, intensive in-situ, ground-based and spaceborne remote observations have been dedicated to aerosols, a major component of the Earth's atmosphere. These observations are mostly motivated by the high variability of the particles in space and time and by their effect on climate at the global scale and on air quality at the regional scale. In the meantime, global and regional models provide aerosol concentrations (as projections, reanalyses, or in near real time in chemical weather forecasting) for the calculation of radiative effects and the assessment of air quality. The vertical distribution of aerosol is a key parameter since it affects aerosol lifetime and reflects physical processes such as wet and dry deposition or chemical reactions. Aerosols present in the low levels of the troposphere directly affect local air quality, while elevated aerosol layers can be transported over long ranges and contribute to pollution in remote regions. The evaluation of aerosol columns and simulated vertical profiles is thus of particular interest for the performance characterisation of air quality models. The Copernicus Atmosphere Monitoring Service (CAMS) delivers daily near-real-time aerosol products over Europe. In the framework of producing a regional a posteriori validation of the CAMS models, we propose, through this study, a validation exercise for the vertical aerosol profiles. This relies on the ACTRIS European Aerosol Research Lidar Network (EARLINET) measurements because of their quality and the opportunity to derive a climatology from long-term measurements. PM10 profiles are given by the models, while mostly backscatter profiles are available from the EARLINET database. After studying the representativeness of the EARLINET data (2006-2014), we present a comparison with the modeled vertical profiles (7 models and the Ensemble) at the locations of the measurement stations for the different seasons of the year 2016. The challenge of comparing the measured

  20. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data

    NARCIS (Netherlands)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L.; Al, Maiwenn J.

    Background: The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective

  1. FDA Benchmark Medical Device Flow Models for CFD Validation. (United States)

    Malinauskas, Richard A; Hariharan, Prasanna; Day, Steven W; Herbertson, Luke H; Buesen, Martin; Steinseifer, Ulrich; Aycock, Kenneth I; Good, Bryan C; Deutsch, Steven; Manning, Keefe B; Craven, Brent A

    Computational fluid dynamics (CFD) is increasingly being used to develop blood-contacting medical devices. However, the lack of standardized methods for validating CFD simulations and blood damage predictions limits its use in the safety evaluation of devices. Through a U.S. Food and Drug Administration (FDA) initiative, two benchmark models of typical device flow geometries (nozzle and centrifugal blood pump) were tested in multiple laboratories to provide experimental velocities, pressures, and hemolysis data to support CFD validation. In addition, computational simulations were performed by more than 20 independent groups to assess current CFD techniques. The primary goal of this article is to summarize the FDA initiative and to report recent findings from the benchmark blood pump model study. Discrepancies between CFD predicted velocities and those measured using particle image velocimetry most often occurred in regions of flow separation (e.g., downstream of the nozzle throat, and in the pump exit diffuser). For the six pump test conditions, 57% of the CFD predictions of pressure head were within one standard deviation of the mean measured values. Notably, only 37% of all CFD submissions contained hemolysis predictions. This project aided in the development of an FDA Guidance Document on factors to consider when reporting computational studies in medical device regulatory submissions. There is an accompanying podcast available for this article. Please visit the journal's Web site ( to listen.
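The pressure-head scoring reported above (the fraction of CFD predictions falling within one standard deviation of the mean measured value) is straightforward to compute; a minimal sketch with made-up numbers, not the study's data:

```python
def fraction_within_one_sd(predictions, means, sds):
    """Fraction of predictions falling within one standard deviation of
    the corresponding mean measured value (one entry per test condition)."""
    hits = sum(1 for p, m, s in zip(predictions, means, sds) if abs(p - m) <= s)
    return hits / len(predictions)

# Hypothetical pressure-head comparison for three pump test conditions.
score = fraction_within_one_sd([1.00, 2.50, 3.00],   # CFD predictions
                               [1.10, 2.00, 3.00],   # mean measured values
                               [0.20, 0.30, 0.10])   # measurement std devs
```

Applied across the six pump conditions and all submissions, this is the metric behind the 57% figure quoted in the text.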

  2. Ultrasonic transducers for cure monitoring: design, modelling and validation (United States)

    Lionetto, Francesca; Montagna, Francesco; Maffezzoli, Alfonso


    The finite element method (FEM) has been applied to simulate the ultrasonic wave propagation in a multilayered transducer, expressly designed for high-frequency dynamic mechanical analysis of polymers. The FEM model includes an electro-acoustic transmission line (the active element) and several acoustic transmission lines (passive elements). The simulation of the acoustic propagation accounts for the interaction between the piezoceramic and the materials in the buffer rod and backing, and for the coupling between the electric and mechanical properties of the piezoelectric material. As a result of the simulations, the geometry and size of the modelled ultrasonic transducer have been optimized and used for the realization of a prototype transducer for cure monitoring. The transducer performance has been validated by measuring the velocity changes during the polymerization of a thermosetting matrix of composite materials.

  3. Validating agent oriented methodology (AOM) for netlogo modelling and simulation (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan


    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are few case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. They demonstrate how AOM is useful for epidemiological and ecological studies, and hence further validate AOM in a qualitative manner.

  4. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory


    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implemented their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. 
Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code
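The header-as-metadata idea above can be sketched as a small parser that pulls key experimental parameters from the top of a data file for automated simulation setup. The `#`-comment, `key = value` format is an assumption for illustration, not HED's actual file format, and the record contents are hypothetical:

```python
def parse_experiment_header(lines):
    """Collect 'key = value' metadata from '#'-prefixed comment lines at
    the top of a data file; parsing stops at the first data row. The
    resulting dict can drive generation of a hydro-code input file."""
    meta = {}
    for line in lines:
        if not line.startswith("#"):
            break                      # header ends at the first data row
        body = line.lstrip("#").strip()
        if "=" in body:
            key, _, value = body.partition("=")
            meta[key.strip()] = value.strip()
    return meta

# A hypothetical shock-initiation record: header lines, then data rows.
records = [
    "# explosive = PBX 9501",
    "# initial_density_g_cc = 1.83",
    "# impact_velocity_km_s = 0.92",
    "0.0  0.00",
    "0.1  0.35",
]
meta = parse_experiment_header(records)
```

With the experimental parameters available programmatically, one script can regenerate simulation inputs for every record in the database, which is the automation the authors argue V&V workflows need.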

  5. Experimental validation of solid rocket motor damping models (United States)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio


    In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second aim of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient—complex Young's modulus—to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired by the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe
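The structural-to-viscous conversion described above is commonly done by matching the energy dissipated per cycle, which for a structural damping coefficient η gives c_eq = ηk/ω, and a modal damping ratio of η/2 when matched at resonance. A minimal sketch of that textbook relation (not the paper's matrix-level condensation method):

```python
def equivalent_viscous_damping(eta, stiffness, omega):
    """Viscous damping constant that dissipates the same energy per cycle
    at circular frequency omega as structural (hysteretic) damping eta:
    c_eq = eta * k / omega."""
    return eta * stiffness / omega

def equivalent_damping_ratio(eta):
    """Matching at resonance gives a modal damping ratio of eta / 2."""
    return eta / 2.0
```

The frequency in the denominator is the crux of the validation problem: a single c_eq is exact only at the frequency where it was matched, so different choices of matching frequency yield the different equivalent viscous damping methods the paper compares.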

  6. External Validity and Model Validity: A Conceptual Approach for Systematic Review Methodology

    Directory of Open Access Journals (Sweden)

    Raheleh Khorsan


    Full Text Available Background. Evidence rankings do not give equal consideration to internal validity (IV), external validity (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative health care (CAM/IHC) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to January 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the quality of EV/MV research that is more sensitive to CAM/IHC research. Conclusion. Improved reporting on EV can help produce information that will guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on equal footing, will better guide clinical decision making.

  7. Case study for model validation : assessing a model for thermal decomposition of polyurethane foam.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Leslie, Ian H. (New Mexico State University, Las Cruces, NM); Hobbs, Michael L.; Rutherford, Brian Milne; Hills, Richard Guy (New Mexico State University, Las Cruces, NM); Pilch, Martin M.


    A case study is reported to document the details of a validation process to assess the accuracy of a mathematical model used to represent experiments involving thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process. The process addresses the following activities. The intended application of the mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, the computer code used to solve them, and the code verification are presented. Experimental data from two activities are used to validate the mathematical models. The first experiment assesses the chemistry model alone and the second assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results for both experimental activities are summarized, and the uncertainty of the model in representing each experimental activity is estimated. The comparison between the experimental data and model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given. Weaknesses in the process are discussed and lessons learned are summarized.

  8. Validation of a Global Hydrodynamic Flood Inundation Model (United States)

    Bates, P. D.; Smith, A.; Sampson, C. C.; Alfieri, L.; Neal, J. C.


    In this work we present first validation results for a hyper-resolution global flood inundation model. We use a true hydrodynamic model (LISFLOOD-FP) to simulate flood inundation at 1 km resolution globally and then use downscaling algorithms to determine flood extent and depth at 90 m spatial resolution. Terrain data are taken from a custom version of the SRTM data set that has been processed specifically for hydrodynamic modelling. Return periods of flood flows along the entire global river network are determined using: (1) empirical relationships between catchment characteristics and index flood magnitude in different hydroclimatic zones derived from global runoff data; and (2) an index flood growth curve, also empirically derived. Bankfull return period flow is then used to set channel width and depth, and flood defence impacts are modelled using empirical relationships between GDP, urbanization and defence standard of protection. The results of these simulations are global flood hazard maps for a number of different return period events from 1 in 5 to 1 in 1000 years. We compare these predictions to flood hazard maps developed by national government agencies in the UK and Germany using similar methods but employing detailed local data, and to observed flood extent at a number of sites including St. Louis, USA, and Bangkok, Thailand. Results show that global flood hazard models can have considerable skill provided careful treatment is given to overcoming errors in the publicly available data used as their input.
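
    The index-flood construction in step (2) scales a catchment's index flood by a dimensionless regional growth factor for the desired return period. A minimal sketch of that idea (the growth-factor table is illustrative, not the paper's empirical curve):

```python
import math

# Hypothetical regional growth factors: return period (yr) -> multiplier
GROWTH = {2: 1.0, 5: 1.3, 10: 1.6, 100: 2.4, 1000: 3.2}

def return_period_flow(index_flood_m3s, T):
    """T-year flow = index flood x regional growth factor, interpolating
    the growth factor linearly in log(T) between tabulated periods."""
    Ts = sorted(GROWTH)
    if T <= Ts[0]:
        return index_flood_m3s * GROWTH[Ts[0]]
    if T >= Ts[-1]:
        return index_flood_m3s * GROWTH[Ts[-1]]
    for lo, hi in zip(Ts, Ts[1:]):
        if lo <= T <= hi:
            w = (math.log(T) - math.log(lo)) / (math.log(hi) - math.log(lo))
            g = GROWTH[lo] + w * (GROWTH[hi] - GROWTH[lo])
            return index_flood_m3s * g

# Example: catchment with a 100 m3/s index flood, 100-year event
q100 = return_period_flow(100.0, 100)
```

    In the paper the index flood itself comes from empirical catchment-characteristic relationships per hydroclimatic zone; here it is simply an input.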

  9. Nonlinear dispersion effects in elastic plates: numerical modelling and validation (United States)

    Kijanka, Piotr; Radecki, Rafal; Packo, Pawel; Staszewski, Wieslaw J.; Uhl, Tadeusz; Leamy, Michael J.


    Nonlinear features of elastic wave propagation have attracted significant attention recently. The particular interest herein relates to complex wave-structure interactions, which provide potential new opportunities for feature discovery and identification in a variety of applications. Due to the significant complexity associated with wave propagation in nonlinear media, numerical modeling and simulations are employed to facilitate the design and development of new measurement, monitoring and characterization systems. However, since very high spatio-temporal accuracy of numerical models is required, it is critical to evaluate their spectral properties and tune discretization parameters for a compromise between accuracy and calculation time. Moreover, nonlinearities in structures give rise to various effects that are not present in linear systems, e.g. wave-wave interactions, higher harmonic generation, synchronism and, as recently reported, shifts in dispersion characteristics. This paper discusses a local computational model based on a new hybrid approach for wave propagation in nonlinear media. The proposed approach combines advantages of the Local Interaction Simulation Approach (LISA) and Cellular Automata for Elastodynamics (CAFE). The methods are investigated in the context of their accuracy for predicting nonlinear wavefields, in particular shifts in dispersion characteristics for finite-amplitude waves and secondary wavefields. The results are validated against Finite Element (FE) calculations for guided waves in a copper plate. Critical modes, i.e. modes determining the accuracy of a model at a given excitation frequency, are identified and guidelines for numerical model parameters are proposed.


    Klisch, Stephen M.; Asanbaeva, Anna; Oungoulian, Sevan R.; Masuda, Koichi; Thonar, Eugene J-MA; Davol, Andrew; Sah, Robert L.


    A cartilage growth mixture (CGM) model is proposed to address limitations of a model used in a previous study. New stress constitutive equations for the solid matrix are derived and collagen (COL) remodeling is incorporated into the CGM model by allowing the intrinsic COL material constants to evolve during growth. An analytical validation protocol based on experimental data from a recent in vitro growth study is developed. Available data included measurements of tissue volume, biochemical composition, and tensile modulus for bovine calf articular cartilage (AC) explants harvested at three depths and incubated for 13 days in 20% FBS and 20% FBS+β-aminopropionitrile. The proposed CGM model can match tissue biochemical content and volume exactly while predicting theoretical values of tensile moduli that do not significantly differ from experimental values. Also, theoretical values of a scalar COL remodeling factor are positively correlated with COL crosslink content, and mass growth functions are positively correlated with cell density. The results suggest that the CGM model may help to guide in vitro growth protocols for AC tissue via the a priori prediction of geometric and biomechanical properties. PMID:18532855

  11. Hydraulic Hybrid Excavator—Mathematical Model Validation and Energy Analysis

    Directory of Open Access Journals (Sweden)

    Paolo Casoli


    Full Text Available Recent demands to reduce pollutant emissions and improve energy efficiency have driven the implementation of hybrid solutions in mobile machinery. This paper presents the results of a numerical and experimental analysis conducted on a hydraulic hybrid excavator (HHE). The machinery under study is a mid-size excavator, whose standard version was modified with the introduction of an energy recovery system (ERS). The proposed ERS layout was designed to recover the potential energy of the boom, using a hydraulic accumulator as a storage device. The recovered energy is utilized through the pilot pump of the machinery, which operates as a motor, thus reducing the torque required from the internal combustion engine (ICE). The analysis reported in this paper validates the HHE model by comparing numerical and experimental data in terms of hydraulic and mechanical variables and fuel consumption. The mathematical model shows its capability to reproduce the realistic operating conditions of the realized prototype, tested in the field. A detailed energy analysis comparison between the standard and the hybrid excavator models was carried out to evaluate the energy flows along the system, showing advantages, weaknesses, and possibilities for further improving the machinery's efficiency. Finally, the fuel consumption estimated by the model and that measured during the experiments are presented to highlight the fuel saving percentages. The HHE model is an important starting point for the development of other energy saving solutions.
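
    The upper bound on what a boom ERS of the kind described above can recover is the potential energy released as the boom is lowered. A back-of-the-envelope sketch (the equivalent mass, stroke, and recovery efficiency are hypothetical, not the prototype's values):

```python
def boom_recoverable_energy(mass_kg, drop_m, eta_recovery=0.6):
    """Potential energy released when the boom load is lowered, and the
    share a hydraulic accumulator might capture.  eta_recovery is a
    hypothetical round-trip efficiency, not a measured property of the
    prototype."""
    g = 9.81  # m/s^2
    e_pot = mass_kg * g * drop_m           # J released by lowering
    return e_pot, eta_recovery * e_pot     # (available, recovered)

# Hypothetical boom equivalent mass of 1500 kg lowered by 2 m
e_pot, e_rec = boom_recoverable_energy(1500.0, 2.0)
```

    In the actual machine the recovered energy is returned through the pilot pump acting as a motor, offsetting ICE torque rather than being reused directly.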

  12. Developing and investigating validity of a knowledge management game simulation model

    NARCIS (Netherlands)

    Tsjernikova, Irina


    The goals of this research project were to develop a game simulation model which supports learning knowledge management in a game environment and to investigate the validity of that model. The validity of the model is approached from two perspectives: educational validity and representational validity.

  13. Validating neural-network refinements of nuclear mass models (United States)

    Utama, R.; Piekarewicz, J.


    Background: Nuclear astrophysics centers on the role of nuclear physics in the cosmos. In particular, nuclear masses at the limits of stability are critical in the development of stellar structure and the origin of the elements. Purpose: We aim to test and validate the predictions of recently refined nuclear mass models against the newly published AME2016 compilation. Methods: The basic paradigm underlying the recently refined nuclear mass models is based on existing state-of-the-art models that are subsequently refined through the training of an artificial neural network. Bayesian inference is used to determine the parameters of the neural network so that statistical uncertainties are provided for all model predictions. Results: We observe a significant improvement in the Bayesian neural network (BNN) predictions relative to the corresponding "bare" models when compared to the nearly 50 new masses reported in the AME2016 compilation. Further, AME2016 estimates for the handful of impactful isotopes in the determination of r-process abundances are found to be in fairly good agreement with our theoretical predictions. Indeed, the BNN-improved Duflo-Zuker model predicts a root-mean-square deviation relative to experiment of σrms ≃ 400 keV. Conclusions: Given the excellent performance of the BNN refinement in confronting the recently published AME2016 compilation, we are confident of its critical role in our quest for mass models of the highest quality. Moreover, as uncertainty quantification is at the core of the BNN approach, the improved mass models are in a unique position to identify those nuclei that will have the strongest impact in resolving some of the outstanding questions in nuclear astrophysics.
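
    The validation metric quoted above, σrms, is the root-mean-square deviation between predicted and measured masses. A minimal sketch of the calculation (the residuals below are illustrative, not AME2016 values):

```python
import math

def rms_deviation_mev(theory, experiment):
    """Root-mean-square deviation between predicted and measured masses
    (same units in, same units out)."""
    residuals = [t - e for t, e in zip(theory, experiment)]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# Illustrative residuals in MeV (not AME2016 data); report sigma in keV
theory = [0.3, -0.5, 0.4, -0.2]
experiment = [0.0, 0.0, 0.0, 0.0]
sigma_rms_kev = 1000.0 * rms_deviation_mev(theory, experiment)
```

    In the BNN approach each prediction also carries a statistical uncertainty, so comparisons can be weighted; the plain σrms above is the headline summary statistic.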

  14. Validation Analysis of the Shoal Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    A. Hassan; J. Chapman


    Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven-step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of groundwater flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of

  15. First approximations in avalanche model validations using seismic information (United States)

    Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty


    Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of input (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated using data from real phenomena, to improve performance and reliability. The avalanche group from the University of Barcelona (RISKNAT-UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, on additional avalanche tracks, to gain insight into and validate numerical avalanche dynamics models in different terrain conditions. In this study, we aim to add the seismic data as an external record of the phenomena, able to validate RAMMS models. The seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data from the phenomena in all atmospheric conditions (e.g. bad weather, low light, or freezing, which make photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position

  16. ExEP yield modeling tool and validation test results (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul


    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4-m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.

  17. Modelling and validation of spectral reflectance for the colon (United States)

    Hidovic-Rowe, Dzena; Claridge, Ela


    The spectral reflectance of the colon is known to be affected by malignant and pre-malignant changes in the tissue. As part of long-term research on the derivation of diagnostically important parameters characterizing colon histology, we have investigated the effects of the normal histological variability on the remitted spectra. This paper presents a detailed optical model of the normal colon comprising mucosa, submucosa and the smooth muscle layer. Each layer is characterized by five variable histological parameters: the volume fraction of blood, the haemoglobin saturation, the size of the scattering particles, including collagen, the volume fraction of the scattering particles and the layer thickness, and three optical parameters: the anisotropy factor, the refractive index of the medium and the refractive index of the scattering particles. The paper specifies the parameter ranges corresponding to normal colon tissue, including some previously unpublished ones. Diffuse reflectance spectra were modelled using the Monte Carlo method. Validation of the model-generated spectra against measured spectra demonstrated that good correspondence was achieved between the two. The analysis of the effect of the individual histological parameters on the behaviour of the spectra has shown that the spectral variability originates mainly from changes in the mucosa. However, the submucosa and the muscle layer must be included in the model as they have a significant constant effect on the spectral reflectance above 600 nm. The nature of variations in the spectra also suggests that it may be possible to carry out model inversion and to recover parameters characterizing the colon from multi-spectral images. A preliminary study, in which the mucosal blood and collagen parameters were modified to reflect histopathological changes associated with colon cancer, has shown that the spectra predicted by our model resemble measured spectral reflectance of adenocarcinomas. 
This suggests that

  18. Validating clustering of molecular dynamics simulations using polymer models

    Directory of Open Access Journals (Sweden)

    Phillips Joshua L


    Full Text Available Abstract Background Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. Results We develop a novel series of models using basic polymer theory that have intuitive, clearly defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. Conclusions We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our

  19. Using of Structural Equation Modeling Techniques in Cognitive Levels Validation

    Directory of Open Access Journals (Sweden)

    Natalija Curkovic


    Full Text Available When constructing knowledge tests, cognitive level is usually one of the dimensions comprising the test specifications, with each item assigned to measure a particular level. Recently used taxonomies of the cognitive levels most often represent some modification of the original Bloom’s taxonomy. There are many concerns in the current literature about the existence of predefined cognitive levels. The aim of this article is to investigate whether structural equation modeling techniques can confirm the existence of different cognitive levels. For the purpose of the research, a Croatian final high-school Mathematics exam was used (N = 9626). Confirmatory factor analysis and structural regression modeling were used to test three different models. Structural equation modeling techniques did not support the existence of different cognitive levels in this case. There is more than one possible explanation for that finding. Some other techniques that take into account nonlinear behaviour of the items, as well as qualitative techniques, might be more useful for the purpose of cognitive levels validation. Furthermore, it seems that cognitive levels were not efficient descriptors of the items, and so improvements are needed in describing the cognitive skills measured by items.

  20. ADOPT: A Historically Validated Light Duty Vehicle Consumer Choice Model

    Energy Technology Data Exchange (ETDEWEB)

    Brooker, A.; Gonder, J.; Lopp, S.; Ward, J.


    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy’s Vehicle Technologies Office. It estimates technology improvement impacts on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit method and the mixed logit method to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range, and usable volume. The average importance of several attributes changes nonlinearly across its range and changes with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market. The Corporate Average Fuel Economy regulations are enforced. The sales feed into the ADOPT stock model, which captures key aspects of summing petroleum use and greenhouse gas emissions. These include the change in vehicle miles traveled by vehicle age, the creation of new model options based on the success of existing vehicles, new vehicle option introduction rate limits, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data. It matches historical data in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use. It manages the inputs, simulation, and results.
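
    The multinomial-logit core mentioned above converts attribute-weighted utilities into choice probabilities (sales shares). A minimal sketch of that mechanism (the utility values are hypothetical, not ADOPT's calibrated attribute weights):

```python
import math

def logit_shares(utilities):
    """Multinomial-logit choice probabilities:
    share_i = exp(V_i) / sum_j exp(V_j).
    The max utility is subtracted first for numerical stability."""
    m = max(utilities)
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical utilities from weighted price/fuel-cost/acceleration
# attributes for three competing vehicle trims
shares = logit_shares([1.2, 0.8, 0.8])
```

    ADOPT additionally draws attribute-importance weights from distributions (mixed logit) to represent consumer heterogeneity; the fixed-utility version above is the simplest case.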

  1. Integrated Process Modeling-A Process Validation Life Cycle Companion. (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph


    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.

  2. Validation of MHD Models using MST RFP Plasmas (United States)

    Jacobson, C. M.; Chapman, B. E.; den Hartog, D. J.; McCollam, K. J.; Sarff, J. S.; Sovinec, C. R.


    Rigorous validation of computational models used in fusion energy sciences over a large parameter space and across multiple magnetic configurations can increase confidence in their ability to predict the performance of future devices. MST is a well-diagnosed reversed-field pinch (RFP) capable of operation with plasma current ranging from 60 kA to 500 kA. The resulting Lundquist number S, a key parameter in resistive magnetohydrodynamics (MHD), ranges from 4 × 10^4 to 8 × 10^6 for standard RFP plasmas and provides substantial overlap with MHD RFP simulations. MST RFP plasmas are simulated using both DEBS, a nonlinear single-fluid visco-resistive MHD code, and NIMROD, a nonlinear extended MHD code, with S ranging from 10^4 to 10^5 for single-fluid runs and the magnetic Prandtl number Pm = 1. Validation metric comparisons are presented, focusing on how normalized magnetic fluctuations b at the edge scale with S. Preliminary results for the dominant n = 6 mode are b ∝ S^(-0.20 ± 0.02) for single-fluid NIMROD, b ∝ S^(-0.25 ± 0.05) for DEBS, and b ∝ S^(-0.20 ± 0.02) for experimental measurements; however, there is a significant discrepancy in mode amplitudes. Preliminary two-fluid NIMROD results are also presented. Work supported by US DOE.
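
    The S-scalings quoted above are power-law fits, and the exponent can be recovered by ordinary least squares on log-transformed data. A minimal sketch (the data below are synthetic, generated to follow b ∝ S^(-0.20), not MST measurements):

```python
import math

def power_law_exponent(S, b):
    """Least-squares slope of log(b) versus log(S), i.e. the exponent
    alpha in b ~ S**alpha."""
    x = [math.log(s) for s in S]
    y = [math.log(v) for v in b]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Synthetic fluctuation data following b ~ S**-0.20
S = [1e4, 3e4, 1e5, 3e5, 1e6]
alpha = power_law_exponent(S, [s ** -0.20 for s in S])
```

    On real measurements the scatter about the fit also yields the quoted uncertainty on the exponent.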

  3. Modelling and Validating a Deoiling Hydrocyclone for Fault Diagnosis using Multilevel Flow Modeling

    DEFF Research Database (Denmark)

    Nielsen, Emil Krabbe; Bram, Mads Valentin; Frutiger, Jerome

    applied in safety systems of complex and safety-critical systems require rigorous and reliable model building and testing. Multilevel Flow Modeling is a qualitative method for diagnosing faults, and has previously only been validated by subjective and qualitative means. This work aims to synthesize...

  4. Validation of the dermal exposure model in ECETOC TRA. (United States)

    Marquart, Hans; Franken, Remy; Goede, Henk; Fransman, Wouter; Schinkel, Jody


    The ECETOC TRA model (presently version 3.1) is often used to estimate worker inhalation and dermal exposure in regulatory risk assessment. The dermal model in ECETOC TRA has not yet been validated by comparison with independently measured exposure levels. This was the goal of the present study. Measured exposure levels and relevant contextual information were gathered via literature search, websites of relevant occupational health institutes, and direct requests for data to industry. Exposure data were clustered in so-called exposure cases, which are sets of data from one data source that are expected to have the same values for input parameters in the ECETOC TRA dermal exposure model. For each exposure case, the 75th percentile of measured values was calculated, because the model intends to estimate these values. The input values for the parameters in ECETOC TRA were assigned by an expert elicitation and consensus building process, based on descriptions of relevant contextual information. From more than 35 data sources, 106 useful exposure cases were derived and used for direct comparison with the model estimates. The exposure cases covered a large part of the ECETOC TRA dermal exposure model. The model explained 37% of the variance in the 75th percentiles of measured values. In around 80% of the exposure cases, the model estimate was higher than the 75th percentile of measured values. In the remaining exposure cases, the model estimate may not be sufficiently conservative. The model was shown to have a clear bias towards (severe) overestimation of dermal exposure at low measured exposure values, while all cases of apparent underestimation by the ECETOC TRA dermal exposure model occurred at high measured exposure values. This can be partly explained by a built-in bias in the effect of concentration of substance in product used, duration of exposure, and the use of protective gloves in the model. 
The effect of protective gloves was calculated to be on average a

  5. Vibroacoustic Model Validation for a Curved Honeycomb Composite Panel (United States)

    Buehrle, Ralph D.; Robinson, Jay H.; Grosveld, Ferdinand W.


    Finite element and boundary element models are developed to investigate the vibroacoustic response of a curved honeycomb composite sidewall panel. Results from vibroacoustic tests conducted in the NASA Langley Structural Acoustic Loads and Transmission facility are used to validate the numerical predictions. The sidewall panel is constructed from a flexible honeycomb core sandwiched between carbon fiber reinforced composite laminate face sheets. This type of construction is being used in the development of an all-composite aircraft fuselage. In contrast to conventional rib-stiffened aircraft fuselage structures, the composite panel has nominally uniform thickness resulting in a uniform distribution of mass and stiffness. Due to differences in the mass and stiffness distribution, the noise transmission mechanisms for the composite panel are expected to be substantially different from those of a conventional rib-stiffened structure. The development of accurate vibroacoustic models will aid in the understanding of the dominant noise transmission mechanisms and enable optimization studies to be performed that will determine the most beneficial noise control treatments. Finite element and boundary element models of the sidewall panel are described. Vibroacoustic response predictions are presented for forced vibration input and the results are compared with experimental data.

  6. Radiative transfer model for contaminated slabs: experimental validations

    CERN Document Server

    Andrieu, François; Schmitt, Bernard; Douté, Sylvain; Brissaud, Olivier


    This article presents a set of spectro-goniometric measurements of different water ice samples and the comparison with an approximated radiative transfer model. The experiments were done using the spectro-radiogoniometer described in Brissaud et al. (2004). The radiative transfer model assumes an isotropization of the flux after the second interface and is fully described in Andrieu et al. (2015). Two kinds of experiments were conducted. First, the specular spot was closely investigated, at high angular resolution, at the wavelength of 1.5 μm, where ice behaves as a very absorbing medium. Second, the bidirectional reflectance was sampled at various geometries, including low phase angles, on 61 wavelengths ranging from 0.8 μm to 2.0 μm. In order to validate the model, we made a qualitative test to demonstrate the relative isotropization of the flux. We also conducted quantitative assessments by using a Bayesian inversion method in order to estimate the parameters (e.g. sampl...

  7. Neuroinflammatory targets and treatments for epilepsy validated in experimental models. (United States)

    Aronica, Eleonora; Bauer, Sebastian; Bozzi, Yuri; Caleo, Matteo; Dingledine, Raymond; Gorter, Jan A; Henshall, David C; Kaufer, Daniela; Koh, Sookyong; Löscher, Wolfgang; Louboutin, Jean-Pierre; Mishto, Michele; Norwood, Braxton A; Palma, Eleonora; Poulter, Michael O; Terrone, Gaetano; Vezzani, Annamaria; Kaminski, Rafal M


    A large body of evidence that has accumulated over the past decade strongly supports the role of inflammation in the pathophysiology of human epilepsy. Specific inflammatory molecules and pathways have been identified that influence various pathologic outcomes in different experimental models of epilepsy. Most importantly, the same inflammatory pathways have also been found in surgically resected brain tissue from patients with treatment-resistant epilepsy. New antiseizure therapies may be derived from these novel potential targets. An essential and crucial question is whether targeting these molecules and pathways may result in anti-ictogenesis, antiepileptogenesis, and/or disease-modification effects. Therefore, preclinical testing in models mimicking relevant aspects of epileptogenesis is needed to guide integrated experimental and clinical trial designs. We discuss the most recent preclinical proof-of-concept studies validating a number of therapeutic approaches against inflammatory mechanisms in animal models that could represent novel avenues for drug development in epilepsy. Finally, we suggest future directions to accelerate preclinical to clinical translation of these recent discoveries. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  8. Modelling and validation of Proton exchange membrane fuel cell (PEMFC) (United States)

    Mohiuddin, A. K. M.; Basran, N.; Khan, A. A.


    This paper is the outcome of a small scale fuel cell project. A fuel cell is an electrochemical device that converts energy from a chemical reaction into electrical work. The Proton Exchange Membrane Fuel Cell (PEMFC) is one type of fuel cell; it is comparatively efficient, operates at low temperature, and its fast start-up capability results in high energy density. In this study, a mathematical model of a 1.2 W PEMFC is developed and simulated using MATLAB software. This model describes the PEMFC behaviour under steady-state conditions. The mathematical model of the PEMFC determines the polarization curve, the power generated, and the efficiency of the fuel cell. Simulation results were validated by comparison with experimental results obtained from the test of a single PEMFC driving a 3 V motor. The performance of the experimental PEMFC is slightly lower than that of the simulated PEMFC; however, both sets of results are in good agreement. Experiments on hydrogen flow rate were also conducted to obtain the amount of hydrogen consumed to produce electrical work in the PEMFC.
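The steady-state polarization behaviour described above is commonly modelled as the reversible cell voltage minus activation, ohmic and concentration losses. A minimal sketch in Python with illustrative constants (not the parameters of the 1.2 W stack in the paper):

```python
import math

def cell_voltage(i, E0=1.229, A=0.06, i0=1e-4, R=0.2, i_L=1.5, B=0.05):
    """Steady-state PEMFC polarization: reversible voltage E0 minus
    activation (Tafel), ohmic and concentration losses.
    All constants are illustrative, not fitted values."""
    act = A * math.log(i / i0)            # activation loss
    ohm = R * i                           # ohmic loss
    conc = -B * math.log(1.0 - i / i_L)   # concentration loss
    return E0 - act - ohm - conc

currents = [0.01, 0.1, 0.5, 1.0, 1.4]     # A/cm^2, below the limit i_L
voltages = [cell_voltage(i) for i in currents]
powers = [v * i for v, i in zip(voltages, currents)]
```

Sweeping `i` toward the limiting current `i_L` reproduces the characteristic voltage drop at the end of the polarization curve, and the power `v * i` peaks at an intermediate current, which is the trade-off such models make visible.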

  9. Validation of Three Scoring Risk Stratification Models for Thyroid Nodules. (United States)

    Ha, Su Min; Ahn, Hye Shin; Baek, Jung Hwan; Ahn, Hwa Young; Chung, Yun Jae; Cho, Bo Youn; Park, Sung Bin


    To minimize potential harm from overuse of fine-needle aspiration, Thyroid Imaging Reporting and Data Systems (TIRADSs) were developed for thyroid nodule risk stratification. The purpose of this study was to validate three scoring risk stratification models for thyroid nodules using ultrasonography features: a web-based malignancy risk stratification system ( ) and those developed by the Korean Society of Thyroid Radiology (KSThR) and the American College of Radiology (ACR). Using ultrasonography images, radiologists assessed thyroid nodules according to the following criteria: internal content, echogenicity of the solid portion, shape, margin, and calcifications. In total, 954 patients (mean age, 50.8 years; range, 13-86 years) with 1112 nodules evaluated in our institute from January 2013 to December 2014 were included. The discrimination ability of the three models was assessed by estimating the area under the receiver operating characteristic (ROC) curve. Additionally, Hosmer-Lemeshow goodness-of-fit statistics (calibration ability) were used to evaluate the agreement between the observed and expected numbers of benign and malignant nodules. Thyroid malignancy was present in 37.2% of nodules (414/1112). According to the 14-point web-based scoring risk stratification system, malignancy risk ranged from 4.5% to 100.0% and was positively associated with an increase in risk scores. The areas under the ROC curve of the validation set were 0.884 for the web-based, 0.891 for the KSThR, and 0.875 for the ACR scoring risk stratification models. The Hosmer-Lemeshow goodness-of-fit test indicated that the web-based scoring system showed the best-calibrated result, with a p value of 0.078. The three scoring risk stratification models using the ultrasonography features of thyroid nodules to stratify malignancy risk showed acceptable predictive accuracy and similar areas under the curve. The web-based scoring system demonstrated
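The discrimination ability reported above (AUC ≈ 0.88) is equivalent to the Mann-Whitney probability that a randomly chosen malignant nodule receives a higher risk score than a randomly chosen benign one. A minimal sketch of that computation with toy scores (not the study data):

```python
def roc_auc(scores, labels):
    """AUC via the rank (Mann-Whitney) formulation: fraction of
    (malignant, benign) pairs in which the malignant nodule scores
    higher, counting ties as half a win. label 1 = malignant."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy risk scores for four nodules
example_auc = roc_auc([1, 2, 3, 4], [0, 1, 0, 1])   # 3 of 4 pairs ordered correctly
```

With perfect separation of labels the function returns 1.0; with uninformative scores it returns 0.5, the reference line of the ROC plot.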

  10. Nonparametric model validations for hidden Markov models with applications in financial econometrics. (United States)

    Zhao, Zhibiao


    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for the transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable to continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  11. Validation of an RMS DFIG simulation model according to the new German model validation standard FGW TR4 at balanced and unbalanced grid faults

    Energy Technology Data Exchange (ETDEWEB)

    Fortmann, Jens [REpower Systems AG, Rendsburg (Germany); Engelhardt, Stephan; Kretschmann, Joerg [Woodward SEG, Kempen (Germany); Feltes, Christian; Erlich, Istvan [Duisburg-Essen Univ., Duisburg (Germany)


    There is an increased international interest in the validation of simulation models for grid integration studies. Several countries (Spain, Australia, UK, Germany) now have requirements for simulation models as part of their grid code. In Germany, triggered by the new renewable energy law (EEG), an effort has been going on to create a standard for validating electrical simulation models of wind turbines. The model validation approach chosen will be described. The aim of this validation approach is to quantify the error between measurement and simulation. This is necessary in order to give a reliable figure for the model uncertainty for the use of the model in studies. Results of FRT measurements with balanced and unbalanced faults of a 2 MW turbine will be compared to the results of an RMS DFIG model. It can be shown that the validation approach can be applied with success to both balanced and unbalanced faults. An IEC effort has recently been started to create an international standard for wind turbine modeling and model validation. The validation approach presented could contribute to the proposed IEC simulation standard. (orig.)

  12. NAIRAS aircraft radiation model development, dose climatology, and initial validation. (United States)

    Mertens, Christopher J; Meier, Matthias M; Brown, Steven; Norman, Ryan B; Xu, Xiaojing


    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low. Nevertheless, analysis
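The ICRU acceptance criterion quoted above reduces to checking the relative deviation of each model prediction from its reference measurement against a fractional band. A hedged sketch with made-up dose-rate pairs (not actual NAIRAS output):

```python
def within_band(predicted, measured, tolerance=0.30):
    """ICRU-style acceptance test: |model - measurement| / measurement
    must not exceed the fractional tolerance (30% in the text)."""
    return [abs(p - m) / m <= tolerance for p, m in zip(predicted, measured)]

# illustrative ambient dose equivalent rates (uSv/h), invented for the sketch
pred = [3.1, 4.8, 6.2]
meas = [3.0, 4.0, 6.5]
flags = within_band(pred, meas)   # per-point pass/fail against the 30% band
```

Tightening `tolerance` to 0.25 reproduces the stricter 25% comparison reported for the aggregate data.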

  13. Validation of a Simplified Building Cooling Load Model Using a Complex Computer Simulation Model


    Stewart, Morgan Eugene


    Building energy simulation has become a useful tool for predicting cooling, heating and electrical loads for facilities. Simulation models have been validated throughout the years by comparing simulation results to actual measured values. The simulations have become more accurate as approaches were changed to be more comprehensive in their ability to model building features. These simulation models tend to require considerable experience in determining input parameters and large amounts of...

  14. Characterization Report on Fuels for NEAMS Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Gofryk, Krzysztof [Idaho National Lab. (INL), Idaho Falls, ID (United States)


    Nearly 20% of the world’s electricity today is generated by nuclear energy from uranium dioxide (UO2) fuel. The thermal conductivity of UO2 governs the conversion of heat produced from fission events into electricity and it is an important parameter in reactor design and safety. While nuclear fuel operates at high to very high temperatures, thermal conductivity and other materials properties lack sensitivity to temperature variations and to material variations at reactor temperatures. As a result, both the uncertainties in laboratory measurements at high temperatures and the small differences in properties of different materials inevitably lead to large uncertainties in models and little predictive power. Conversely, properties measured at low to moderate temperatures have more sensitivity, less uncertainty, and have larger differences in properties for different materials. These variations need to be characterized as they will afford the highest predictive capability in modeling and offer best assurances for validation and verification at all temperatures. This is well emphasized in the temperature variation of the thermal conductivity of UO2.

  15. Validated Analytical Model of a Pressure Compensation Drip Irrigation Emitter (United States)

    Shamshery, Pulkit; Wang, Ruo-Qian; Taylor, Katherine; Tran, Davis; Winter, Amos


    This work is focused on analytically characterizing the behavior of pressure-compensating drip emitters in order to design low-cost, low-power irrigation solutions appropriate for off-grid communities in developing countries. There are 2.5 billion small-acreage farmers worldwide who rely solely on their land for sustenance. Compared to flood irrigation, drip irrigation reduces water consumption by up to 70% while increasing yields by 90%, which is important in countries like India that are quickly running out of water. To design a low-power drip system, there is a need to decrease the pumping pressure requirement at the emitters, as pumping power is the product of pressure and flow rate. To design such an emitter efficiently, the fluid-structure interactions that occur in it need to be understood. In this study, a 2D analytical model that captures the behavior of a common drip emitter was developed and validated through experiments. The effects of independently changing the channel depth, channel width, channel length and land height on performance were studied. The model and the key parametric insights presented have the potential to guide the design of optimized low-pressure, clog-resistant, pressure-compensating emitters.
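The pumping-power argument in the abstract is simply P = Δp · Q: at a fixed emitter flow rate, lowering the activation pressure lowers pumping power proportionally. A small illustration with hypothetical numbers (not the emitter studied in the paper):

```python
def pumping_power(pressure_pa, flow_m3_s):
    """Hydraulic pumping power: P [W] = pressure [Pa] * flow rate [m^3/s]."""
    return pressure_pa * flow_m3_s

flow = 1.0e-6                                  # m^3/s (~3.6 L/h), illustrative
p_conventional = pumping_power(1.0e5, flow)    # emitter needing 1.0 bar
p_low_pressure = pumping_power(0.15e5, flow)   # hypothetical 0.15 bar emitter
reduction = 1.0 - p_low_pressure / p_conventional
```

Because power scales linearly with activation pressure, the 0.15 bar emitter in this sketch cuts pumping power by 85%, which is the lever a low-power solar-pumped drip system needs.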

  16. Automatic validation of computational models using pseudo-3D spatio-temporal model checking. (United States)

    Pârvu, Ovidiu; Gilbert, David


    Computational models play an increasingly important role in systems biology for generating predictions and in synthetic biology as executable prototypes/designs. For real life (clinical) applications there is a need to scale up and build more complex spatio-temporal multiscale models; these could enable investigating how changes at small scales reflect at large scales and vice versa. Results generated by computational models can be applied to real life applications only if the models have been validated first. Traditional in silico model checking techniques only capture how non-dimensional properties (e.g. concentrations) evolve over time and are suitable for small scale systems (e.g. metabolic pathways). The validation of larger scale systems (e.g. multicellular populations) additionally requires capturing how spatial patterns and their properties change over time, which are not considered by traditional non-spatial approaches. We developed and implemented a methodology for the automatic validation of computational models with respect to both their spatial and temporal properties. Stochastic biological systems are represented by abstract models which assume a linear structure of time and a pseudo-3D representation of space (2D space plus a density measure). Time series data generated by such models is provided as input to parameterised image processing modules which automatically detect and analyse spatial patterns (e.g. cells) and clusters of such patterns (e.g. cellular populations). For capturing how spatial and numeric properties change over time the Probabilistic Bounded Linear Spatial Temporal Logic is introduced. Given a collection of time series data and a formal spatio-temporal specification the model checker Mudi ( ) determines probabilistically if the formal specification holds for the computational model or not. Mudi is an approximate probabilistic model checking platform which enables users to choose between frequentist and
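The frequentist flavour of the model checking described above can be caricatured as: evaluate a bounded temporal property on each simulated time series, then compare the satisfying fraction against a probability threshold. A toy sketch (the real Mudi logic also handles spatial pattern detection and statistical error bounds):

```python
def eventually(trace, threshold, time_bound):
    """Bounded 'eventually': the observed measure (e.g. a cluster's area)
    reaches the threshold at some time step <= time_bound."""
    return any(v >= threshold for v in trace[:time_bound + 1])

def probabilistic_check(traces, threshold, time_bound, p_min):
    """Frequentist check: the fraction of simulation traces satisfying
    the bounded property must be at least p_min."""
    frac = sum(eventually(t, threshold, time_bound)
               for t in traces) / len(traces)
    return frac >= p_min, frac

# three hypothetical simulation traces of a spatial measure over time
traces = [[0, 1, 3, 5], [0, 2, 2, 2], [0, 1, 4, 6]]
ok, frac = probabilistic_check(traces, threshold=4, time_bound=3, p_min=0.6)
```

Two of the three traces reach the threshold within the time bound, so the property holds with estimated probability 2/3, clearing the 0.6 bound.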

  17. Validation of population-based disease simulation models: a review of concepts and methods

    Directory of Open Access Journals (Sweden)

    Sharif Behnam


    Background: Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods: We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results: Evidence of model credibility derives from examining: (1) the process of model development, (2) the performance of a model, and (3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion: As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility.

  18. A cross-validation deletion-substitution-addition model selection algorithm: Application to marginal structural models. (United States)

    Haight, Thaddeus J; Wang, Yue; van der Laan, Mark J; Tager, Ira B


    The cross-validation deletion-substitution-addition (cvDSA) algorithm is based on data-adaptive estimation methodology to select and estimate marginal structural models (MSMs) for point treatment studies as well as models for conditional means where the outcome is continuous or binary. The algorithm builds and selects models based on user-defined criteria for model selection, and utilizes a loss function-based estimation procedure to distinguish between different model fits. In addition, the algorithm selects models based on cross-validation methodology to avoid "over-fitting" the data. The cvDSA routine is an R software package available for download. An alternative R package (DSA) based on the same principles as the cvDSA routine (i.e., cross-validation, loss function), but one that is faster and has additional refinements for selection and estimation of conditional means, is also available for download. Analyses of real and simulated data were conducted to demonstrate the use of these algorithms, and to compare MSMs in which the causal effects were assumed (i.e., investigator-defined) with MSMs selected by the cvDSA. The package was also used to select models for the nuisance parameter (treatment) model to estimate the MSM parameters with inverse-probability of treatment weight (IPTW) estimation. Other estimation procedures (i.e., G-computation and double robust IPTW) are also available with the package.
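The core loop of loss-based, cross-validated model selection of the kind cvDSA performs can be sketched in a few lines: fit each candidate on training folds, score it on held-out folds with a loss function, and keep the candidate with the smallest cross-validated risk. A minimal Python illustration with two hypothetical candidates (intercept-only versus simple linear regression), not the cvDSA/DSA implementation itself:

```python
def mean_model(xs, ys):
    """Intercept-only candidate: predict the training mean."""
    m = sum(ys) / len(ys)
    return lambda x: m

def ols_line(xs, ys):
    """Simple linear regression candidate (closed-form OLS)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return lambda x: a + b * x

def cv_risk(fit, xs, ys, k=5):
    """k-fold cross-validated squared-error risk of a candidate fitter."""
    n = len(xs)
    total = 0.0
    for fold in range(k):
        train = [i for i in range(n) if i % k != fold]
        held = [i for i in range(n) if i % k == fold]
        f = fit([xs[i] for i in train], [ys[i] for i in train])
        total += sum((ys[i] - f(xs[i])) ** 2 for i in held)
    return total / n

xs = [float(i) for i in range(20)]
ys = [2.0 * x + 1.0 for x in xs]          # noiseless linear toy data
best = min([mean_model, ols_line], key=lambda c: cv_risk(c, xs, ys))
```

On these toy data the linear candidate wins, since its held-out loss is essentially zero while the intercept-only model pays the full variance of `ys`; with noisy data the same loop is what guards against over-fitting.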

  19. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.


    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  20. Modelling floor heating systems using a validated two-dimensional ground coupled numerical model

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Kragh, Jesper; Roots, Peter


    and foundation on the performance of the floor heating system. The ground coupled floor heating model is validated against measurements from a single-family house. The simulation model is coupled to a whole-building energy simulation model with inclusion of heat losses and heat supply to the room above...... the floor. This model can be used to design energy efficient houses with floor heating focusing on the heat loss through the floor construction and foundation. It is found that it is important to model the dynamics of the floor heating system to find the correct heat loss to the ground, and further...

  1. Alaska North Slope Tundra Travel Model and Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Harry R. Bader; Jacynthe Guimond


    lack of variability in snow depth cover throughout the period of field experimentation. The amount of change in disturbance indicators was greater in the tundra communities of the Foothills than in those of the Coastal Plain. However, the overall level of change in both community types was less than expected. In Coastal Plain communities, ground hardness and snow slab thickness were found to play an important role in change in active layer depth and soil moisture as a result of treatment. In the Foothills communities, snow cover had the most influence on active layer depth and soil moisture as a result of treatment. Once certain minimum thresholds for ground hardness, snow slab thickness, and snow depth were attained, it appeared that little or no additive effect was realized regarding increased resistance to disturbance in the tundra communities studied. DNR used the results of this modeling project to set a standard for maximum permissible disturbance of cross-country tundra travel, with the threshold set below the widely accepted standard of Low Disturbance levels (as determined by the U.S. Fish and Wildlife Service). DNR followed the modeling project with a validation study, which seemed to support the field trial conclusions and indicated that the standard set for maximum permissible disturbance exhibits a conservative bias in favor of environmental protection. Finally DNR established a quick and efficient tool for visual estimations of disturbance to determine when investment in field measurements is warranted. This Visual Assessment System (VAS) seemed to support the plot disturbance measurements taken during the modeling and validation phases of this project.

  2. Checklist for the qualitative evaluation of clinical studies with particular focus on external validity and model validity

    Directory of Open Access Journals (Sweden)

    Vollmar Horst C


    Background: It is often stated that external validity is not sufficiently considered in the assessment of clinical studies. Although tools for its evaluation have been established, there is a lack of awareness of their significance and application. In this article, a comprehensive checklist is presented addressing these relevant criteria. Methods: The checklist was developed by listing the most commonly used assessment criteria for clinical studies. Additionally, specific lists for individual applications were included. The categories of biases of internal validity (selection, performance, attrition and detection bias) correspond to structural, treatment-related and observational differences between the test and control groups. Analogously, we have extended these categories to address external validity and model validity, regarding similarity between the study population/conditions and the general population/conditions related to structure, treatment and observation. Results: A checklist is presented, in which the evaluation criteria concerning external validity and model validity are systemised and transformed into a questionnaire format. Conclusion: The checklist presented in this article can be applied to both the planning and the evaluation of clinical studies. We encourage the prospective user to modify the checklists according to the respective application and research question. The higher expenditure needed for the evaluation of clinical studies in systematic reviews is justified, particularly in the light of the influential nature of their conclusions on therapeutic decisions and the creation of clinical guidelines.

  3. Validation of a modified Medical Resource Model for mass gatherings. (United States)

    Smith, Wayne P; Tuffin, Heather; Stratton, Samuel J; Wallis, Lee A


    A modified Medical Resource Model to predict the medical resources required at mass gatherings based on the risk profile of events has been developed. This study was undertaken to validate this tool using data from events held in both a developed and a developing country. A retrospective study was conducted utilizing prospectively gathered data from individual events at Old Trafford Stadium in Manchester, United Kingdom, and Ellis Park Stadium, Johannesburg, South Africa. Both stadia are similar in design and spectator capacity. Data for Professional Football as well as Rugby League and Rugby Union (respectively) matches were used for the study. The medical resources predicted for the events were determined by entering the risk profile of each of the events into the Medical Resource Model. A recently developed South African tool was used to predetermine medical staffing for mass gatherings. For the study, the medical resources actually required to deal with the patient load for events within the control sample from the two stadia were compared with the number of needed resources predicted by the Medical Resource Model when that tool was applied retrospectively to the study events. The comparison was used to determine if the newly developed tool was either over- or under-predicting the resource requirements. In the case of Ellis Park, the model under-predicted the basic life support (BLS) requirement for 1.5% of the events in the data set. Mean over-prediction was 209.1 minutes for BLS availability. Old Trafford displayed no events for which the Medical Resource Model would have under-predicted. The mean over-prediction of BLS availability for Old Trafford was 671.6 minutes. The intermediate life support (ILS) requirement for Ellis Park was under-predicted for seven of the total 66 events (10.6% of the events), all of which had one factor in common, that being relatively low spectator attendance numbers. Modelling for ILS at Old Trafford did not under-predict for

  4. CFD Modeling and Experimental Validation of a Solar Still

    Directory of Open Access Journals (Sweden)

    Mahmood Tahir


    Earth is the densest planet of the solar system, with a total area of 510.072 million square kilometres. Over 71.68% of this area is covered with water, leaving a scant 28.32% for humans to inhabit. Fresh water accounts for only 2.5% of the total volume and the rest is brackish water. Presently, the world is facing a chief problem of lack of potable water. This issue can be addressed by converting brackish water into potable water through a solar distillation process, and the solar still is specially designed for this purpose. The efficiency of a solar still depends explicitly on its design parameters, such as the wall material, chamber depth, and the width and slope of the condensing surface. This study was aimed at investigating the solar still parameters using CFD modeling and experimental validation. The simulation data from ANSYS-FLUENT were compared with actual experimental data. A close agreement between the simulated and experimental results was seen in the presented work. It reveals that ANSYS-FLUENT is a potent tool to analyse the efficiency of new designs of solar distillation systems.

  5. Integral Reactor Containment Condensation Model and Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Qiao [Oregon State Univ., Corvallis, OR (United States); Corradini, Michael [Univ. of Wisconsin, Madison, WI (United States)


    This NEUP funded project, NEUP 12-3630, is for experimental, numerical and analytical studies on high-pressure steam condensation phenomena in a steel containment vessel connected to a water cooling tank, carried out at Oregon State University (OrSU) and the University of Wisconsin at Madison (UW-Madison). Over the three-year investigation period, following the original proposal, the planned tasks have been completed: (1) Performed a scaling study for the full pressure test facility applicable to the reference design for the condensation heat transfer process during design basis accidents (DBAs), modified the existing test facility to route the steady-state secondary steam flow into the high pressure containment for controllable condensation tests, and extended the operations at negative gage pressure conditions (OrSU). (2) Conducted a series of DBA and quasi-steady experiments using the full pressure test facility to provide a reliable high pressure condensation database (OrSU). (3) Analyzed experimental data and evaluated the condensation model for the experimental conditions, and predicted the prototypic containment performance under accident conditions (UW-Madison). A film flow model was developed for the scaling analysis, and the results suggest that the 1/3 scaled test facility covers a large portion of laminar film flow, leading to a lower average heat transfer coefficient compared to the prototypic value. Although it is conservative in reactor safety analysis, the significant reduction of the heat transfer coefficient (50%) could underestimate the prototypic condensation heat transfer rate, resulting in inaccurate prediction of the decay heat removal capability. Further investigation is thus needed to quantify the scaling distortion for safety analysis code validation. Experimental investigations were performed in the existing MASLWR test facility at OrSU with minor modifications. A total of 13 containment condensation tests were conducted for pressure

  6. Blast Load Simulator Experiments for Computational Model Validation Report 3 (United States)


    these explosive events and their effects. These codes are continuously improving, but they still require validation against experimental data to... driver material from reaching the pressure gauges during the recorded time frame for those experiments. A comparison of representative pressure waveforms

  7. Experimental validation of the Hottel-Whillier model; Validacion experimental del modelo de Hottel-Whillier

    Energy Technology Data Exchange (ETDEWEB)

    Dominguez Munoz, F.; Cejudo Lopez, J. M.; Carrillo Andres, A.


    The results of testing a commercial flat-plate solar collector are compared with a detailed implementation of the Hottel-Whillier fin-and-tube model. The validation procedure is based on comparing experimental and theoretical curves together with their most likely uncertainty bands. The model correctly predicts the useful gains and underestimates the losses by 5%, although a sensitivity analysis shows that this result is not attributable to the model itself but to the inputs with which it was run. The model has difficulty differentiating between the linear and quadratic loss terms that appear in the quadratic efficiency-curve fit. (Author) 1 refs.

  8. Second-Moment RANS Model Verification and Validation Using the Turbulence Modeling Resource Website (Invited) (United States)

    Eisfeld, Bernhard; Rumsey, Chris; Togiti, Vamshi


    The implementation of the SSG/LRR-omega differential Reynolds stress model into the NASA flow solvers CFL3D and FUN3D and the DLR flow solver TAU is verified by studying the grid convergence of the solution of three different test cases from the Turbulence Modeling Resource Website. The model's predictive capabilities are assessed based on four basic and four extended validation cases also provided on this website, involving attached and separated boundary layer flows, effects of streamline curvature and secondary flow. Simulation results are compared against experimental data and predictions by the eddy-viscosity models of Spalart-Allmaras (SA) and Menter's Shear Stress Transport (SST).
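    Grid-convergence verification of the kind described above is usually quantified by estimating the observed order of accuracy and a Richardson-extrapolated value from solutions on three systematically refined grids. A minimal sketch of that calculation (the drag-coefficient values are invented for illustration, not taken from the CFL3D/FUN3D/TAU study):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy p and Richardson-extrapolated value from
    three solutions on grids refined by a constant ratio r."""
    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1)  # Richardson extrapolation
    return p, f_exact

# Illustrative drag-coefficient values on coarse/medium/fine grids (invented):
p, f_exact = observed_order(0.02860, 0.02820, 0.02810, r=2.0)
```

    With these values the error ratio is 4 at a refinement ratio of 2, so the observed order comes out as 2, consistent with a formally second-order scheme.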

  9. Experimental validation of Swy-2 clay standard's PHREEQC model (United States)

    Szabó, Zsuzsanna; Hegyfalvi, Csaba; Freiler, Ágnes; Udvardi, Beatrix; Kónya, Péter; Székely, Edit; Falus, György


    One of the challenges of the present century is to limit greenhouse gas emissions for the mitigation of climate change, which is possible, for example, through the transitional technology of CCS (Carbon Capture and Storage) and, among other measures, by increasing the share of nuclear power in the energy mix. Clay minerals are considered to be responsible for the low permeability and sealing capacity of caprocks sealing off stored CO2, and they are also the main constituents of bentonite in high-level radioactive waste disposal facilities. The understanding of clay behaviour in these deep geological environments is possible through laboratory batch experiments of well-known standards and coupled geochemical models. Such experimentally validated models are scarce even though they allow deriving more precise long-term predictions of mineral reactions and of rock and bentonite degradation underground, therefore ensuring the safety of the above technologies and increasing their public acceptance. This ongoing work aims to create a kinetic geochemical model of the Na-montmorillonite standard Swy-2 in the widely used PHREEQC code, supported by solution and mineral composition results from batch experiments. Several four-day experiments have been carried out at a 1:35 rock:water ratio at atmospheric conditions, and with inert and supercritical CO2 phases at 100 bar and 80 °C, relevant for the potential Hungarian CO2 reservoir complex. Solution samples were taken during and after the experiments and their compositions measured by ICP-OES. The treated solid phase was analysed by XRD and ATR-FTIR and compared to references measured in parallel (dried Swy-2). Kinetic geochemical modelling of the experimental conditions has been performed with PHREEQC version 3 using equations and kinetic rate parameters from the USGS report of Palandri and Kharaka (2004). The visualization of experimental and numerous modelling results has been automatized in R. Experiments and models show very fast
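    The kinetic rate equations compiled by Palandri and Kharaka (2004) and used in PHREEQC kinetics follow a transition-state-theory form with an Arrhenius temperature correction. A minimal sketch of that rate law, with placeholder parameter values rather than the actual Swy-2 montmorillonite constants from the report:

```python
import math

R_GAS = 8.314  # gas constant, J/(mol*K)

def dissolution_rate(k25, Ea, T, omega, p=1.0, q=1.0):
    """TST-form rate (mol/m2/s): the 25 degC rate constant k25 is corrected to
    temperature T (K) via Arrhenius with activation energy Ea (J/mol), then
    scaled by the free-energy term (1 - omega**p)**q, which vanishes at
    equilibrium (saturation ratio omega = 1)."""
    k = k25 * math.exp(-Ea / R_GAS * (1.0 / T - 1.0 / 298.15))
    return k * (1.0 - omega ** p) ** q

# Far from equilibrium at the experimental 80 degC (353.15 K); k25 and Ea
# below are illustrative placeholders, not the montmorillonite values.
rate = dissolution_rate(k25=1e-12, Ea=48000.0, T=353.15, omega=0.0)
```

    The sketch shows why the 80 °C experiments react faster than room-temperature ones: the Arrhenius factor alone raises the rate constant by roughly an order of magnitude over this temperature span for typical activation energies.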

  10. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria (United States)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.


    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry in order to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those that could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as a part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, by use of macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of country-specific vulnerability modifiers, by use of past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial, and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4, and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by use of "as-if" historical scenario simulations of three past earthquake events in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam, and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for a client market portfolio align with the

  11. Stochastic modeling of oligodendrocyte generation in cell culture: model validation with time-lapse data

    Directory of Open Access Journals (Sweden)

    Noble Mark


    Background: The purpose of this paper is two-fold. The first objective is to validate the assumptions behind a stochastic model developed earlier by these authors to describe oligodendrocyte generation in cell culture. The second is to generate time-lapse data that may help biomathematicians to build stochastic models of cell proliferation and differentiation under other experimental scenarios. Results: Using time-lapse video recording it is possible to follow the individual evolutions of different cells within each clone. This experimental technique is very laborious and cannot replace model-based quantitative inference from clonal data. However, it is unrivalled in validating the structure of a stochastic model intended to describe cell proliferation and differentiation at the clonal level. In this paper, such data are reported and analyzed for oligodendrocyte precursor cells cultured in vitro. Conclusion: The results strongly support the validity of the most basic assumptions underpinning the previously proposed model of oligodendrocyte development in cell culture. However, there are some discrepancies; the most important is that the contribution of progenitor cell death to cell kinetics in this experimental system has been underestimated.

  12. Modeling, implementation, and validation of arterial travel time reliability. (United States)


    Previous research funded by the Florida Department of Transportation (FDOT) developed a method for estimating travel time reliability for arterials. This method was not initially implemented or validated using field data. This project evaluated and r...

  13. Identifying and Validating a Model of Interpersonal Performance Dimensions

    National Research Council Canada - National Science Library

    Carpenter, Tara


    .... Two studies were then completed to validate the proposed taxonomy. In the first study empirical evidence for the taxonomy was gathered using a content analysis of critical incidents taken from a job analysis...

  14. Validation of Hydrodynamic Models of Three Topologies of Secondary Facultative Ponds

    Directory of Open Access Journals (Sweden)

    Aponte-Reyes, Alexander


    A methodology was developed to analyze the boundary conditions, mesh size, and turbulence settings of a CFD mathematical model that could explain the hydrodynamic behavior of facultative stabilization ponds, FSP, built at pilot scale: a conventional pond, CP, a baffled pond, BP, and a baffled-mesh pond, BMP. Dispersion studies were performed in the field for validation, taking samples at the inlet and outlet of the FSP, and the information was used to carry out CFD model simulations of the three topologies. Evaluated mesh sizes ranged from 500,000 to 2,000,000 elements. The free-slip wall boundary condition showed good qualitative behavior, and the κ–ε Low Reynolds turbulence model yielded good results. The biomass contained in the FSP generates interference in dispersion studies and should be taken into account in assessing the CFD modeling; the tracer injection times, its concentration at the entrance, the effect of wind on the CFD, and the flow models adopted as a basis for modeling are parameters to be taken into account for CFD model validation and calibration.
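    Field dispersion studies like these are commonly reduced to comparable numbers by moment analysis of the tracer response curve, yielding the mean residence time and the variance used for dispersion estimates. A minimal sketch of that reduction (the pulse below is an invented toy curve, not measured pond data):

```python
def moments(t, c):
    """Mean residence time and variance of a tracer curve c(t) by moment analysis."""
    def trapz(ys):
        # trapezoidal integration of ys over the (possibly uneven) time grid t
        return sum((ys[i] + ys[i + 1]) * (t[i + 1] - t[i]) / 2 for i in range(len(t) - 1))
    m0 = trapz(c)                                                # zeroth moment (tracer mass)
    tbar = trapz([ti * ci for ti, ci in zip(t, c)]) / m0         # mean residence time
    var = trapz([ti ** 2 * ci for ti, ci in zip(t, c)]) / m0 - tbar ** 2
    return tbar, var

# Symmetric toy pulse (time in hours, concentration in mg/L), illustrative only:
tbar, var = moments([0, 1, 2, 3, 4], [0.0, 1.0, 2.0, 1.0, 0.0])
```

    Comparing tbar against the nominal hydraulic retention time, and the normalized variance against tanks-in-series or dispersed-flow model predictions, is one standard way a CFD model of a pond is checked against tracer data.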

  15. The ability model of emotional intelligence: Searching for valid measures


    Fiori, M.; Antonakis, J.


    Current measures of ability emotional intelligence (EI)--including the well-known Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT)--suffer from several limitations, including low discriminant validity and questionable construct and incremental validity. We show that the MSCEIT is largely predicted by personality dimensions, general intelligence, and demographics, with multiple R's for the MSCEIT branches up to .66; for the general EI factor this relation was even stronger (Multiple...

  16. Development and Validation of a Predictive Model for Functional Outcome After Stroke Rehabilitation: The Maugeri Model. (United States)

    Scrutinio, Domenico; Lanzillo, Bernardo; Guida, Pietro; Mastropasqua, Filippo; Monitillo, Vincenzo; Pusineri, Monica; Formica, Roberto; Russo, Giovanna; Guarnaschelli, Caterina; Ferretti, Chiara; Calabrese, Gianluigi


    Prediction of outcome after stroke rehabilitation may help clinicians in decision-making and planning rehabilitation care. We developed and validated a predictive tool to estimate the probability of achieving improvement in physical functioning (model 1) and a level of independence requiring no more than supervision (model 2) after stroke rehabilitation. The models were derived from 717 patients admitted for stroke rehabilitation. We used multivariable logistic regression analysis to build each model. Then, each model was prospectively validated in 875 patients. Model 1 included age, time from stroke occurrence to rehabilitation admission, admission motor and cognitive Functional Independence Measure scores, and neglect. Model 2 included age, male gender, time since stroke onset, and admission motor and cognitive Functional Independence Measure score. Both models demonstrated excellent discrimination. In the derivation cohort, the area under the curve was 0.883 (95% confidence intervals, 0.858-0.910) for model 1 and 0.913 (95% confidence intervals, 0.884-0.942) for model 2. The Hosmer-Lemeshow χ2 was 4.12 (P=0.249) and 1.20 (P=0.754), respectively. In the validation cohort, the area under the curve was 0.866 (95% confidence intervals, 0.840-0.892) for model 1 and 0.850 (95% confidence intervals, 0.815-0.885) for model 2. The Hosmer-Lemeshow χ2 was 8.86 (P=0.115) and 34.50 (P=0.001), respectively. Both improvement in physical functioning (hazard ratios, 0.43; 0.25-0.71; P=0.001) and a level of independence requiring no more than supervision (hazard ratios, 0.32; 0.14-0.68; P=0.004) were independently associated with improved 4-year survival. A calculator is freely available for download at This study provides researchers and clinicians with an easy-to-use, accurate, and validated predictive tool for potential application in rehabilitation research and stroke management. © 2017 American Heart Association, Inc.
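    The two validation statistics reported above, the area under the ROC curve for discrimination and the Hosmer-Lemeshow χ2 for calibration, can be computed from first principles. A minimal sketch (not the authors' code; the toy four-patient input in the usage lines is invented):

```python
def auc(y_true, y_prob):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a random event outranks a random non-event."""
    pos = [p for p, t in zip(y_prob, y_true) if t == 1]
    neg = [p for p, t in zip(y_prob, y_true) if t == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0 for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def hosmer_lemeshow(y_true, y_prob, groups=10):
    """Hosmer-Lemeshow chi-square: observed vs expected events per risk group."""
    pairs = sorted(zip(y_prob, y_true))      # sort patients by predicted risk
    n, chi2 = len(pairs), 0.0
    for g in range(groups):
        chunk = pairs[g * n // groups:(g + 1) * n // groups]
        if not chunk:
            continue
        obs = sum(t for _, t in chunk)       # observed events in this group
        exp = sum(p for p, _ in chunk)       # expected events in this group
        m, pbar = len(chunk), sum(p for p, _ in chunk) / len(chunk)
        if 0 < pbar < 1:
            chi2 += (obs - exp) ** 2 / (m * pbar * (1 - pbar))
    return chi2

# Toy usage (four patients, two events):
auc_val = auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9])
hl = hosmer_lemeshow([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9], groups=4)
```

    High AUC with a low (non-significant) Hosmer-Lemeshow χ2 is the pattern the abstract reports for model 1 in the validation cohort; model 2's significant χ2 (P=0.001) signals miscalibration despite good discrimination.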

  17. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne


    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  18. Fast Running Urban Dispersion Model for Radiological Dispersal Device (RDD) Releases: Model Description and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Gowardhan, Akshay [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Neuscamman, Stephanie [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Donetti, John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Belles, Rich [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Eme, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Homann, Steven [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Simpson, Matthew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Nasstrom, John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC)


    Aeolus is an efficient three-dimensional computational fluid dynamics code based on the finite volume method, developed for predicting transport and dispersion of contaminants in a complex urban area. It solves the time-dependent incompressible Navier-Stokes equation on a regular Cartesian staggered grid using a fractional step method. It also solves a scalar transport equation for temperature using the Boussinesq approximation. The model also includes a Lagrangian dispersion model for predicting the transport and dispersion of atmospheric contaminants. The model can be run in an efficient Reynolds-Averaged Navier-Stokes (RANS) mode with a run time of several minutes, or in a more detailed Large Eddy Simulation (LES) mode with a run time of hours for a typical simulation. This report describes the model components, including details on the physics models used in the code, as well as several model validation efforts. Aeolus wind and dispersion predictions are compared to field data from the Joint Urban Field Trials 2003 conducted in Oklahoma City (Allwine et al 2004), including both continuous and instantaneous releases. Newly implemented Aeolus capabilities include a decay chain model and an explosive Radiological Dispersal Device (RDD) source term; these capabilities are described. Aeolus predictions using the buoyant explosive RDD source are validated against two experimental data sets: the Green Field explosive cloud rise experiments conducted in Israel (Sharon et al 2012) and the Full-Scale RDD Field Trials conducted in Canada (Green et al 2016).


    Energy Technology Data Exchange (ETDEWEB)

    Choi, A.


    Phase 1 of the 2013 Cold Cap Evaluation Furnace (CEF) test was completed on June 3, 2013 after a 5-day round-the-clock feeding and pouring operation. The main goal of the test was to characterize the CEF off-gas produced from a nitric-formic acid flowsheet feed and confirm whether the CEF platform is capable of producing scalable off-gas data necessary for the revision of the DWPF melter off-gas flammability model; the revised model will be used to define new safety controls on the key operating parameters for the nitric-glycolic acid flowsheet feeds, including total organic carbon (TOC). Whether the CEF off-gas data were scalable for the purpose of predicting the potential flammability of the DWPF melter exhaust was determined by comparing the predicted H{sub 2} and CO concentrations using the current DWPF melter off-gas flammability model to those measured during Phase 1; data were deemed scalable if the calculated fractional conversions of TOC-to-H{sub 2} and TOC-to-CO at varying melter vapor space temperatures were found to trend with and bound the respective measured data with some margin of safety. Being scalable thus means that, for a given feed chemistry, the instantaneous flow rates of H{sub 2} and CO in the DWPF melter exhaust can be estimated with some degree of conservatism by multiplying those of the respective gases from a pilot-scale melter by the feed rate ratio. This report documents the results of the Phase 1 data analysis and the necessary calculations performed to determine the scalability of the CEF off-gas data. A total of six steady state runs were made during Phase 1 under non-bubbled conditions by varying the CEF vapor space temperature from near 700 to below 300 °C, as measured in a thermowell (T{sub tw}). At each steady state temperature, the off-gas composition was monitored continuously for two hours using MS, GC, and FTIR in order to track mainly H{sub 2}, CO, CO{sub 2}, NO{sub x}, and organic gases such as CH{sub 4}. The standard

  20. Mathematical modelling of filtration in submerged anaerobic MBRs (SAnMBRs): long-term validation


    Robles Martínez, Ángel; Ruano García, María Victoria; Ribes Bertomeu, José; SECO TORRECILLAS, AURORA; Ferrer, J.


    The aim of this study was the long-term validation of a model capable of reproducing the filtration process occurring in a submerged anaerobic membrane bioreactor (SAnMBR) system. The proposed model was validated using data obtained from a SAnMBR demonstration plant fitted with industrial-scale hollow-fibre membranes. The validation was carried out using both lightly and heavily fouled membranes operating at different bulk concentrations, gas sparging intensities and transmembrane fluxes. Acr...

  1. A free wake vortex lattice model for vertical axis wind turbines: Modeling, verification and validation (United States)

    Meng, Fanzhong; Schwarze, Holger; Vorpahl, Fabian; Strobel, Michael


    Since the 1970s, several research activities have been carried out on developing aerodynamic models for Vertical Axis Wind Turbines (VAWTs). In order to design large VAWTs of MW scale, more accurate aerodynamic calculation is required to predict their aero-elastic behaviour. In this paper, a 3D free wake vortex lattice model for VAWTs is developed, verified and validated. Comparisons to the experimental results show that the 3D free wake vortex lattice model developed is capable of making an accurate prediction of the general performance and the instantaneous aerodynamic forces on the blades. The comparison between the momentum method and the vortex lattice model shows that free wake vortex models are needed for detailed load calculations and for calculating highly loaded rotors.

  2. A Process for Verifying and Validating Requirements for Fault Tolerant Systems Using Model Checking (United States)

    Schneider, F.; Easterbrook, S.; Callahan, J.; Holzmann, G.; Reinholtz, W.; Ko, A.; Shahabuddin, M.


    Model checking is shown to be an effective tool in validating the behavior of a fault tolerant embedded spacecraft controller. The case study presented here shows that by judiciously abstracting away extraneous complexity, the state space of the model could be exhaustively searched, allowing critical functional requirements to be validated down to the design level.

  3. Prediction of 60-Day Case Fatality After Aneurysmal Subarachnoid Hemorrhage : External Validation of a Prediction Model

    NARCIS (Netherlands)

    Dijkland, Simone A.; Roozenbeek, Bob; Brouwer, Patrick A.; Lingsma, Hester F.; Dippel, Diederik W.; Vergouw, Leonie J.; Vergouwen, Mervyn D.; van der Jagt, Mathieu

    OBJECTIVE: External validation of prognostic models is crucial but rarely done. Our aim was to externally validate a prognostic model to predict 60-day case fatality after aneurysmal subarachnoid hemorrhage developed from the International Subarachnoid Aneurysm Trial in a retrospective unselected

  4. Validating the Multidimensional Spline Based Global Aerodynamic Model for the Cessna Citation II

    NARCIS (Netherlands)

    De Visser, C.C.; Mulder, J.A.


    The validation of aerodynamic models created using flight test data is a time consuming and often costly process. In this paper a new method for the validation of global nonlinear aerodynamic models based on multivariate simplex splines is presented. This new method uses the unique properties of the

  5. Bayesian Calibration, Validation and Uncertainty Quantification for Predictive Modelling of Tumour Growth: A Tutorial. (United States)

    Collis, Joe; Connor, Anthony J; Paczkowski, Marcin; Kannan, Pavitra; Pitt-Francis, Joe; Byrne, Helen M; Hubbard, Matthew E


    In this work, we present a pedagogical tumour growth example, in which we apply calibration and validation techniques to an uncertain, Gompertzian model of tumour spheroid growth. The key contribution of this article is the discussion and application of these methods (that are not commonly employed in the field of cancer modelling) in the context of a simple model, whose deterministic analogue is widely known within the community. In the course of the example, we calibrate the model against experimental data that are subject to measurement errors, and then validate the resulting uncertain model predictions. We then analyse the sensitivity of the model predictions to the underlying measurement model. Finally, we propose an elementary learning approach for tuning a threshold parameter in the validation procedure in order to maximize predictive accuracy of our validated model.
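    The calibration step described, fitting an uncertain Gompertzian growth model to noisy measurements, can be illustrated with a one-parameter grid posterior. A minimal sketch under assumed values (the parameterization, data, and noise model below are invented for illustration, not taken from the tutorial):

```python
import math

# Deterministic Gompertz spheroid volume: V(t) = K * exp(ln(V0/K) * exp(-a*t)),
# where a is the growth rate and K the carrying capacity.
def gompertz(t, a, V0=1.0, K=100.0):
    return K * math.exp(math.log(V0 / K) * math.exp(-a * t))

# Synthetic "measurements": the true curve plus small fixed perturbations.
a_true, sigma = 0.3, 2.0
times = [0, 1, 2, 3, 4, 5, 6, 7]
noise = [0.5, -1.0, 0.8, -0.3, 1.2, -0.7, 0.2, -0.5]
data = [gompertz(t, a_true) + e for t, e in zip(times, noise)]

# Grid posterior over the growth rate a (flat prior, Gaussian error model).
grid = [0.05 + 0.005 * i for i in range(120)]
lls = [sum(-(y - gompertz(t, a)) ** 2 / (2 * sigma ** 2)
           for t, y in zip(times, data)) for a in grid]
peak = max(lls)
weights = [math.exp(ll - peak) for ll in lls]     # stabilized likelihoods
total = sum(weights)
post = [w / total for w in weights]               # normalized posterior
a_hat = sum(a * p for a, p in zip(grid, post))    # posterior mean growth rate
```

    Validation in the article's sense would then compare predictions from the calibrated, uncertain model against held-out measurements rather than the data used for calibration.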

  6. Construction and validation of detailed kinetic models for the combustion of gasoline surrogates; Construction et validation de modeles cinetiques detailles pour la combustion de melanges modeles des essences

    Energy Technology Data Exchange (ETDEWEB)

    Touchard, S.


    The irreversible reduction of oil resources, CO{sub 2} emission control and the application of increasingly strict standards on pollutant emissions lead researchers worldwide to work on reducing pollutant formation and improving engine yields, especially by using homogeneous-charge combustion of lean mixtures. The numerical simulation of fuel blend oxidation is an essential tool to study the influence of fuel formulation and engine conditions on auto-ignition and on pollutant emissions. Automatic generation helps to obtain detailed kinetic models, especially at low temperature, where the number of reactions quickly exceeds a thousand. The main purpose of this study is the generation and validation of detailed kinetic models for the oxidation of gasoline blends using the EXGAS software. This work has implied an improvement of the computation rules for thermodynamic and kinetic data, which were validated by numerical simulation using the CHEMKIN II software. A large part of this work has concerned the understanding of the low-temperature oxidation chemistry of C5 and larger alkenes. Low- and high-temperature mechanisms were proposed and validated for 1-pentene, 1-hexene, the binary mixtures 1-hexene/iso-octane, 1-hexene/toluene and iso-octane/toluene, and the ternary mixture 1-hexene/toluene/iso-octane. Simulations were also done for propene, 1-butene and iso-octane with former models including the modifications proposed in this PhD work. While the generated models allowed us to simulate the auto-ignition delays of the studied molecules and blends with good agreement, some uncertainties still remain for some reaction paths leading to the formation of cyclic products in the case of alkene oxidation at low temperature. It would also be interesting to carry on this work for combustion models of gasoline blends at low temperature. (author)

  7. Reliability and validation of a behavioral model of clinical behavioral formulation

    Directory of Open Access Journals (Sweden)

    Amanda M Muñoz-Martínez


    The aim of this study was to determine the reliability and the content and predictive validity of a clinical case formulation developed from a behavioral perspective. A mixed design integrating levels of descriptive analysis and an A-B case study with follow-up was used. The study established the reliability of the following descriptive and explanatory categories: (a) problem description, (b) predisposing factors, (c) precipitating factors, (d) acquisition and (e) inferred mechanism (maintenance). The analysis was performed on cases from 2005 to 2008 formulated with the model derived from the current study. With regard to validity, expert judges considered that the model had content validity. The predictive validity was established through application of the model to three case studies. The discussion shows the importance of extending the investigation with the model to other populations and of establishing the clinical and concurrent validity of the model.

  8. The BSSG rat model of Parkinson's disease: progressing towards a valid, predictive model of disease. (United States)

    Van Kampen, Jackalina M; Robertson, Harold A


    Parkinson's disease (PD) is a neurodegenerative disorder, classically considered a movement disorder. A great deal is known about the anatomical connections and neuropathology and pharmacological changes of PD, as they relate to the loss of dopaminergic function and the appearance of cardinal motor symptoms. Our understanding of the role of dopamine in PD has led to the development of effective pharmacological treatments of the motor symptoms in the form of dopamine replacement therapy using levodopa and dopaminergic agonists. Much of the information concerning these drug treatments has been obtained using classical neurotoxic models that mimic dopamine depletion (e.g., 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine or MPTP, 6-hydroxydopamine, reserpine). However, PD is more than a disorder of the nigrostriatal dopamine pathway. Our understanding of the neuropathology of PD has undergone massive changes, with the discovery that mutations in α-synuclein cause a familial form of PD and that PD pathology may spread, affecting multiple neurotransmitter systems and brain regions. These new developments in our understanding of PD demand that we reconsider our animal models. While classic neurotoxin models have been useful for the development of effective symptomatic treatments for motor manifestations, the paucity of a valid animal model exhibiting the progressive development of multiple key features of PD pathophysiology and phenotype has impeded the search for neuroprotective therapies, capable of slowing or halting disease progression. What characteristics would a good animal model of human PD have? Insofar as possible, a good model would exhibit as many behavioral, anatomical, biochemical, immunological, and pathological changes as are observed in the human condition, developing progressively, with clear, identifiable biomarkers along the way. 
Here, we review the BSSG rat model of PD, a novel environmental model of PD, with strong construct, face, and predictive

  9. Implementation and Validation of IEC Generic Type 1A Wind Turbine Generator Model

    DEFF Research Database (Denmark)

    Zhao, Haoran; Wu, Qiuwei; Margaris, Ioannis


    This paper presents the implementation of the International Electrotechnical Commission (IEC) generic Type 1A wind turbine generator (WTG) model in Power Factory (PF) and the validation of the implemented model against field measurements. The IEC generic Type 1A WTG model structure is briefly...... described. The details are explained regarding how the two-mass mechanical model is implemented when the generator mass is included in the PF built-in generator model. In order to verify the IEC generic Type 1A WTG model, the model-to-field-measurement validation method was employed. The model to field...

  10. An integrated approach for the validation of energy and environmental system analysis models: used in the validation of the flexigas excel biogas model

    NARCIS (Netherlands)

    H.C. Moll; Ir. J. Bekkering; G. Laughs; prof. dr. Wim van Gemert; R.M.J. Benders; J. Holstein; C. van Someren; F. Pierie; Drs. E.J. Hengeveld; W. Liu


    A review has been completed for a verification and validation (V&V) of the (Excel) BioGas simulator or EBS model. The EBS model calculates the environmental impact of biogas production pathways using Material and Energy Flow Analysis, time dependent dynamics, geographic information, and Life Cycle

  11. The Importance of Prediction Model Validation and Assessment in Obesity and Nutrition Research (United States)

    Ivanescu, Andrada E.; Li, Peng; George, Brandon; Brown, Andrew W.; Keith, Scott W.; Raju, Dheeraj; Allison, David B.


    Deriving statistical models to predict one variable from one or more other variables, or predictive modeling, is an important activity in obesity and nutrition research. To determine the quality of the model, it is necessary to quantify and report the predictive validity of the derived models. Conducting validation of the predictive measures provides essential information to the research community about the model. Unfortunately, many articles fail to account for the nearly inevitable reduction in predictive ability that occurs when a model derived on one dataset is applied to a new dataset. Under some circumstances, the predictive validity can be reduced to nearly zero. In this overview, we explain why reductions in predictive validity occur, define the metrics commonly used to estimate the predictive validity of a model (e.g., R2, mean squared error, sensitivity, specificity, receiver operating characteristic, concordance index), and describe methods to estimate the predictive validity (e.g., cross-validation, bootstrap, adjusted and shrunken R2). We emphasize that methods for estimating the expected reduction in predictive ability of a model in new samples are available and this expected reduction should always be reported when new predictive models are introduced. PMID:26449421
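    The shrinkage the authors describe is easy to demonstrate. The sketch below (synthetic data and plain NumPy, not from the article) fits an ordinary least-squares model on one half of a dataset and computes R2 on both halves; the apparent (in-sample) R2 exceeds the out-of-sample R2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 60 observations, 10 noisy predictors, weak true signal.
n, p = 60, 10
X = rng.normal(size=(n, p))
y = 0.5 * X[:, 0] + rng.normal(size=n)

def r2(y_true, y_pred):
    """Coefficient of determination."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Fit ordinary least squares on the first half, evaluate on both halves.
train, test = np.arange(0, 30), np.arange(30, 60)
coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)

r2_train = r2(y[train], X[train] @ coef)   # apparent (in-sample) fit
r2_test = r2(y[test], X[test] @ coef)      # fit in "new" data

# The apparent R^2 is optimistic relative to the out-of-sample R^2.
print(f"apparent R^2 = {r2_train:.2f}, out-of-sample R^2 = {r2_test:.2f}")
```

    Cross-validation or bootstrap methods, as the abstract notes, estimate this expected reduction without holding out a fixed half of the data.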

  12. Integrated corridor management (ICM) analysis, modeling, and simulation (AMS) for Minneapolis site : model calibration and validation report. (United States)


    This technical report documents the calibration and validation of the baseline (2008) mesoscopic model for the I-394 : Minneapolis, Minnesota, Pioneer Site. DynusT was selected as the mesoscopic model for analyzing operating conditions : in the I-394...

  13. Validation and calibration of structural models that combine information from multiple sources. (United States)

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A


    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
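    The view of calibration as estimation can be sketched in a few lines. The example below (a hypothetical one-parameter model and synthetic calibration targets, not taken from the article) selects the parameter value minimizing the squared discrepancy between model outputs and data, i.e., least-squares estimation on a grid.

```python
import numpy as np

# Hypothetical one-parameter structural model (illustrative, not from the
# article): cumulative event probability driven by an unknown rate `theta`.
def model_output(theta, t):
    return 1.0 - np.exp(-theta * t)

# Synthetic calibration targets generated with theta = 0.3 plus small noise.
rng = np.random.default_rng(1)
t_obs = np.array([1.0, 2.0, 5.0, 10.0])
y_obs = model_output(0.3, t_obs) + rng.normal(scale=0.01, size=t_obs.size)

# Calibration viewed as estimation: choose theta minimizing the squared
# discrepancy between model outputs and data (least squares on a grid).
grid = np.linspace(0.01, 1.0, 1000)
sse = [np.sum((model_output(th, t_obs) - y_obs) ** 2) for th in grid]
theta_hat = grid[int(np.argmin(sse))]

print(f"calibrated theta = {theta_hat:.3f}")  # recovers roughly 0.3
```

    With a single identifiable parameter this is well behaved; the identifiability issues the abstract mentions arise when several parameter combinations yield near-identical fits to the targets.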

  14. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation (United States)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana


    A simulation model of a parabolic-trough solar collector developed in Modelica® language is calibrated and validated. The calibration is performed in order to approximate the behavior of the solar collector model to a real one due to the uncertainty in some of the system parameters, i.e. measured data is used during the calibration process. Afterwards, the validation of this calibrated model is done. During the validation, the results obtained from the model are compared to the ones obtained during real operation in a collector from the Plataforma Solar de Almeria (PSA).

  15. A formal algorithm for verifying the validity of clustering results based on model checking. (United States)

    Huang, Shaobin; Cheng, Yuan; Lang, Dapeng; Chi, Ronghua; Liu, Guofeng


    The limitations in general methods to evaluate clustering will remain difficult to overcome if verifying the clustering validity continues to be based on clustering results and evaluation index values. This study focuses on a clustering process to analyze crisp clustering validity. First, we define the properties that must be satisfied by valid clustering processes and model clustering processes based on program graphs and transition systems. We then recast the analysis of clustering validity as the problem of verifying whether the model of clustering processes satisfies the specified properties with model checking. That is, we try to build a bridge between clustering and model checking. Experiments on several datasets indicate the effectiveness and suitability of our algorithms. Compared with traditional evaluation indices, our formal method can not only indicate whether the clustering results are valid but, in the case the results are invalid, can also detect the objects that have led to the invalidity.
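    The idea of checking properties of a clustering process rather than scoring its result can be illustrated with a toy trace (a hypothetical state encoding, not the authors' program-graph construction): each state maps objects to cluster labels, validity properties are checked over every reachable state, and objects causing a violation are returned as witnesses.

```python
# Toy trace of a clustering process (hypothetical state encoding, not the
# authors' program-graph construction). Each state maps object -> cluster id,
# with None meaning "not yet assigned".
trace = [
    {"a": None, "b": None, "c": None},   # initial state
    {"a": 0,    "b": None, "c": None},
    {"a": 0,    "b": 1,    "c": None},
    {"a": 0,    "b": 1,    "c": 0},      # final state
]

def objects_violating(trace):
    """Objects left unassigned in the final state: witnesses of invalidity."""
    final = trace[-1]
    return [obj for obj, cluster in final.items() if cluster is None]

def monotone_assignment(trace):
    """Safety property: once assigned, an object never changes cluster."""
    for before, after in zip(trace, trace[1:]):
        for obj in before:
            if before[obj] is not None and before[obj] != after[obj]:
                return False
    return True

bad_trace = trace[:-1]                    # a run that stops before assigning "c"
print(objects_violating(trace))           # → []
print(monotone_assignment(trace))         # → True
print(objects_violating(bad_trace))       # → ['c']
```

    A real model checker would verify such properties over all runs of a transition system, not a single trace, but the Pass/Fail-with-witnesses output is the same in spirit.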

  16. Some guidance on preparing validation plans for the DART Full System Models.

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy (Sandia National Laboratories, Albuquerque, NM)


    Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

  17. A formal algorithm for verifying the validity of clustering results based on model checking.

    Directory of Open Access Journals (Sweden)

    Shaobin Huang

    Full Text Available The limitations in general methods to evaluate clustering will remain difficult to overcome if verifying the clustering validity continues to be based on clustering results and evaluation index values. This study focuses on a clustering process to analyze crisp clustering validity. First, we define the properties that must be satisfied by valid clustering processes and model clustering processes based on program graphs and transition systems. We then recast the analysis of clustering validity as the problem of verifying whether the model of clustering processes satisfies the specified properties with model checking. That is, we try to build a bridge between clustering and model checking. Experiments on several datasets indicate the effectiveness and suitability of our algorithms. Compared with traditional evaluation indices, our formal method can not only indicate whether the clustering results are valid but, in the case the results are invalid, can also detect the objects that have led to the invalidity.

  18. Validations of Computational Weld Models: Comparison of Residual Stresses (United States)


    ...validate the ability of the computational model to calculate residual stresses in structures repaired by this type of overlay welding... computational model. Hardness measurements of the plate suggest that this property varies greatly in space, i.e., that it is

  19. Blast Load Simulator Experiments for Computational Model Validation: Report 2 (United States)


    uncertainty information in the form of confidence intervals for peak pressure and impulse result in a data set that can be used to evaluate the...simulations of these explosive events and their effects. These codes are continuously improving, but still require validation against experimental data to... data at several locations on the surfaces of the structure. The BLS is a highly tunable compressed gas-driven, closed-end shock tube designed to

  20. Validation of the Revised WAsP Park Model

    DEFF Research Database (Denmark)

    Rathmann, Ole Steen; Hansen, Brian Ohrbeck; Leon, J.P. Murcia

    The DTU Wind Energy wind-resource model WAsP contains a wind farm wake model, Park (Park1). This Park model has been revised as Park2 to improve prediction accuracy in large wind farms, based on sound physical and mathematical principles: consistent wake-modelling and perturbation theory for wak...

  1. Principle and validation of modified hysteretic models for magnetorheological dampers (United States)

    Bai, Xian-Xu; Chen, Peng; Qian, Li-Jun


    Magnetorheological (MR) dampers, semi-active actuators for vibration and shock control systems, have attracted increasing attention during the past two decades. However, it is difficult to establish a precise mathematical model for the MR dampers and their control systems due to their intrinsic strong nonlinear hysteretic behavior. A phenomenological model based on the Bouc-Wen model can be used to effectively describe the nonlinear hysteretic behavior of the MR dampers, but the structure of the phenomenological model is complex and the Bouc-Wen model is functionally redundant. In this paper, based on the phenomenological model, (1) a normalized phenomenological model is derived through incorporating a ‘normalization’ concept, and (2) a restructured model, also incorporating the ‘normalization’ concept, is proposed and realized. In order to demonstrate this, a multi-islands genetic algorithm (GA) is employed to identify the parameters of the restructured model, the normalized phenomenological model, and the phenomenological model. The performance of the three models for describing and predicting the damping force characteristics of the MR dampers are compared and analyzed using the identified parameters. The research results indicate that, as compared with the phenomenological model and the normalized phenomenological model, (1) the restructured model can not only effectively decrease the number of the model parameters and reduce the complexity of the model, but can also describe the nonlinear hysteretic behavior of MR dampers more accurately, and (2) the meanings of several model parameters of the restructured model are clearer and the initial ranges of the model parameters are more explicit, which is of significance for parameter identification.
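    The Bouc-Wen hysteresis underlying the phenomenological model can be sketched as follows (generic textbook form with illustrative parameter values, not the paper's restructured model or its identified parameters): the evolutionary variable z is integrated alongside a prescribed sinusoidal displacement, and the damping force combines viscous, stiffness, and hysteretic terms.

```python
import numpy as np

# Generic Bouc-Wen hysteresis (textbook form) with illustrative parameters,
# not the paper's restructured model or its identified values.
A, beta, gamma, n = 1.0, 0.5, 0.5, 2          # Bouc-Wen shape parameters
c0, k0, alpha = 50.0, 100.0, 500.0            # viscous, stiffness, hysteresis scale

dt = 1e-4
t = np.arange(0.0, 2.0, dt)
x = 0.01 * np.sin(2.0 * np.pi * t)            # prescribed displacement [m]
xdot = np.gradient(x, dt)                     # velocity [m/s]

# Forward-Euler integration of the evolutionary variable z.
z = np.zeros_like(t)
for i in range(len(t) - 1):
    zdot = (A * xdot[i]
            - beta * abs(xdot[i]) * abs(z[i]) ** (n - 1) * z[i]
            - gamma * xdot[i] * abs(z[i]) ** n)
    z[i + 1] = z[i] + zdot * dt

# Damping force: viscous + stiffness + hysteretic components.
force = c0 * xdot + k0 * x + alpha * z
```

    Plotting force against x traces the familiar hysteresis loop; parameter identification (e.g., by a genetic algorithm, as in the paper) would tune A, beta, gamma, n, c0, k0, and alpha against measured damper forces.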

  2. Model Validation and Verification of Data Mining from the Knowledge Workers Productivity Approach

    National Research Council Canada - National Science Library

    Najafi, A


    ... to present a hybrid method for Model Validation and Verification of Data Mining from the Knowledge Workers Productivity Approach. It is hoped that this paper will help managers to implement different corresponding measures. A case study is presented in which this model is applied and validated at the Alupan company. There are two viewpoints regarding the knowledge workers' productivity model: the public and the specialized. According to the specialized viewpoint,...

  3. The Sandia MEMS Passive Shock Sensor : FY08 testing for functionality, model validation, and technology readiness.

    Energy Technology Data Exchange (ETDEWEB)

    Walraven, Jeremy Allen; Blecke, Jill; Baker, Michael Sean; Clemens, Rebecca C.; Mitchell, John Anthony; Brake, Matthew Robert; Epp, David S.; Wittwer, Jonathan W.


    This report summarizes the functional, model validation, and technology readiness testing of the Sandia MEMS Passive Shock Sensor in FY08. Functional testing of a large number of revision 4 parts showed robust and consistent performance. Model validation testing helped tune the models to match data well and identified several areas for future investigation related to high frequency sensitivity and thermal effects. Finally, technology readiness testing demonstrated the integrated elements of the sensor under realistic environments.

  4. Aggregating validity indicators: The salience of domain specificity and the indeterminate range in multivariate models of performance validity assessment. (United States)

    Erdodi, Laszlo A


    This study was designed to examine the "domain specificity" hypothesis in performance validity tests (PVTs) and the epistemological status of an "indeterminate range" when evaluating the credibility of a neuropsychological profile using a multivariate model of performance validity assessment. While previous research suggests that aggregating PVTs produces superior classification accuracy compared to individual instruments, the effect of the congruence between the criterion and predictor variable on signal detection and the issue of classifying borderline cases remain understudied. Data from a mixed clinical sample of 234 adults referred for cognitive evaluation (mean age = 46.6 years; mean education = 13.5 years) were collected. Two validity composites were created: one based on five verbal PVTs (EI-5VER) and one based on five nonverbal PVTs (EI-5NV), each compared against several other PVTs. Overall, language-based tests of cognitive ability were more sensitive to elevations on the EI-5VER than visual-perceptual tests, whereas the opposite was observed with the EI-5NV. However, the match between predictor and criterion variable had a more complex relationship with classification accuracy, suggesting the confluence of multiple factors (sensory modality, cognitive domain, testing paradigm). An "indeterminate range" of performance validity emerged that was distinctly different from both the Pass and the Fail group. Trichotomized criterion PVTs (Pass-Borderline-Fail) had a negative linear relationship with performance on tests of cognitive ability, providing further support for an "in-between" category separating the unequivocal Pass and unequivocal Fail classification ranges. The choice of criterion variable can influence classification accuracy in PVT research. Establishing a Borderline range between Pass and Fail more accurately reflected the distribution of scores on multiple PVTs. The traditional binary classification system imposes an artificial dichotomy on PVTs that
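    A trichotomized classification of the kind the study supports can be sketched minimally (the cutoffs below are illustrative, not the EI-5 scoring rules from the study):

```python
# Hypothetical trichotomized classifier over a five-PVT composite
# (cutoffs are illustrative, not the EI-5 scoring rules from the study).
def classify_profile(pvt_failures):
    """Map the number of failed PVTs (0-5) to a validity classification."""
    if pvt_failures == 0:
        return "Pass"
    if pvt_failures == 1:
        return "Borderline"   # the indeterminate range between Pass and Fail
    return "Fail"

print([classify_profile(k) for k in (0, 1, 3, 5)])
# → ['Pass', 'Borderline', 'Fail', 'Fail']
```

    The point of the Borderline band is precisely that a single discrepant indicator is treated as neither an unequivocal Pass nor an unequivocal Fail.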

  5. Validation of crop weather models for crop assessment and yield ...

    African Journals Online (AJOL)

    Input for the models comprised weather, crop and soil data collected from five selected stations. Simulation results show that the IRSIS model tends to over-predict grain yields of maize, sorghum and wheat, a fact that could be attributed to the inadequacy of the model in accurately accounting for rainfall excess. On the other hand ...

  6. Estimation and Q-Matrix Validation for Diagnostic Classification Models (United States)

    Feng, Yuling


    Diagnostic classification models (DCMs) are structured latent class models widely discussed in the field of psychometrics. They model subjects' underlying attribute patterns and classify subjects into unobservable groups based on their mastery of attributes required to answer the items correctly. The effective implementation of DCMs depends…

  7. Validity of covariance models for the analysis of geographical variation

    DEFF Research Database (Denmark)

    Guillot, Gilles; Schilling, Rene L.; Porcu, Emilio


    1. Due to the availability of large molecular data-sets, covariance models are increasingly used to describe the structure of genetic variation as an alternative to more heavily parametrised biological models. 2. We focus here on a class of parametric covariance models that received sustained...

  8. Validation of smoke plume rise models using ground based lidar (United States)

    Cyle E. Wold; Shawn Urbanski; Vladimir Kovalev; Alexander Petkov; Wei Min Hao


    Biomass fires can significantly degrade regional air quality. Plume rise height is one of the critical factors determining the impact of fire emissions on air quality. Plume rise models are used to prescribe the vertical distribution of fire emissions which are critical input for smoke dispersion and air quality models. The poor state of model evaluation is due in...

  9. Rorschach score validation as a model for 21st-century personality assessment. (United States)

    Bornstein, Robert F


    Recent conceptual and methodological innovations have led to new strategies for documenting the construct validity of test scores, including performance-based test scores. These strategies have the potential to generate more definitive evidence regarding the validity of scores derived from the Rorschach Inkblot Method (RIM) and help resolve some long-standing controversies regarding the clinical utility of the Rorschach. After discussing the unique challenges in studying the Rorschach and why research in this area is important given current trends in scientific and applied psychology, I offer 3 overarching principles to maximize the construct validity of RIM scores, arguing that (a) the method that provides RIM validation measures plays a key role in generating outcome predictions; (b) RIM variables should be linked with findings from neighboring subfields; and (c) rigorous RIM score validation includes both process-focused and outcome-focused assessments. I describe a 4-step strategy for optimal RIM score derivation (formulating hypotheses, delineating process links, generating outcome predictions, and establishing limiting conditions); and a 4-component template for RIM score validation (establishing basic psychometrics, documenting outcome-focused validity, assessing process-focused validity, and integrating outcome- and process-focused validity data). The proposed framework not only has the potential to enhance the validity and utility of the RIM, but might ultimately enable the RIM to become a model of test score validation for 21st-century personality assessment.

  10. Validation of coastal oceanographic models at Forsmark. Site descriptive modelling SDM-Site Forsmark

    Energy Technology Data Exchange (ETDEWEB)

    Engqvist, Anders (A och I Engqvist Konsult HB, Vaxholm (SE)); Andrejev, Oleg (Finnish Inst. of Marine Research, Helsinki (FI))


    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterisation at two different locations, the Forsmark and the Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. The characterisation work is divided into an initial site investigation phase and a complete site investigation phase. In this context, the water exchange of the coastal zone is one link of the chain of possible nuclide transport mechanisms that must be assessed in the site description of potential repository areas. For the purpose of validating the pair of nested 3D-models employed to simulate the water exchange in the near-shore coastal zone in the Forsmark area, an encompassing measurement program entailing six stations has been performed. The design of this program was to first assess to what degree the forcing of the fine resolution (FR) model of the Forsmark study area at its interfacial boundary to the coarse resolution (CR) model of the entire Baltic was reproduced. In addition to this scrutiny it is of particular interest how the time-varying density-determining properties, salinity and temperature, at the borders are propagated into the FR-domain, since this corresponds to the most efficient mode of water exchange. An important part of the validation process has been to carefully evaluate which measurement data that can be considered reliable. The result was that several periods of foremost near-surface salinity data had to be discarded due to growth of algae on the conductivity sensors. Lack of thorough absolute calibration of the salinity meters also necessitates dismissal of measurement data. 
Relative to the assessed data that can be accepted as adequate, the outcome of the validation can be summarized in five points: (i) The surface-most salinity of the CR-model drifts downward a little less than one practical salinity unit (psu) per year, requiring that the ensuing correlation analysis be subdivided into periods of a

  11. Measuring Students' Motivation: Validity Evidence for the MUSIC Model of Academic Motivation Inventory (United States)

    Jones, Brett D.; Skaggs, Gary


    This study provides validity evidence for the MUSIC Model of Academic Motivation Inventory (MUSIC Inventory; Jones, 2012), which measures college students' beliefs related to the five components of the MUSIC Model of Motivation (MUSIC model; Jones, 2009). The MUSIC model is a conceptual framework for five categories of teaching strategies (i.e.,…

  12. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data

    NARCIS (Netherlands)

    Corro Ramos, Isaac; van Voorn, George A.K.; Vemer, Pepijn; Feenstra, Talitha L.; Al, Maiwenn J.


    Background: The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective

  13. Improving external and internal validity of a model of midlife women's maternal-student role stress. (United States)

    Gigliotti, Eileen


    This study's purpose was to improve both the external and internal validity of Gigliotti's model of maternal-student role stress in midlife women. It was found that the model is generalizable to midlife women who are non-degreed undergraduate students of varied marital and immigration status and varied college major, thus improving external validity. It was also found that investigating particular types of children's situation-specific social support improved internal validity. These results support the propositions of the Neuman systems model as well as Meleis and others' transition framework. An interventional study is proposed as the next step in this program of research.

  14. Validation of coastal oceanographic models at Laxemar-Simpevarp. Site descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    Engqvist, Anders (A och I Engqvist Konsult HB, Vaxholm (SE)); Andrejev, Oleg (Finnish Inst. of Marine Research, Helsinki (FI))


    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterization at two different locations, the Forsmark and the Laxemar-Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. The characterization work is divided into an initial site investigation phase and a complete site investigation phase. In this context, the water exchange of the coastal zone is one link of the chain of possible nuclide transport mechanisms that must be assessed in the site description of potential repository areas. For the purpose of validating the pair of nested 3D-models and the coupled discrete basin (CDB-) model employed to simulate the water exchange in the near-shore coastal zone in the Laxemar-Simpevarp area, an encompassing measurement program entailing data from six stations (of which two are close) has been performed. The design of this program was to first assess to what degree the forcing of the fine resolution (FR-) model of the Laxemar- Simpevarp study area at its interfacial boundary to the coarse resolution (CR-) model of the entire Baltic was reproduced. In addition to this, it is of particular interest how the time-varying density-determining properties, salinity and temperature, at the borders are propagated into the FR-domain and further influence the water exchange with the interior, more secluded, basins. An important part of the validation process has been to carefully evaluate which measurement data that can be considered reliable. The result was that some periods of foremost near-surface salinity data had to be discarded due to growth of algae on the conductivity sensors. Interference with ship traffic and lack of absolute calibration of the salinity meters necessitated dismissal of measurement data too. In this study so-called Mesan data have been consistently used for the meteorological forcing of the 3D-models. Relative to the assessed data that can be accepted as adequate, the outcome of the

  15. Validating a perceptual distraction model in a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren


    This paper focuses on validating a perceptual distraction model, which aims to predict user’s perceived distraction caused by audio-on-audio interference, e.g., two competing audio sources within the same listening space. Originally, the distraction model was trained with music-on-music stimuli...... that the model performance is equally good in both zones, i.e., with both speech-on-music and music-on-speech stimuli, and comparable to the previous validation round (RMSE approximately 10%). The results further confirm that the distraction model can be used as a valuable tool in evaluating and optimizing...... using a simple loudspeaker setup, consisting of only two loudspeakers, one for the target sound source and the other for the interfering sound source. Recently, the model was successfully validated in a complex personal sound-zone system with speech-on-music stimuli. Second round of validations were...

  16. Validating a perceptual distraction model using a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren


    This paper focuses on validating a perceptual distraction model, which aims to predict user's perceived distraction caused by audio-on-audio interference. Originally, the distraction model was trained with music targets and interferers using a simple loudspeaker setup, consisting of only two...... loudspeakers. Recently, the model was successfully validated in a complex personal sound-zone system with music targets and speech interferers. In this paper, a second round of validations were conducted by physically altering the sound-zone system and running listening experiments utilizing both of the two...... sound zones within the sound-zone system. Thus, validating the model using a different sound-zone system with both speech-on-music and music-on-speech stimuli sets. The results show that the model performance is equally good in both zones, i.e., with both speech- on-music and music-on-speech stimuli...

  17. Description of a Website Resource for Turbulence Modeling Verification and Validation (United States)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.


    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  18. Validating a perceptual distraction model using a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren


    , and comparable to the previous validation round. The calculated root mean squared errors in Zone A and B were 11.3% and 10.4%, respectively, compared to the 11.0% from the previous validation round. The results further confirm that the distraction model can be used as a valuable tool in evaluating and optimizing...

  19. External Validation of Prediction Models for Pneumonia in Primary Care Patients with Lower Respiratory Tract Infection

    DEFF Research Database (Denmark)

    Schierenberg, Alwin; Minnaard, Margaretha C; Hopstaken, Rogier M


    for prediction of pneumonia in primary care were externally validated in the individual patient data (IPD) of previously performed diagnostic studies. METHODS AND FINDINGS: S&S models for diagnosing pneumonia in adults presenting to primary care with lower respiratory tract infection and IPD for validation were...

  20. An improved snow scheme for the ECMWF land surface model: Description and offline validation (United States)

    Emanuel Dutra; Gianpaolo Balsamo; Pedro Viterbo; Pedro M. A. Miranda; Anton Beljaars; Christoph Schär; Kelly Elder


    A new snow scheme for the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model has been tested and validated. The scheme includes a new parameterization of snow density, incorporating a liquid water reservoir, and revised formulations for the subgrid snow cover fraction and snow albedo. Offline validation (covering a wide range of spatial and...
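    The kinds of parameterizations mentioned can be illustrated generically (the formulas below are common textbook forms with made-up constants, not the ECMWF scheme's actual equations): a subgrid snow cover fraction that saturates with snow water equivalent, and an exponential albedo decay for aging snow.

```python
import numpy as np

# Illustrative, generic snow parameterizations (not the ECMWF scheme's
# actual formulas or constants).
SWE_CRIT = 15.0          # SWE [kg m-2] at which the grid box is fully covered
ALB_MAX, ALB_MIN = 0.85, 0.50
TAU = 10.0               # albedo e-folding time for aging snow [days]

def snow_cover_fraction(swe):
    """Subgrid snow cover fraction from snow water equivalent (SWE)."""
    return min(1.0, swe / SWE_CRIT)

def snow_albedo(age_days):
    """Exponential decay of snow albedo with snow age."""
    return ALB_MIN + (ALB_MAX - ALB_MIN) * np.exp(-age_days / TAU)

print(snow_cover_fraction(7.5), round(snow_albedo(0.0), 2))
```

    Offline validation of such a scheme means forcing these equations with observed meteorology at sites with measured snow depth and albedo, outside the full atmospheric model.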

  1. Evaluating the Social Validity of the Early Start Denver Model: A Convergent Mixed Methods Study (United States)

    Ogilvie, Emily; McCrudden, Matthew T.


    An intervention has social validity to the extent that it is socially acceptable to participants and stakeholders. This pilot convergent mixed methods study evaluated parents' perceptions of the social validity of the Early Start Denver Model (ESDM), a naturalistic behavioral intervention for children with autism. It focused on whether the parents…

  2. Experimental validation of a Bayesian model of visual acuity.

    LENUS (Irish Health Repository)

    Dalimier, Eugénie


    Based on standard procedures used in optometry clinics, we compare measurements of visual acuity for 10 subjects (11 eyes tested) in the presence of natural ocular aberrations and different degrees of induced defocus, with the predictions given by a Bayesian model customized with aberrometric data of the eye. The absolute predictions of the model, without any adjustment, show good agreement with the experimental data, in terms of correlation and absolute error. The efficiency of the model is discussed in comparison with image quality metrics and other customized visual process models. An analysis of the importance and customization of each stage of the model is also given; it stresses the potential high predictive power from precise modeling of ocular and neural transfer functions.

  3. Atmospheric Dispersion Model Validation in Low Wind Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Sawyer, Patrick


    Atmospheric plume dispersion models are used for a variety of purposes, including emergency planning and response to hazardous material releases, determining force protection actions in the event of a Weapons of Mass Destruction (WMD) attack, and locating sources of pollution. This study provides a review of previous studies that examine the accuracy of atmospheric plume dispersion models for chemical releases. It considers the principles used to derive air dispersion plume models and looks at three specific models currently in use: Aerial Location of Hazardous Atmospheres (ALOHA), Emergency Prediction Information Code (EPIcode) and Second Order Closure Integrated Puff (SCIPUFF). Results from this study indicate an over-prediction bias by the EPIcode and SCIPUFF models and an under-prediction bias by the ALOHA model. The experimental parameters covered near-field dispersion (less than 100 meters) in low wind speed conditions (less than 2 meters per second).
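
    The over- and under-prediction biases reported for ALOHA, EPIcode and SCIPUFF are commonly quantified with the fractional bias (FB) statistic. A minimal sketch, using made-up mean concentrations rather than values from the study:

```python
# Fractional bias (FB), a standard metric for a dispersion model's tendency
# to over- or under-predict. FB > 0: under-prediction; FB < 0: over-prediction.
def fractional_bias(observed_mean, predicted_mean):
    return 2.0 * (observed_mean - predicted_mean) / (observed_mean + predicted_mean)

# Illustrative (made-up) mean concentrations:
print(fractional_bias(1.0, 0.5))  # positive FB: model under-predicts
print(fractional_bias(1.0, 2.0))  # negative FB: model over-predicts
```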

  4. The international coordination of climate model validation and intercomparison

    Energy Technology Data Exchange (ETDEWEB)

    Gates, W.L. [Lawrence Livermore National Lab., Livermore, CA (United States). Program for Climate Model Diagnosis and Intercomparison]


    Climate modeling, whereby basic physical laws are used to integrate the physics and dynamics of climate into a consistent system, plays a key role in climate research. Depending upon the portion(s) of the climate system being considered, climate models range from those concerned only with the equilibrium globally-averaged surface temperature to those depicting the 3-dimensional time-dependent evolution of the coupled atmosphere, ocean, sea ice and land surface. Here only the latter class of models is considered; these are commonly known as general circulation models (GCMs). (author)

  5. Root zone water quality model (RZWQM2): Model use, calibration and validation (United States)

    Ma, Liwang; Ahuja, Lajpat; Nolan, B.T.; Malone, Robert; Trout, Thomas; Qi, Z.


    The Root Zone Water Quality Model (RZWQM2) has been used widely for simulating agricultural management effects on crop production and soil and water quality. Although it is a one-dimensional model, it has many desirable features for the modeling community. This article outlines the principles of calibrating the model component by component with one or more datasets and validating the model with independent datasets. Users should consult the RZWQM2 user manual distributed along with the model and a more detailed protocol on how to calibrate RZWQM2 provided in a book chapter. Two case studies (or examples) are included in this article. One is from an irrigated maize study in Colorado, illustrating the effect of field- and laboratory-measured soil hydraulic properties on simulated soil water and crop production. It also demonstrates the interaction between soil and plant parameters in simulated plant responses to water stresses. The other is from a maize-soybean rotation study in Iowa, showing a manual calibration of the model for crop yield, soil water, and N leaching in tile-drained soils. Although the commonly used trial-and-error calibration method works well for experienced users, as shown in the second example, an automated calibration procedure is more objective, as shown in the first example. Furthermore, the incorporation of the Parameter Estimation Software (PEST) into RZWQM2 made the calibration of the model more efficient than a grid (ordered) search of model parameters. In addition, PEST provides sensitivity and uncertainty analyses that should help users in selecting the right parameters to calibrate.

  6. Validation of Goudriaan's model: a case study for maize.

    NARCIS (Netherlands)

    Singh, R.S.; Jacobs, A.F.G.


    The crop microclimate model of Goudriaan was tested, using data collected in a maize field in the Netherlands during one day in summer 1986. Except for the soil heat flux, latent and sensible heat fluxes were simulated reasonably well. Goudriaan's model overestimated the latent and sensible heat…

  7. Design and validation of a relative trust model

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van; Treur, J.


    When considering intelligent agents that interact with humans, having an idea of the trust levels of the human, for example in other agents or services, can be of great importance. Most models of human trust that exist assume trust in one trustee is independent of trust in another trustee. The model…

  8. Validation and Verification of LADEE Models and Software (United States)

    Gundy-Burlet, Karen


    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real-time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational science models are used to characterize the spacecraft simulation. A lightweight set of formal-methods analyses, static analysis, formal inspection and code-coverage analyses is utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  9. Modeling Enterprise Authorization: A Unified Metamodel and Initial Validation

    Directory of Open Access Journals (Sweden)

    Matus Korman


    Full Text Available Authorization and its enforcement, access control, have stood at the beginning of the art and science of information security, and remain a crucial pillar of security in information technology (IT) and enterprise operations. Dozens of different models of access control have been proposed. Although the discipline of Enterprise Architecture strives to support the management of IT, support for modeling access policies in enterprises is often lacking, both in terms of supporting the variety of individual access-control models now in use and in terms of providing a unified ontology capable of flexibly expressing access policies for all or most of those models. This study summarizes a number of existing models of access control, proposes a unified metamodel mapped to ArchiMate, and illustrates its use on a selection of example scenarios and two business cases.

  10. Validation of transpulmonary thermodilution cardiac output measurement in a pediatric animal model.

    NARCIS (Netherlands)

    Lemson, J.; Boode, W.P. de; Hopman, J.C.W.; Singh, S.K.; Hoeven, J.G. van der


    OBJECTIVE: This study was undertaken to validate the transpulmonary thermodilution cardiac output measurement (CO(TPTD)) in a controlled newborn animal model under various hemodynamic conditions with special emphasis on low cardiac output. DESIGN: Prospective, experimental, pediatric animal study.

  11. Validating a perceptual distraction model in a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren


    This paper focuses on validating a perceptual distraction model, which aims to predict a user's perceived distraction caused by audio-on-audio interference, e.g., two competing audio sources within the same listening space. Originally, the distraction model was trained with music-on-music stimuli...... using a simple loudspeaker setup, consisting of only two loudspeakers, one for the target sound source and the other for the interfering sound source. Recently, the model was successfully validated in a complex personal sound-zone system with speech-on-music stimuli. A second round of validations was...... conducted by physically altering the sound-zone system and running a set of new listening experiments utilizing two sound zones within the sound-zone system, thus validating the model using a different sound-zone system with both speech-on-music and music-on-speech stimuli sets. Preliminary results show...

  12. Turbulent Scalar Transport Model Validation for High Speed Propulsive Flows Project (United States)

    National Aeronautics and Space Administration — This effort entails the validation of a RANS turbulent scalar transport model (SFM) for high speed propulsive flows, using new experimental data sets and...

  13. Nearshore Tsunami Inundation Model Validation: Toward Sediment Transport Applications (United States)

    Apotsos, Alex; Buckley, Mark; Gelfenbaum, Guy; Jaffe, Bruce; Vatvani, Deepak


    Model predictions from a numerical model, Delft3D, based on the nonlinear shallow water equations are compared with analytical results and laboratory observations from seven tsunami-like benchmark experiments, and with field observations from the 26 December 2004 Indian Ocean tsunami. The model accurately predicts the magnitude and timing of the measured water levels and flow velocities, as well as the magnitude of the maximum inundation distance and run-up, for both breaking and non-breaking waves. The shock-capturing numerical scheme employed describes well the total decrease in wave height due to breaking, but does not reproduce the observed shoaling near the break point. The maximum water levels observed onshore near Kuala Meurisi, Sumatra, following the 26 December 2004 tsunami are well predicted given the uncertainty in the model setup. The good agreement between the model predictions and the analytical results and observations demonstrates that the numerical solution and wetting and drying methods employed are appropriate for modeling tsunami inundation for breaking and non-breaking long waves. Extension of the model to include sediment transport may be appropriate for long, non-breaking tsunami waves. Using available sediment transport formulations, the sediment deposit thickness at Kuala Meurisi is predicted generally within a factor of 2.

  14. Medium term hurricane catastrophe models: a validation experiment (United States)

    Bonazzi, Alessandro; Turner, Jessica; Dobbin, Alison; Wilson, Paul; Mitas, Christos; Bellone, Enrica


    Climate variability is a major source of uncertainty for the insurance industry underwriting hurricane risk. Catastrophe models provide their users with a stochastic set of events that expands the scope of the historical catalogue by including synthetic events that are likely to happen in a defined time-frame. The use of these catastrophe models is widespread in the insurance industry, but it is only in recent years that climate variability has been explicitly accounted for. In insurance parlance, "medium term catastrophe model" refers to products that provide an adjusted view of risk meant to represent hurricane activity on a 1 to 5 year horizon, as opposed to long term models that integrate across the climate variability of the longest available time series of observations. In this presentation we discuss how a simple reinsurance program can be used to assess the value of medium term catastrophe models. We elaborate on similar concepts as discussed in "Potential Economic Value of Seasonal Hurricane Forecasts" by Emanuel et al. (2012, WCAS) and provide an example based on 24 years of historical data of the Chicago Mercantile Hurricane Index (CHI), an insured loss proxy. The profit and loss volatility of a hypothetical primary insurer is used to score medium term models against their long term counterpart. Results show that medium term catastrophe models could help a hypothetical primary insurer improve their financial resiliency to varying climate conditions.

  15. Animal models of post-traumatic stress disorder: face validity

    Directory of Open Access Journals (Sweden)



    Full Text Available Post-traumatic stress disorder (PTSD is a debilitating condition that develops in a proportion of individuals following a traumatic event. Despite recent advances, ethical limitations associated with human research impede progress in understanding PTSD. Fortunately, much effort has focused on developing animal models to help study the pathophysiology of PTSD. Here, we provide an overview of animal PTSD models where a variety of stressors (physical, psychosocial, or psychogenic are used to examine the long-term effects of severe trauma. We emphasize models involving predator threat because they reproduce human individual differences in susceptibility to, and in the long-term consequences of, psychological trauma.

  16. Animal models of post-traumatic stress disorder: face validity (United States)

    Goswami, Sonal; Rodríguez-Sierra, Olga; Cascardi, Michele; Paré, Denis


    Post-traumatic stress disorder (PTSD) is a debilitating condition that develops in a proportion of individuals following a traumatic event. Despite recent advances, ethical limitations associated with human research impede progress in understanding PTSD. Fortunately, much effort has focused on developing animal models to help study the pathophysiology of PTSD. Here, we provide an overview of animal PTSD models where a variety of stressors (physical, psychosocial, or psychogenic) are used to examine the long-term effects of severe trauma. We emphasize models involving predator threat because they reproduce human individual differences in susceptibility to, and in the long-term consequences of, psychological trauma. PMID:23754973

  17. Validation of Inhibition Effect in the Cellulose Hydrolysis: a Dynamic Modelling Approach

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Tsai, Chien-Tai; Meyer, Anne S.


    Enzymatic hydrolysis is one of the main steps in the processing of bioethanol from lignocellulosic raw materials. However, complete understanding of the underlying phenomena is still under development. Hence, this study has focused on validation of the inhibition effects in the cellulosic biomass...... hydrolysis employing a dynamic mathematical model. A systematic framework for parameter estimation is used for model validation, which helps overcome the problem of parameter correlation. Data sets obtained from carefully designed enzymatic cellulose and cellobiose hydrolysis experiments, were used...

  18. Validation of Hydrodynamic Numerical Model of a Pitching Wave Energy Converter

    DEFF Research Database (Denmark)

    López, Maria del Pilar Heras; Thomas, Sarah; Kramer, Morten Mejlhede


    Validation of numerical models is essential in the development of new technologies. Commercial software and codes available for simulating wave energy converters (WECs) have not yet been proven to work for all available and upcoming technologies. The present paper presents the first stages...... of the validation process of a hydrodynamic numerical model for a pitching wave energy converter. The development of dry tests, wave flume and wave basin experiments is explained, lessons learned are shared and results are presented....

  19. Validation of theoretical models through measured pavement response

    DEFF Research Database (Denmark)

    Ullidtz, Per


    Most models for structural evaluation of pavements are of the analytical-empirical type. An analytical model, derived from solid mechanics, is used to calculate stresses or strains at critical positions, and these stresses or strains are then used with empirical relationships to predict pavement...... mechanics was quite different from the measured stress, the peak theoretical value being only half of the measured value.On an instrumented pavement structure in the Danish Road Testing Machine, deflections were measured at the surface of the pavement under FWD loading. Different analytical models were...... then used to derive the elastic parameters of the pavement layeres, that would produce deflections matching the measured deflections. Stresses and strains were then calculated at the position of the gauges and compared to the measured values. It was found that all analytical models would predict the tensile...

  20. Models, validation, and applied geochemistry: Issues in science, communication, and philosophy (United States)

    Nordstrom, D. Kirk


    Models have become so fashionable that many scientists and engineers cannot imagine working without them. The predominant use of computer codes to execute model calculations has blurred the distinction between code and model. The recent controversy regarding model validation has brought into question what we mean by a ‘model’ and by ‘validation.’ It has become apparent that the usual meaning of validation may be common in engineering practice and seems useful in legal practice but it is contrary to scientific practice and brings into question our understanding of science and how it can best be applied to such problems as hazardous waste characterization, remediation, and aqueous geochemistry in general. This review summarizes arguments against using the phrase model validation and examines efforts to validate models for high-level radioactive waste management and for permitting and monitoring open-pit mines. Part of the controversy comes from a misunderstanding of ‘prediction’ and the need to distinguish logical from temporal prediction. Another problem stems from the difference in the engineering approach contrasted with the scientific approach. The reductionist influence on the way we approach environmental investigations also limits our ability to model the interconnected nature of reality. Guidelines are proposed to improve our perceptions and proper utilization of models. Use of the word ‘validation’ is strongly discouraged when discussing model reliability.

  1. Vehicle-Snow Interaction: Modeling, Testing and Validation (United States)


    Stochastic in nature: stochastic models at each scale (e.g., Gaussian random field at the mesoscale, semi-variogram at the macroscale). Background and needs: the effect of microstructure (uncertainty) has not been assessed, and a better understanding of snow deformation is needed. Goals and approaches: develop models for the mechanical properties of different types of snow and quantify the associated uncertainties.

  2. Validation of a Solid Rocket Motor Internal Environment Model (United States)

    Martin, Heath T.


    In a prior effort, a thermal/fluid model of the interior of Penn State University's laboratory-scale Insulation Test Motor (ITM) was constructed to predict both the convective and radiative heat transfer to the interior walls of the ITM with a minimum of empiricism. These predictions were then compared to values of total and radiative heat flux measured in a previous series of ITM test firings to assess the capabilities and shortcomings of the chosen modeling approach. Though the calculated fluxes reasonably agreed with those measured during testing, this exercise revealed means of improving the fidelity of the model to, in the case of the thermal radiation, enable direct comparison of the measured and calculated fluxes and, for the total heat flux, compute a value indicative of the average measured condition. By replacing the P1-Approximation with the discrete ordinates (DO) model for the solution of the gray radiative transfer equation, the radiation intensity field in the optically thin region near the radiometer is accurately estimated, allowing the thermal radiation flux to be calculated on the heat-flux sensor itself and compared directly to the measured values. Though fully coupling the wall thermal response with the flow model was not attempted, due to the excessive computational time required, a separate wall thermal response model was used to better estimate the average temperature of the graphite surfaces upstream of the heat flux gauges and improve the accuracy of both the total and radiative heat flux computations. The success of this modeling approach increases confidence in the ability of state-of-the-art thermal and fluid modeling to accurately predict SRM internal environments, offers corrections to older methods, and supplies a tool for further studies of the dynamics of SRM interiors.

  3. A computationally fast alternative to cross-validation in penalized Gaussian graphical models

    NARCIS (Netherlands)

    Vujacic, I.; Abbruzzo, A.; de Wit, E.


    We study the problem of selecting a regularization parameter in penalized Gaussian graphical models. When the goal is to obtain a model with good predictive power, cross-validation is the gold standard. We present a new estimator of Kullback–Leibler loss in Gaussian graphical models which provides a…

  4. Dynamic chemical process modelling and validation : Theory and application to industrial and literature case study

    NARCIS (Netherlands)

    Schmal, J.P.


    Dynamic chemical process modelling is still largely considered an art. In this thesis the theory of large-scale chemical process modelling and validation is discussed and initial steps to extend the theory are explored. In particular we pay attention to the effect of the level of detail on the model…

  5. Updating and prospective validation of a prognostic model for high sickness absence

    NARCIS (Netherlands)

    Roelen, C.A.M.; Heymans, M.W.; Twisk, J.W.R.; van Rhenen, W.; Pallesen, S.; Bjorvatn, B.; Moen, B.E.; Mageroy, N.


    Objectives To further develop and validate a Dutch prognostic model for high sickness absence (SA). Methods Three-wave longitudinal cohort study of 2,059 Norwegian nurses. The Dutch prognostic model was used to predict high SA among Norwegian nurses at wave 2. Subsequently, the model was updated by…

  6. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van


    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust.

  7. A Model-Based Method for Content Validation of Automatically Generated Test Items (United States)

    Zhang, Xinxin; Gierl, Mark


    The purpose of this study is to describe a methodology to recover the item model used to generate multiple-choice test items with a novel graph theory approach. Beginning with the generated test items and working backward to recover the original item model provides a model-based method for validating the content used to automatically generate test…

  8. Social Validity of the Critical Incident Stress Management Model for School-Based Crisis Intervention (United States)

    Morrison, Julie Q.


    The Critical Incident Stress Management (CISM) model for crisis intervention was developed for use with emergency service personnel. Research regarding the use of the CISM model has been conducted among civilians and high-risk occupation groups with mixed results. The purpose of this study is to examine the social validity of the CISM model for…

  9. Natural Interaction Metaphors for Functional Validations of Virtual Car Models. (United States)

    Moehring, Mathias; Froehlich, Bernd


    Natural Interaction in virtual environments is a key requirement for the virtual validation of functional aspects in automotive product development processes. Natural Interaction is the metaphor people encounter in reality: the direct manipulation of objects by their hands. To enable this kind of Natural Interaction, we propose a pseudophysical metaphor that is both plausible enough to provide realistic interaction and robust enough to meet the needs of industrial applications. Our analysis of the most common types of objects in typical automotive scenarios guided the development of a set of refined grasping heuristics to support robust finger-based interaction of multiple hands and users. The objects' behavior in reaction to the users' finger motions is based on pseudophysical simulations, which also take various types of constrained objects into account. In dealing with real-world scenarios, we had to introduce the concept of Normal Proxies, which extend objects with appropriate normals for improved grasp detection and grasp stability. An expert review revealed that our interaction metaphors allow for an intuitive and reliable assessment of several functionalities of objects found in a car interior. Follow-up user studies showed that overall task performance and usability are similar for CAVE and HMD environments. For larger objects and more gross manipulation, using the CAVE without employing a virtual hand representation is preferred, but for more fine-grained manipulation and smaller objects, the HMD turns out to be beneficial.

  10. J-Integral modeling and validation for GTS reservoirs.

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Canales, Monica L.; Nibur, Kevin A.; Lindblad, Alex J.; Brown, Arthur A.; Ohashi, Yuki; Zimmerman, Jonathan A.; Huestis, Edwin; Hong, Soonsung; Connelly, Kevin; Margolis, Stephen B.; Somerday, Brian P.; Antoun, Bonnie R.


    Non-destructive detection methods can reliably certify that gas transfer system (GTS) reservoirs do not have cracks larger than 5%-10% of the wall thickness. To determine the acceptability of a reservoir design, analysis must show that short cracks will not adversely affect the reservoir behavior. This is commonly done via calculation of the J-Integral, which represents the energetic driving force acting to propagate an existing crack in a continuous medium. J is then compared against a material's fracture toughness (J{sub c}) to determine whether crack propagation will occur. While the quantification of the J-Integral is well established for long cracks, its validity for short cracks is uncertain. This report presents the results from a Sandia National Laboratories project to evaluate a methodology for performing J-Integral evaluations in conjunction with its finite element analysis capabilities. Simulations were performed to verify the operation of a post-processing code (J3D) and to assess the accuracy of this code and our analysis tools against companion fracture experiments for 2- and 3-dimensional geometry specimens. Evaluation is done for specimens composed of 21-6-9 stainless steel, some of which were exposed to a hydrogen environment, for both long and short cracks.

  11. Validated biomechanical model for efficiency and speed of rowing. (United States)

    Pelz, Peter F; Vergé, Angela


    The speed of a competitive rowing crew depends on the number of crew members, their body mass, sex and the type of rowing (sweep rowing or sculling). The time-averaged speed is proportional to the rower's body mass to the 1/36th power, to the number of crew members to the 1/9th power and to the physiological efficiency (accounted for by the rower's sex) to the 1/3rd power. The quality of the rowing shell and propulsion system is captured by one dimensionless parameter that takes the mechanical efficiency, the shape and drag coefficient of the shell and the Froude propulsion efficiency into account. We derive the biomechanical equation for the speed of rowing by two independent methods and further validate it by successfully predicting race times. We derive the theoretical upper limit of the Froude propulsion efficiency for low-viscosity flows. This upper limit is shown to be a function solely of the velocity ratio of blade to boat speed (i.e., it is completely independent of the blade shape), a result that may also be of interest for other repetitive propulsion systems.
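
    The scaling law quoted in the abstract (speed proportional to m^(1/36), n^(1/9) and efficiency^(1/3)) can be sketched directly; the rower mass and efficiency values below are illustrative only, not taken from the paper:

```python
def relative_speed(body_mass_kg, crew_size, efficiency):
    # Time-averaged speed scaling from the abstract:
    # v proportional to m**(1/36) * n**(1/9) * eta**(1/3)
    return body_mass_kg ** (1 / 36) * crew_size ** (1 / 9) * efficiency ** (1 / 3)

# Speed ratio of an eight (n=8) over a single scull (n=1),
# holding rower mass and physiological efficiency fixed:
ratio = relative_speed(90.0, 8, 1.0) / relative_speed(90.0, 1, 1.0)
print(round(ratio, 3))  # 8**(1/9), i.e. about 1.26
```

    Note how weakly speed depends on body mass: the 1/36 exponent means doubling rower mass raises speed by under 2 percent under this model.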

  12. The Validation of Computer-based Models in Engineering: Some Lessons from Computing Science

    Directory of Open Access Journals (Sweden)

    D. J. Murray-Smith


    Full Text Available Questions of the quality of computer-based models and the formal processes of model testing, involving internal verification and external validation, are usually given only passing attention in engineering reports and in technical publications. However, such models frequently provide a basis for analysis methods, design calculations or real-time decision-making in complex engineering systems. This paper reviews techniques used for external validation of computer-based models and contrasts the somewhat casual approach which is usually adopted in this field with the more formal approaches to software testing and documentation recommended for large software projects. Both activities require intimate knowledge of the intended application, a systematic approach and considerable expertise and ingenuity in the design of tests. It is concluded that engineering degree courses dealing with modelling techniques and computer simulation should put more emphasis on model limitations, testing and validation.

  13. Validation of transport models in ASDEX Upgrade current ramps

    Energy Technology Data Exchange (ETDEWEB)

    Fietz, Sina; Hobirk, Joerg; Fable, Emiliano; Fischer, Rainer; Fuchs, Christoph; Pereverzev, Grigori; Ryter, Francois [Max Planck Institut fuer Plasmaphysik, EURATOM Association, Garching (Germany); Collaboration: the ASDEX Upgrade Team


    In order to prepare adequate ramp-up and ramp-down scenarios for ITER, understanding the physics of transport during the current ramps is essential. The aim of this work was to assess the capability of several transport models to reproduce the experimental data during the current ramps. For this purpose, the calculated temperature profiles from different transport models, i.e. Coppi-Tang, Neo-Alcator, Bohm-Gyrobohm, the critical gradient model and H98/2 scaling-based, are compared to experimental temperature profiles under different conditions. The strong variation of the experimental electron temperature profiles is partly reproduced by the models. The importance of central and edge radiation is emphasized, as well as the main transport properties of the models, especially in the case of strong local electron heating (ECRH). To investigate the control capabilities of a tokamak, particularly with regard to ITER, the impact on global plasma parameters such as the internal inductance and the stored energy is also investigated.

  14. Gene expression model (in)validation by Fourier analysis. (United States)

    Konopka, Tomasz; Rooman, Marianne


    The determination of the right model structure describing a gene regulation network and the identification of its parameters are major goals in systems biology. The task is often hampered by the lack of relevant experimental data with sufficiently low noise levels, but the subset of genes whose concentration levels exhibit an oscillatory behavior in time can readily be analyzed on the basis of their Fourier spectrum, known to turn complex signals into a few relatively noise-free parameters. Such genes therefore offer opportunities for understanding gene regulation quantitatively. Fourier analysis is applied to data on gene expression levels in mouse liver cells that oscillate according to the circadian rhythm. Several model structures in the form of linear and nonlinear differential equations are matched to the data, and it is shown that although the considered models can reproduce many features of the oscillatory patterns, some can be excluded on the basis of Fourier analysis without appeal to prior knowledge of regulatory pathways. A systematic method for testing models is also proposed, based on measuring the effects of variations in gene copy-number on the expression levels of coupled genes. Fourier analysis is a technique that is well adapted to the study of biological oscillators and can be used instead of, or in addition to, conventional modeling techniques. Its usefulness will increase as more high-resolution data become available.
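
    As an illustration of the abstract's central idea, the dominant period of a noisy circadian-like oscillation can be read off its discrete Fourier spectrum; the synthetic hourly series below is a stand-in for real expression data, not the mouse-liver measurements analyzed in the study:

```python
import cmath
import math
import random

# Synthetic stand-in for circadian expression data: a 24-hour oscillation
# sampled hourly over 4 days, with additive Gaussian noise.
random.seed(0)
n = 96
signal = [2.0 * math.sin(2 * math.pi * t / 24.0) + random.gauss(0.0, 0.3)
          for t in range(n)]
mean = sum(signal) / n
centered = [x - mean for x in signal]

def dft_magnitude(x, k):
    """Magnitude of the k-th discrete Fourier coefficient (naive O(n) sum)."""
    return abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / len(x))
                   for t in range(len(x))))

# Spectrum over k = 1 .. n//2; the peak identifies the dominant period.
spectrum = [dft_magnitude(centered, k) for k in range(1, n // 2 + 1)]
k_peak = 1 + max(range(len(spectrum)), key=spectrum.__getitem__)
dominant_period = n / k_peak  # in hours
print(dominant_period)  # 24.0
```

    The peak sits well above the noise floor here because the oscillation dominates the noise; with real expression data, the relative noise level decides how cleanly the period can be extracted.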

  15. Dynamic modeling and validation of a lignocellulosic enzymatic hydrolysis process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Sin, Gürkan


    The enzymatic hydrolysis process is one of the key steps in second generation biofuel production. After being thermally pretreated, the lignocellulosic material is liquefied by enzymes prior to fermentation. The scope of this paper is to evaluate a dynamic model of the hydrolysis process on a demonstration scale reactor. The following novel features are included: the application of the Convection–Diffusion–Reaction equation to a hydrolysis reactor to assess transport and mixing effects; the extension of a competitive kinetic model with enzymatic pH dependency and hemicellulose hydrolysis...

  16. Validation of an employee satisfaction model: A structural equation model approach

    Directory of Open Access Journals (Sweden)

    Ophillia Ledimo


    The purpose of this study was to validate an employee satisfaction model and to determine the relationships between the different dimensions of the concept, using the structural equation modelling (SEM) approach. A cross-sectional quantitative survey design was used to collect data from a random sample of (n=759) permanent employees of a parastatal organisation. Data were collected using the Employee Satisfaction Survey (ESS) to measure employee satisfaction dimensions. Following the steps of SEM analysis, the three domains and latent variables of employee satisfaction were specified as organisational strategy, policies and procedures, and outcomes. Confirmatory factor analysis of the latent variables was conducted, and the path coefficients of the latent variables of the employee satisfaction model indicated a satisfactory fit for all these variables. The goodness-of-fit measure of the model indicated both absolute and incremental goodness-of-fit, confirming the relationships between the latent and manifest variables. It also indicated that the latent variables organisational strategy, policies and procedures, and outcomes are the main indicators of employee satisfaction. This study adds to the knowledge base on employee satisfaction and makes recommendations for future research.

  17. Validation of buoyancy driven spectral tensor model using HATS data

    DEFF Research Database (Denmark)

    Chougule, A.; Mann, Jakob; Kelly, Mark C.


    We present a homogeneous spectral tensor model for wind velocity and temperature fluctuations, driven by mean vertical shear and mean temperature gradient. Results from the model, including one-dimensional velocity and temperature spectra and the associated co-spectra, are shown in this paper. The model is described via five parameters: the dissipation rate (ε), the length scale of energy-containing eddies (L), a turbulence anisotropy parameter (Γ), the gradient Richardson number (Ri) representing the atmospheric stability, and the rate of destruction of temperature variance (ηθ).

  18. Radiation Background and Attenuation Model Validation and Development

    Energy Technology Data Exchange (ETDEWEB)

    Peplow, Douglas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Santiago, Claudio P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    This report describes the initial results of a study being conducted as part of the Urban Search Planning Tool project. The study is comparing the Urban Scene Simulator (USS), a one-dimensional (1D) radiation transport model developed at LLNL, with the three-dimensional (3D) radiation transport model from ORNL using the MCNP, SCALE/ORIGEN and SCALE/MAVRIC simulation codes. In this study, we have analyzed the differences between the two approaches at every step, from source term representation, to estimating flux and detector count rates at a fixed distance from a simple surface (slab), and at points throughout more complex 3D scenes.

  19. The Contribution of the Rasch Model to the Clinical Validation of Nursing Diagnoses: Integrative Literature Review. (United States)

    de Souza Oliveira-Kumakura, Ana Railka; Caldeira, Sílvia; Prado Simão, Talita; Camargo-Figuera, Fabio Alberto; de Almeida Lopes Monteiro da Cruz, Diná; Campos de Carvalho, Emília


    To analyze the knowledge related to the use of the Rasch model in validation of nursing diagnoses. Integrative literature review with search in LILACS, PUBMED, CINAHL, and SCOPUS. Five studies comprised the sample, which analyzed unidimensionality, local independence, item calibration, item reliability, separation of items and people, and differential item functioning for analyzing nursing diagnoses. The Rasch model seems to be a useful method to validate nursing diagnoses and probably also for the validation of nursing outcomes in the Nursing Outcomes Classification. The use of this model is promising, considering the advantages that it can be used in studies with several methodological designs. Methods that are able to provide more robust evidence of nursing diagnosis validity are needed to support highly accurate diagnostic findings in clinical practice. © 2016 NANDA International, Inc.
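    As background for the item-calibration and item-fit steps mentioned above, the dichotomous Rasch model itself is compact: the probability of endorsing an item depends only on the difference between person ability θ and item difficulty b, both on the same logit scale. A minimal sketch (the parameter values are illustrative, not taken from the reviewed studies):

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """Probability that a person with ability theta endorses a
    dichotomous item of difficulty b under the Rasch model:
    P = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals item difficulty, the endorsement probability is 0.5;
# higher ability relative to difficulty pushes it toward 1.
print(rasch_probability(0.0, 0.0))  # → 0.5
print(rasch_probability(2.0, 0.0) > 0.8)  # → True
```

    Item calibration in validation studies amounts to estimating the b values (and person θ values) from response data and checking that observed responses fit these predicted probabilities.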

  20. Soil process modelling in CZO research: gains in data harmonisation and model validation (United States)

    van Gaans, Pauline; Andrianaki, Maria; Kobierska, Florian; Kram, Pavel; Lamacova, Anna; Lair, Georg; Nikolaidis, Nikos; Duffy, Chris; Regelink, Inge; van Leeuwen, Jeroen P.; de Ruiter, Peter


    Various soil process models were applied to four European Critical Zone Observatories (CZOs), the core research sites of the FP7 project SoilTrEC: the Damma glacier forefield (CH), a set of three forested catchments on geochemically contrasting bedrocks in the Slavkov Forest (CZ), a chronosequence of soils in the former floodplain of the Danube at Fuchsenbigl/Marchfeld (AT), and the Koiliaris catchments in the north-western part of Crete (GR). The aim of the modelling exercises was to apply and test soil process models with data from the CZOs for calibration/validation, to identify potential limits to the application scope of the models, to interpret soil state and soil functions at key stages of the soil life cycle, represented by the four SoilTrEC CZOs, and to contribute towards harmonisation of data and data acquisition. The models identified as specifically relevant were: the Penn State Integrated Hydrologic Model (PIHM), a fully coupled, multi-process, multi-scale hydrologic model, to gain a better understanding of water flow and pathways; the Soil and Water Assessment Tool (SWAT), a deterministic, continuous-time (daily time step) basin-scale model, to evaluate the impact of soil management practices; the Rothamsted Carbon model (Roth-C), to simulate organic carbon turnover, and the Carbon, Aggregation, and Structure Turnover (CAST) model, to include the role of soil aggregates in carbon dynamics; the Ligand Charge Distribution (LCD) model, to understand the interaction between organic matter and oxide surfaces in soil aggregate formation; and the Terrestrial Ecology Model (TEM), to obtain insight into the link between foodweb structure and carbon and nutrient turnover. With some exceptions, all models were applied to all four CZOs. The need for specific model input contributed largely to data harmonisation. The comparisons between the CZOs turned out to be of great value for understanding the strengths and limitations of the models, as well as the differences in soil conditions

  1. Continuum model for masonry: Parameter estimation and validation

    NARCIS (Netherlands)

    Lourenço, P.B.; Rots, J.G.; Blaauwendraad, J.


    A novel yield criterion that includes different strengths along each material axis is presented. The criterion includes two different fracture energies in tension and two different fracture energies in compression. The ability of the model to represent the inelastic behavior of orthotropic materials

  2. Validation of an internal hardwood log defect prediction model (United States)

    R. Edward. Thomas


    The type, size, and location of internal defects dictate the grade and value of lumber sawn from hardwood logs. However, acquiring internal defect knowledge with x-ray/computed-tomography or magnetic-resonance imaging technology can be expensive both in time and cost. An alternative approach uses prediction models based on correlations among external defect indicators...

  3. Experimental Analysis and Model Validation of an Opaque Ventilated Facade

    DEFF Research Database (Denmark)

    López, F. Peci; Jensen, Rasmus Lund; Heiselberg, Per


    Natural ventilation is a convenient way of reducing energy consumption in buildings. In this study an experimental module of an opaque ventilated façade (OVF) was built and tested for assessing its potential of supplying free ventilation and air preheating for the building. A numerical model was ...

  4. Validation of an extracted tooth model of endodontic irrigation. (United States)

    Hope, C K; Burnside, G; Chan, S N; Giles, L H; Jarad, F D


    An extracted tooth model of endodontic irrigation, incorporating reproducible inoculation and irrigation procedures, was tested against Enterococcus faecalis using a variety of different irrigants in a Latin square methodology. ANOVA revealed no significant variations between the twelve teeth or experiments undertaken on different occasions; however, variation between irrigants was significant. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Experimentally validated thermal model of thin film NiTi (United States)

    Favelukis, Jenna E.; Lavine, Adrienne S.; Carman, Gregory P.


    The primary focus of this work is to develop a new analytical approach for thermal modeling of Nickel Titanium (NiTi) shape memory alloy membranes undergoing both phase transformation and large deflections. This paper describes a thermal model of a NiTi plate or thin film, including all the modes of heat loss and latent heat dissipation during the phase transformation. This model is used to predict the NiTi temperature during cooling. The results are compared with experiments conducted on a NiTi plate and thin film (3 micrometers thick), and very good agreement is found. The thermal model is also used to predict the temperature response of a bubble actuator proposed for use in a forced-flow environment. Using a bubble of 3 mm diameter and 3 micrometer thickness under forced airflow conditions, it is possible to achieve a frequency response faster than 300 Hz. Additional calculations were made to verify the structural stability of the actuator system. Predictions indicated that for specific geometries a pressure of at least 35 kPa can be supported by the NiTi membrane. Deflections of a bubble actuator are shown to be on the order of 10% of its diameter, while the strain remains below 4%.

  6. FACES IV and the Circumplex Model: Validation Study (United States)

    Olson, David


    Family Adaptability and Cohesion Evaluation Scale (FACES) IV was developed to tap the full continuum of the cohesion and flexibility dimensions from the Circumplex Model of Marital and Family Systems. Six scales were developed, with two balanced scales and four unbalanced scales designed to tap low and high cohesion (disengaged and enmeshed) and…

  7. Validation of noise models for single-cell transcriptomics

    NARCIS (Netherlands)

    Grün, Dominic; Kester, Lennart; van Oudenaarden, Alexander

    Single-cell transcriptomics has recently emerged as a powerful technology to explore gene expression heterogeneity among single cells. Here we identify two major sources of technical variability: sampling noise and global cell-to-cell variation in sequencing efficiency. We propose noise models to

  8. Validation of a probabilistic post-fire erosion model (United States)

    Pete Robichaud; William J. Elliot; Sarah A. Lewis; Mary Ellen Miller


    Post-fire increases of runoff and erosion often occur and land managers need tools to be able to project the increased risk. The Erosion Risk Management Tool (ERMiT) uses the Water Erosion Prediction Project (WEPP) model as the underlying processor. ERMiT predicts the probability of a given amount of hillslope sediment delivery from a single rainfall or...

  9. Formal Model-Based Validation for Tally Systems

    DEFF Research Database (Denmark)

    Cochran, Dermot; Kiniry, Joseph


    In this work, the ballot counting process for one of the most complex electoral schemes used in the world, Proportional Representation by Single Transferable Vote (PR-STV), is mechanically formally modeled. The purpose of such a formalization is to generate, using an algorithm of our design, a complete set...

  10. Toward Validation of the Genius Discipline-Specific Literacy Model (United States)

    Ellis, Edwin S.; Wills, Stephen; Deshler, Donald D.


    An analysis of the rationale and theoretical foundations of the Genius Discipline-specific Literacy Model and its use of SMARTvisuals to cue information-processing skills and strategies and focus attention on essential informational elements in high-frequency topics in history and the English language arts are presented. Quantitative data…

  11. Simulating pattern-process relationships to validate landscape genetic models (United States)

    A. J. Shirk; S. A. Cushman; E. L. Landguth


    Landscapes may resist gene flow and thereby give rise to a pattern of genetic isolation within a population. The mechanism by which a landscape resists gene flow can be inferred by evaluating the relationship between landscape models and an observed pattern of genetic isolation. This approach risks false inferences because researchers can never feasibly test all...

  12. Reference source data for GCNP noise model validation study (United States)


    The U.S. Department of the Interior (DOI), National Park Service (NPS), and the U.S. Department : of Transportation (DOT), Federal Aviation Administration (FAA) are jointly sponsoring a study to : examine the accuracy of existing models used for pred...

  13. Experimental validation and calibration of pedestrian loading models for footbridges

    DEFF Research Database (Denmark)

    Ricciardelli, Fransesco; Briatico, C; Ingólfsson, Einar Thór


    Different patterns of pedestrian loading of footbridges exist, whose occurrence depends on a number of parameters, such as the bridge span, frequency, damping and mass, and the pedestrian density and activity. In this paper analytical models for the transient action of one walker and for the stat...

  14. The Pseudo-Self-Similar Traffic Model: Application and Validation

    NARCIS (Netherlands)

    El Abdouni Khayari, Rachid; Haverkort, Boudewijn R.H.M.; Sadre, R.; Ost, Alexander


    Since the early 1990s, a variety of studies have shown that network traffic, both for local- and wide-area networks, has self-similar properties. This led to new approaches in network traffic modelling because most traditional traffic approaches result in the underestimation of performance measures

  15. Validation of Occupants’ Behaviour Models for Indoor Quality Parameter and Energy Consumption Prediction

    DEFF Research Database (Denmark)

    Fabi, Valentina; Sugliano, Martina; Andersen, Rune Korsholm


    For this reason, the validation of occupants' behavioral models is an issue that is gaining importance. In this paper, validation was carried out through dynamic Building Energy Performance simulation (BEPS); behavioral models of window opening and thermostat set-points published in the literature were implemented...... in a dynamic BEPS software, and the obtained results in terms of temperature, relative humidity and CO2 concentration were compared to real measurements. Through this comparison it will be possible to verify the accuracy of the implemented behavioral models. The models were able to reproduce the general...

  16. Transient Model Validation of Fixed-Speed Induction Generator Using Wind Farm Measurements

    DEFF Research Database (Denmark)

    Rogdakis, Georgios; Garcia-Valle, Rodrigo; Arana Aristi, Iván


    In this paper, an electromagnetic transient model for fixed-speed wind turbines equipped with induction generators is developed and implemented in PSCAD/EMTDC. The model comprises an induction generator, an aerodynamic rotor, and a two-mass representation of the shaft system. Model validation...... is conducted by measurement comparison using recordings obtained from switching operations performed at the Nysted Offshore Wind Farm in Denmark. A sensitivity analysis is performed to determine the impact of different model parameters on the simulated response as compared with measurements. This validated...

  17. Oscillation characteristics of endodontic files: numerical model and its validation. (United States)

    Verhaagen, Bram; Lea, Simon C; de Bruin, Gerrit J; van der Sluis, Luc W M; Walmsley, A Damien; Versluis, Michel


    During a root canal treatment, an antimicrobial fluid is injected into the root canal to eradicate all bacteria from the root canal system. Agitation of the fluid using an ultrasonically vibrating miniature file results in a significant improvement in the cleaning efficacy over conventional syringe irrigation. Numerical analysis of the oscillation characteristics of the file, modeled as a tapered, driven rod, shows a sinusoidal wave pattern with an increase in amplitude and decrease in wavelength toward the free end of the file. Measurements of the file oscillation with a scanning laser vibrometer show good agreement with the numerical simulation. The numerical model of endodontic file oscillation has the potential for predicting the oscillation pattern and fracture likeliness of various file types and the acoustic streaming they induce during passive ultrasonic irrigation.

  18. Exploring the Validity of Proposed Transgenic Animal Models of Attention-Deficit Hyperactivity Disorder (ADHD). (United States)

    de la Peña, June Bryan; Dela Peña, Irene Joy; Custodio, Raly James; Botanas, Chrislean Jun; Kim, Hee Jin; Cheong, Jae Hoon


    Attention-deficit/hyperactivity disorder (ADHD) is a common, behavioral, and heterogeneous neurodevelopmental condition characterized by hyperactivity, impulsivity, and inattention. Symptoms of this disorder are managed by treatment with methylphenidate, amphetamine, and/or atomoxetine. The cause of ADHD is unknown, but substantial evidence indicates that this disorder has a significant genetic component. Transgenic animals have become an essential tool in uncovering the genetic factors underlying ADHD. Although they cannot accurately reflect the human condition, they can provide insights into the disorder that cannot be obtained from human studies due to various limitations. An ideal animal model of ADHD must have face (similarity in symptoms), predictive (similarity in response to treatment or medications), and construct (similarity in etiology or underlying pathophysiological mechanism) validity. As the exact etiology of ADHD remains unclear, the construct validity of animal models of ADHD would always be limited. The proposed transgenic animal models of ADHD have substantially increased and diversified over the years. In this paper, we compiled and explored the validity of proposed transgenic animal models of ADHD. Each of the reviewed transgenic animal models has strengths and limitations. Some fulfill most of the validity criteria of an animal model of ADHD and have been extensively used, while there are others that require further validation. Nevertheless, these transgenic animal models of ADHD have provided and will continue to provide valuable insights into the genetic underpinnings of this complex disorder.

  19. Statistical external validation and consensus modeling: a QSPR case study for Koc prediction. (United States)

    Gramatica, Paola; Giani, Elisa; Papa, Ester


    The soil sorption partition coefficient (log K(oc)) of a heterogeneous set of 643 organic non-ionic compounds, with a range of more than 6 log units, is predicted by a statistically validated QSAR modeling approach. The applied multiple linear regression (ordinary least squares, OLS) is based on a variety of theoretical molecular descriptors selected by the genetic algorithms-variable subset selection (GA-VSS) procedure. The models were validated for predictivity by different internal and external validation approaches. For external validation we applied self organizing maps (SOM) to split the original data set: the best four-dimensional model, developed on a reduced training set of 93 chemicals, has a predictivity of 78% when applied on 550 validation chemicals (prediction set). The selected molecular descriptors, which could be interpreted through their mechanistic meaning, were compared with the more common physico-chemical descriptors log K(ow) and log S(w). The chemical applicability domain of each model was verified by the leverage approach in order to propose only reliable data. The best predicted data were obtained by consensus modeling from 10 different models in the genetic algorithm model population.
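    The external-validation step described here, scoring predictions on held-out chemicals against a training-set baseline, can be sketched as a small function; the function name and the sample numbers below are illustrative, not the study's actual log K(oc) data.

```python
def q2_external(y_obs, y_pred, y_train_mean):
    """External predictive squared correlation coefficient:
    1 - PRESS / (sum of squares of the prediction-set observations
    about the training-set mean). Values near 1 indicate high
    external predictivity; 0 or below means no better than the mean."""
    press = sum((o - p) ** 2 for o, p in zip(y_obs, y_pred))
    ss = sum((o - y_train_mean) ** 2 for o in y_obs)
    return 1.0 - press / ss

# Hypothetical log Koc values for three prediction-set chemicals.
q2 = q2_external([1.0, 2.0, 3.0], [1.1, 1.9, 3.2], y_train_mean=2.0)
print(round(q2, 2))  # → 0.97
```

    The key design point, reflected in the record above, is that the prediction-set chemicals never enter model fitting or variable selection: they are split off beforehand (here via SOM in the study) and used once, to score predictivity.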

  20. Microbial dormancy improves development and experimental validation of ecosystem model. (United States)

    Wang, Gangsheng; Jagadamma, Sindhu; Mayes, Melanie A; Schadt, Christopher W; Steinweg, J Megan; Gu, Lianhong; Post, Wilfred M


    Climate feedbacks from soils can result from environmental change followed by response of plant and microbial communities, and/or associated changes in nutrient cycling. Explicit consideration of microbial life-history traits and functions may be necessary to predict climate feedbacks owing to changes in the physiology and community composition of microbes and their associated effect on carbon cycling. Here we developed the microbial enzyme-mediated decomposition (MEND) model by incorporating microbial dormancy and the ability to track multiple isotopes of carbon. We tested two versions of MEND, that is, MEND with dormancy (MEND) and MEND without dormancy (MEND_wod), against long-term (270 days) carbon decomposition data from laboratory incubations of four soils with isotopically labeled substrates. MEND_wod adequately fitted multiple observations (total C-CO2 and (14)C-CO2 respiration, and dissolved organic carbon), but at the cost of significantly underestimating the total microbial biomass. MEND improved estimates of microbial biomass by 20-71% over MEND_wod. We also quantified uncertainties in parameters and model simulations using the Critical Objective Function Index method, which is based on a global stochastic optimization algorithm, as well as model complexity and observational data availability. Together our model extrapolations of the incubation study show that long-term soil incubations with experimental data for multiple carbon pools are conducive to estimate both decomposition and microbial parameters. These efforts should provide essential support to future field- and global-scale simulations, and enable more confident predictions of feedbacks between environmental change and carbon cycling.

  1. Numerical Modeling of a Ducted Rocket Combustor With Experimental Validation


    Hewitt, Patrick


    The present work was conducted with the intent of developing a high-fidelity numerical model of a unique combustion flow problem combining multi-phase fuel injection with substantial momentum and temperature into a highly complex turbulent flow. This important problem is very different from typical and more widely known liquid fuel combustion problems and is found in practice in pulverized coal combustors and ducted rocket ramjets. As the ducted rocket engine cycle is only now finding wides...

  2. Creation of sustainable leadership development: conceptual model validation:


    Dimovski, Vlado; Penger, Sandra; Peterlin, Judita


    This conceptual paper addresses the research question: how can leadership development be managed within organizations? Our proposed answer is presented in the form of a conceptual model of sustainable leadership development, based on Howard Gardner's theory of multiple intelligences and applied to leadership through appreciative inquiry, meaning that leaders possess multiple intelligences which differ in their individual profiles and are able to develop a wide span of intelligences durin...

  3. Gene expression model (in)validation by Fourier analysis


    Konopka Tomasz; Rooman Marianne


    Abstract Background The determination of the right model structure describing a gene regulation network and the identification of its parameters are major goals in systems biology. The task is often hampered by the lack of relevant experimental data with sufficiently low noise level, but the subset of genes whose concentration levels exhibit an oscillatory behavior in time can readily be analyzed on the basis of their Fourier spectrum, known to turn complex signals into few relatively noise-f...

  4. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    Directory of Open Access Journals (Sweden)

    Lenardo C. Silva


    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous validation processes to guarantee user-requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and to simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  5. User Modelling Validation over the Security Awareness of Digital Natives

    Directory of Open Access Journals (Sweden)

    Vasileios Gkioulos


    Young generations make extensive use of mobile devices, such as smartphones, tablets and laptops, for a variety of daily tasks with potentially critical impact, while the number of security breaches via portable devices increases exponentially. A plethora of security risks associated with these devices are induced by design shortcomings and vulnerabilities related to user behavior. Therefore, deploying suitable risk treatments requires investigating how security experts perceive digital natives (young people born in the digital era) when utilizing user behavior models in the design and analysis of related systems. In this article, we present the results of a survey performed across a multinational sample of security professionals, in comparison to our earlier study of the security awareness of digital natives. Through this study, we seek to identify divergences between user behavior and the conceptual user models that security experts utilize in their professional tasks. Our results indicate that the experts' understanding of user behavior does not follow a solidified user model, and that influences from personal perceptions and randomness are also noticeable.

  6. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis : Model development and validation of existing models

    NARCIS (Netherlands)

    Gomes, Anna; van der Wijk, Lars; Proost, Johannes H; Sinha, Bhanu; Touw, Daan J


    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for

  7. Validation of the Zürich burn-biofilm model. (United States)

    Guggenheim, Merlin; Thurnheer, Thomas; Gmür, Rudolf; Giovanoli, Pietro; Guggenheim, Bernhard


    Despite advances in the use of topical and parenteral antimicrobial therapy and the practice of early tangential burn-wound excision, bacterial infection remains a major problem in the management of burn victims today. The purpose of this study was to design and evaluate a polyspecies biofilm model with bacteria known to cause severe infections in burn patients. The model is simple to prepare, maintain and analyse, and allows for short-term exposure to antimicrobials. Initial experiments showed that it was impossible to establish balanced polyspecies biofilms with an inoculum of Gram-positive and -negative bacteria. After 64.5 h of incubation, the Gram-negative bacteria (Escherichia coli and Pseudomonas aeruginosa) had suppressed the Gram-positives (Enterococcus faecalis, Staphylococcus aureus and Streptococcus intermedius). However, adding the Gram-negative bacteria after 41.5 h to an established biofilm of Gram-positives resulted in a balanced microbial consortium. After 64.5 h, all species were present in high numbers (10^7 to 10^8 colony forming units (CFU) per biofilm). Multiple repetitions showed high reproducibility of biofilm formation without significant differences between and within experiments. Combined fluorescence in situ hybridisation/confocal laser scanning microscopy (FISH/CLSM) analyses, for which biofilms had to be grown on a different non-flexible substrate (hydroxyapatite), revealed that, by 41.5 h, the biofilm consisted of an almost confluent layer of bacteria firmly adherent to the substratum. After 64.5 h (22 h after the addition of the Gram-negatives), the biofilm consisted of a confluent mixture of single cells, an abundance of galaxies of bacteria with small lacunae, and large amounts of extracellular matrix polysaccharides. The polyspecies biofilm model contains the most prevalent burn-associated Gram-positive and Gram-negative bacterial pathogens and mimics the Gram-negative shift observed in vivo. It shows excellent reproducibility.

  8. Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches (United States)

    Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia


    With increasing levels of complexity and automation in automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to account for tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Therefore, an experimental validation of three different temperature model approaches is carried out, discussed and compared in the scope of this article. To evaluate the range of application of the presented approaches with respect to further implementation in semi-physical tyre models, the main focus lies on a physical parameterisation. Aside from good modelling accuracy, focus is placed on computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.

  9. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations. (United States)

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R


    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be

  10. Model Verification and Validation Using Graphical Information Systems Tools (United States)


    [Indexed excerpt only; the record's abstract was not recovered.] Recoverable fragments cite a study at the San Matias and San Jose Gulfs, Northern Patagonia, Argentina (J. Coastal Res., 25(4), 957-968, DOI: 10.2112/08-1035.1; O'Donnell, J., 1997), and describe an overlay of a satellite image on the model bathymetry for Hunters Point, a small tidal basin in San Francisco Bay (Figure 3.24), where an imported shoreline feature showed a shift smaller in magnitude but significant for the scale of the basin.

  11. Metal Big Area Additive Manufacturing: Process Modeling and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Simunovic, Srdjan [ORNL]; Nycz, Andrzej [ORNL]; Noakes, Mark W [ORNL]; Chin, Charlie [Dassault Systemes]; Oancea, Victor [Dassault Systemes]


    Metal Big Area Additive Manufacturing (mBAAM) is a new additive manufacturing (AM) technology for printing large-scale 3D objects. mBAAM is based on the gas metal arc welding process and uses a continuous feed of welding wire to manufacture an object. An electric arc forms between the wire and the substrate, which melts the wire and deposits a bead of molten metal along the predetermined path. In general, the welding process parameters and local conditions determine the shape of the deposited bead. The sequence of bead deposition and the corresponding thermal history of the manufactured object determine the long-range effects, such as thermally induced distortions and residual stresses. Therefore, the resulting performance or final properties of the manufactured object depend on its geometry and the deposition path, in addition to the basic welding process parameters. Physical testing is critical for gaining the knowledge necessary for quality prints, but traversing the process parameter space to develop an optimized build strategy for each new design is impractical by purely experimental means. Computational modeling and optimization may accelerate the development of a build process strategy and save time and resources. Because computational modeling provides these opportunities, we have developed a physics-based Finite Element Method (FEM) simulation framework and numerical models to support mBAAM process development and design. In this paper, we performed a sequentially coupled heat transfer and stress analysis for predicting the final deformation of a small rectangular structure printed using mild steel welding wire. Using the new simulation technologies, material was progressively added into the FEM simulation as the arc weld traversed the build path. In the sequentially coupled heat transfer and stress analysis, the heat transfer was performed to calculate the temperature evolution, which was used in a stress analysis to

  12. Parameter Estimation and Model Validation of Nonlinear Dynamical Networks

    Energy Technology Data Exchange (ETDEWEB)

    Abarbanel, Henry [Univ. of California, San Diego, CA (United States); Gill, Philip [Univ. of California, San Diego, CA (United States)


    In the performance period of this work under a DOE contract, the co-PIs, Philip Gill and Henry Abarbanel, developed new methods of statistical data assimilation for problems of DOE interest, including geophysical and biological problems. This included numerical optimization algorithms for variational principles, and new parallel-processing Monte Carlo routines for performing the path integrals of statistical data assimilation. These results have been summarized in the monograph "Predicting the Future: Completing Models of Observed Complex Systems" by Henry Abarbanel, published by Springer-Verlag in June 2013. Additional results and details have appeared in the peer-reviewed literature.

  13. Modeling and validating HL7 FHIR profiles using semantic web Shape Expressions (ShEx). (United States)

    Solbrig, Harold R; Prud'hommeaux, Eric; Grieve, Grahame; McKenzie, Lloyd; Mandel, Joshua C; Sharma, Deepak K; Jiang, Guoqian


    HL7 Fast Healthcare Interoperability Resources (FHIR) is an emerging open standard for the exchange of electronic healthcare information. FHIR resources are defined in a specialized modeling language. FHIR instances can currently be represented in either XML or JSON. The FHIR and Semantic Web communities are developing a third FHIR instance representation format in Resource Description Framework (RDF). Shape Expressions (ShEx), a formal RDF data constraint language, is a candidate for describing and validating the FHIR RDF representation. Create a FHIR to ShEx model transformation and assess its ability to describe and validate FHIR RDF data. We created the methods and tools that generate the ShEx schemas modeling the FHIR to RDF specification being developed by HL7 ITS/W3C RDF Task Force, and evaluated the applicability of ShEx in the description and validation of FHIR to RDF transformations. The ShEx models contributed significantly to workgroup consensus. Algorithmic transformations from the FHIR model to ShEx schemas and FHIR example data to RDF transformations were incorporated into the FHIR build process. ShEx schemas representing 109 FHIR resources were used to validate 511 FHIR RDF data examples from the Standards for Trial Use (STU 3) Ballot version. We were able to uncover unresolved issues in the FHIR to RDF specification and detect 10 types of errors and root causes in the actual implementation. The FHIR ShEx representations have been included in the official FHIR web pages for the STU 3 Ballot version since September 2016. ShEx can be used to define and validate the syntax of a FHIR resource, which is complementary to the use of RDF Schema (RDFS) and Web Ontology Language (OWL) for semantic validation. ShEx proved useful for describing a standard model of FHIR RDF data. The combination of a formal model and a succinct format enabled comprehensive review and automated validation. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Validation of protein models by a neural network approach

    Directory of Open Access Journals (Sweden)

    Fantucci Piercarlo


    Background: The development and improvement of reliable computational methods designed to evaluate the quality of protein models is relevant in the context of protein structure refinement, which has been recently identified as one of the bottlenecks limiting the quality and usefulness of protein structure prediction. Results: In this contribution, we present a computational method (Artificial Intelligence Decoys Evaluator: AIDE) which is able to consistently discriminate between correct and incorrect protein models. In particular, the method is based on neural networks that use as input 15 structural parameters, which include energy, solvent-accessible surface, hydrophobic contacts and secondary structure content. The results obtained with AIDE on a set of decoy structures were evaluated using statistical indicators such as Pearson correlation coefficients, Znat, fraction enrichment, as well as ROC plots. AIDE's performance turned out to be comparable and often complementary to available state-of-the-art learning-based methods. Conclusion: In light of the results obtained with AIDE, as well as its comparison with available learning-based methods, it can be concluded that AIDE can be successfully used to evaluate the quality of protein structures. The use of AIDE in combination with other evaluation tools is expected to further enhance protein refinement efforts.
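    One of the statistical indicators named above, the Pearson correlation between predicted and true model quality, can be computed as below. The decoy scores are invented illustrative numbers, not data from the AIDE study:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical decoy set: predicted quality scores vs. true similarity
# of each decoy to the native structure (values are made up).
predicted = [0.9, 0.7, 0.4, 0.8, 0.2]
true_sim  = [0.85, 0.6, 0.5, 0.75, 0.1]
r = pearson(predicted, true_sim)
# r close to 1 means the evaluator ranks decoys consistently with
# their actual closeness to the native structure.
```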

  15. Process Modeling and Validation for Metal Big Area Additive Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Simunovic, Srdjan [ORNL]; Nycz, Andrzej [ORNL]; Noakes, Mark W. [ORNL]; Chin, Charlie [Dassault Systemes]; Oancea, Victor [Dassault Systemes]


    Metal Big Area Additive Manufacturing (mBAAM) is a new additive manufacturing (AM) technology based on metal arc welding. A continuously fed metal wire is melted by an electric arc that forms between the wire and the substrate, and is deposited in the form of a bead of molten metal along the predetermined path. Objects are manufactured one layer at a time, starting from the base plate. The final properties of the manufactured object depend on its geometry and the metal deposition path, in addition to the basic welding process parameters. Computational modeling can be used to accelerate the development of the mBAAM technology, and as a design and optimization tool for the actual manufacturing process. We have developed a finite element method simulation framework for mBAAM using new features of the ABAQUS software. The computational simulation of material deposition with heat transfer is performed first, followed by a structural analysis based on the temperature history for predicting the final deformation and stress state. In this formulation, we assume that the two physics phenomena are coupled in only one direction, i.e. the temperatures drive the deformation and internal stresses, but their feedback on the temperatures is negligible. The experiment instrumentation (measurement types, sensor types, sensor locations, sensor placements, measurement intervals) and the measurements are presented. The temperatures and distortions from the simulations show good correlation with experimental measurements. Ongoing modeling work is also briefly discussed.

  16. DMFC anode polarization: Experimental analysis and model validation

    Energy Technology Data Exchange (ETDEWEB)

    Casalegno, A.; Marchesi, R. [Dipartimento di Energetica, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)


    Anode two-phase flow has an important influence on DMFC performance and methanol crossover. In order to elucidate the influence of two-phase flow on anode performance, in this work anode polarization is investigated by combining experimental and modelling approaches. A systematic experimental analysis of the influence of operating conditions on anode polarization is presented. Hysteresis due to operating conditions is observed; experimental results suggest that it arises from methanol accumulation and has to be considered when evaluating DMFC performance and measurement reproducibility. A model of DMFC anode polarization is presented and utilised as a tool to investigate anode two-phase flow. The proposed analysis permits a confident interpretation of the main phenomena involved. In particular, it confirms that methanol electro-oxidation kinetics is weakly dependent on methanol concentration and that methanol transport in the gas phase contributes substantially to anode feeding. Moreover, it emphasises the possibility of optimising the anode flow rate in order to improve DMFC performance and reduce methanol crossover. (author)

  17. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.


    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  18. Validity of the One-Dimensional Limp Model for Porous Media

    Directory of Open Access Journals (Sweden)

    O. Doutres


    A straightforward criterion for determining the validity of the limp model for porous materials is addressed here. The limp model is an "equivalent fluid" model which gives a better description of porous behavior than the well-known "rigid frame" model. It is derived from the poroelastic Biot model, assuming that the frame has no bulk stiffness. A criterion is proposed for identifying the porous materials for which the limp model can be used. It relies on a new parameter, the Frame Stiffness Influence (FSI), based on porous material properties. The critical values of FSI under which the limp model can be used are determined using 1D analytical modeling for a specific boundary set: radiation of a vibrating plate covered by a porous layer.

  19. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data (United States)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai


    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level for the developed load model to model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
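    A minimal sketch of the kind of quantitative check the framework advocates, replacing a visual comparison with an error statistic and a confidence interval; the feeder data, the RMSE metric and the normal-approximation interval are illustrative assumptions, not the paper's actual procedure:

```python
import statistics as st

def rmse(measured, simulated):
    """Root-mean-square error between field measurements and model output."""
    return (sum((m - s) ** 2 for m, s in zip(measured, simulated))
            / len(measured)) ** 0.5

def error_confidence_interval(measured, simulated, z=1.96):
    """Approximate 95% CI for the mean model error (normality assumed)."""
    errors = [m - s for m, s in zip(measured, simulated)]
    mean = st.mean(errors)
    half = z * st.stdev(errors) / len(errors) ** 0.5
    return mean - half, mean + half

# Hypothetical feeder-level measurements vs. load-model output (MW).
measured  = [10.2, 11.0, 9.8, 10.5, 10.9]
simulated = [10.0, 10.8, 10.0, 10.4, 10.6]
err_rmse = rmse(measured, simulated)
lo, hi = error_confidence_interval(measured, simulated)
# A confidence interval that excludes zero would indicate a systematic
# bias in the load model, i.e. a candidate for recalibration.
```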

  20. DIE Deflection Modeling: Empirical Validation and Tech Transfer

    Energy Technology Data Exchange (ETDEWEB)

    R. Allen Miller


    This report summarizes computer modeling work that was designed to help understand how the die casting die and machine contribute to parting plane separation during operation. Techniques developed in earlier research (8) were applied to complete a large computational experiment that systematically explored the relationship between the stiffness of the machine platens and key dimensional and structural variables (platen area covered, die thickness, platen thickness, thickness of insert and the location of the die with respect to the platen) describing the die/machine system. The results consistently show that there are many significant interactions among the variables and it is the interactions, more than the individual variables themselves, which determine the performance of the machine/die system. That said, the results consistently show that it is the stiffness of the machine platens that has the largest single impact on die separation.

  1. Theoretical temperature model with experimental validation for CLIC Accelerating Structures

    CERN Document Server

    AUTHOR|(CDS)2126138; Vamvakas, Alex; Alme, Johan

    Micron level stability of the Compact Linear Collider (CLIC) components is one of the main requirements to meet the luminosity goal for the future $48 \\,km$ long underground linear accelerator. The radio frequency (RF) power used for beam acceleration causes heat generation within the aligned structures, resulting in mechanical movements and structural deformations. A dedicated control of the air- and water- cooling system in the tunnel is therefore crucial to improve alignment accuracy. This thesis investigates the thermo-mechanical behavior of the CLIC Accelerating Structure (AS). In CLIC, the AS must be aligned to a precision of $10\\,\\mu m$. The thesis shows that a relatively simple theoretical model can be used within reasonable accuracy to predict the temperature response of an AS as a function of the applied RF power. During failure scenarios or maintenance interventions, the RF power is turned off resulting in no heat dissipation and decrease in the overall temperature of the components. The theoretica...

  2. Validation and comparison of aerodynamic modelling approaches for wind turbines (United States)

    Blondel, F.; Boisard, R.; Milekovic, M.; Ferrer, G.; Lienard, C.; Teixeira, D.


    The development of large-capacity Floating Offshore Wind Turbines (FOWT) is an interdisciplinary challenge for design solvers, requiring accurate modelling of hydrodynamics, elasticity, servodynamics and aerodynamics together. Floating platforms will induce low-frequency unsteadiness, and for large-capacity turbines, blade-induced vibrations will lead to high-frequency unsteadiness. While yawed inflow conditions are still a challenge for commonly used aerodynamic methods such as the Blade Element Momentum method (BEM), the new sources of unsteadiness introduced by large turbine scales and floater motions have to be tackled accurately, while keeping the computational cost small enough to be compatible with design and certification purposes. In light of this, this paper focuses on the comparison of three aerodynamic solvers based on BEM and vortex methods, under standard, yawed and unsteady inflow conditions. We focus here on up-to-date wind tunnel experiments, such as the Unsteady Aerodynamics Experiment (UAE) database and the MexNext international project.

  3. Modeling users' activity on twitter networks: validation of Dunbar's number.

    Directory of Open Access Journals (Sweden)

    Bruno Gonçalves

    Microblogging and mobile devices appear to augment human social capabilities, which raises the question whether they remove cognitive or biological constraints on human communication. In this paper we analyze a dataset of Twitter conversations collected across six months involving 1.7 million individuals and test the theoretical cognitive limit on the number of stable social relationships known as Dunbar's number. We find that the data are in agreement with Dunbar's result; users can entertain a maximum of 100-200 stable relationships. Thus, the 'economy of attention' is limited in the online world by cognitive and biological constraints as predicted by Dunbar's theory. We propose a simple model for users' behavior that includes finite priority queuing and time resources that reproduces the observed social behavior.

  4. Modeling users' activity on Twitter networks: validation of Dunbar's number (United States)

    Goncalves, Bruno; Perra, Nicola; Vespignani, Alessandro


    Microblogging and mobile devices appear to augment human social capabilities, which raises the question whether they remove cognitive or biological constraints on human communication. In this paper we analyze a dataset of Twitter conversations collected across six months involving 1.7 million individuals and test the theoretical cognitive limit on the number of stable social relationships known as Dunbar's number. We find that the data are in agreement with Dunbar's result; users can entertain a maximum of 100-200 stable relationships. Thus, the ``economy of attention'' is limited in the online world by cognitive and biological constraints as predicted by Dunbar's theory. We propose a simple model for users' behavior that includes finite priority queuing and time resources that reproduces the observed social behavior.
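    The finite-priority-queuing idea in the final sentence can be illustrated with a toy simulation. Everything below (the capacity value, the eviction rule, the contact counts) is a hypothetical sketch of the mechanism, not the authors' actual model:

```python
import heapq
import random

def simulate_attention(n_contacts=1000, capacity=150, steps=20000, seed=1):
    """Toy finite-attention model: an agent can keep at most `capacity`
    relationships active; when a new contact arrives over budget, the
    least recently reinforced relationship is dropped (priority queue
    keyed by time of last interaction)."""
    rng = random.Random(seed)
    last_seen = {}   # contact id -> time of last interaction
    heap = []        # (last interaction time, contact id), lazily updated
    for t in range(steps):
        c = rng.randrange(n_contacts)
        last_seen[c] = t
        heapq.heappush(heap, (t, c))
        while len(last_seen) > capacity:
            ts, old = heapq.heappop(heap)
            if last_seen.get(old) == ts:  # skip stale heap entries
                del last_seen[old]
    return len(last_seen)

active = simulate_attention()
# The number of simultaneously maintained relationships saturates at the
# attention capacity, qualitatively mirroring the observed 100-200 plateau.
```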

  5. A Visual Analysis Concept for the Validation of Geoscientific Simulation Models. (United States)

    Unger, A; Schulte, S; Klemann, V; Dransch, D


    Geoscientific modeling and simulation helps to improve our understanding of the complex Earth system. During the modeling process, validation of the geoscientific model is an essential step. In validation, it is determined whether the model output shows sufficient agreement with observation data. Measures of this agreement are called goodness of fit. In the geosciences, analyzing the goodness of fit is challenging due to its manifold dependencies: 1) The goodness of fit depends on the model parameterization, whose precise values are not known. 2) The goodness of fit varies in space and time due to the spatio-temporal dimension of geoscientific models. 3) The significance of the goodness of fit is affected by the resolution and precision of the available observational data. 4) The correlation between goodness of fit and the underlying modeled and observed values is ambiguous. In this paper, we introduce a visual analysis concept that targets these challenges in the validation of geoscientific models, specifically focusing on applications where observation data is sparse, unevenly distributed in space and time, and imprecise, which hinders a rigorous analytical approach. Our concept, developed in close cooperation with Earth system modelers, addresses the four challenges with four tailored visualization components. The tight linking of these components supports a twofold interactive drill-down in model parameter space and in the set of data samples, which facilitates the exploration of the numerous dependencies of the goodness of fit. We exemplify our visualization concept for geoscientific modeling of glacial isostatic adjustments in the last 100,000 years, validated against sea level indicators, a prominent example of sparse and imprecise observation data. An initial use case and feedback from Earth system modelers indicate that our visualization concept is a valuable complement to the range of validation methods.

  6. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code (United States)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William


    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation processes will use new and improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard the ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  7. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia


    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aims to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we face the problem of comparing the results obtained using simulations with what the experiments actually observed. One way to address the problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal.
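    A common form of consistency test between a simulated and a measured histogram is a two-sample chi-square comparison. The sketch below is a generic illustration of that idea, with invented bin counts; it is not the library described in the abstract:

```python
def chi2_two_sample(obs1, obs2):
    """Pearson chi-square statistic for comparing two binned distributions
    (e.g. a simulated vs. a measured histogram with identical binning)."""
    n1, n2 = sum(obs1), sum(obs2)
    chi2 = 0.0
    dof = -1  # one constraint: totals are fixed
    for a, b in zip(obs1, obs2):
        if a + b == 0:
            continue
        # expected counts under the hypothesis of a common parent distribution
        ea = n1 * (a + b) / (n1 + n2)
        eb = n2 * (a + b) / (n1 + n2)
        chi2 += (a - ea) ** 2 / ea + (b - eb) ** 2 / eb
        dof += 1
    return chi2, dof

# Hypothetical binned observable: simulation vs. test-beam data counts.
sim  = [12, 45, 80, 40, 13]
data = [10, 50, 75, 45, 15]
stat, dof = chi2_two_sample(sim, data)
# stat is compared against the chi-square critical value for dof degrees
# of freedom (e.g. 9.49 at the 95% level for dof = 4): a smaller statistic
# means the two histograms are statistically consistent.
```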

  8. Construct validation and the Rasch model: functional ability of healthy elderly people

    DEFF Research Database (Denmark)

    Avlund, K; Kreiner, S; Schultz-Larsen, K


    The purpose of this study was to test the construct validity of a measure of functional ability, developed with the intention of achieving a high degree of variability and capacity for discriminating among a group of healthy elderly people. Data were collected from 734 70-year-old people in Denmark, in the county of Copenhagen. Functional ability was measured with the traditional activities of daily living and with a classification system developed specially for healthy elderly people. Construct validity was tested by the Rasch model for item analysis, addressing specifically the internal validity...

  9. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility (United States)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd


    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.

  10. Validation of spectral gas radiation models under oxyfuel conditions. Part A: Gas cell experiments

    DEFF Research Database (Denmark)

    Becher, Valentin; Clausen, Sønnik; Fateev, Alexander


    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard CFD spectral gas radiation models for air combustion are outside their validity range. The series of three articles provides a common spectral basis for the validation of newly developed models. In part A of the series, gas cell transmissivity spectra in the spectral range of 2.4-5.4 μm of water vapor and carbon dioxide, in the temperature range from 727 to 1500 °C and at different concentrations, were compared at a nominal resolution of 32 cm−1 to line-by-line models...

  11. Contribution to a dynamic wind turbine model validation from a wind farm islanding experiment

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Kaas; Pedersen, Knud Ole Helgesen; Poulsen, Niels Kjølstad


    Measurements from an islanding experiment on the Rejsby Hede wind farm, Denmark, are used for the validation of the dynamic model of grid-connected, stall-controlled wind turbines equipped with induction generators. The simulated results are found to be in good agreement with the measurements...... and possible discrepancies are explained. The work with the wind turbine model validation relates to the dynamic stability investigations on incorporation of large amount of wind power in the Danish power grid, where the dynamic wind turbine model is applied....

  12. Validation of a Wave-Body Interaction Model by Experimental Tests

    DEFF Research Database (Denmark)

    Ferri, Francesco; Kramer, Morten; Pecher, Arthur


    Within the wave energy field, numerical simulation has recently gained worldwide acceptance as a useful tool alongside physical model testing. The main goal of this work is the validation of a numerical model by experimental results. The numerical model is based on a linear wave-body interaction theory, applied to a point absorber wave energy converter. The results show that the ratio of floater size to wave amplitude is a key parameter for the validity of the applied theory.

  13. Wave Tank Testing and Model Validation of an Autonomous Wave Energy Converter

    Directory of Open Access Journals (Sweden)

    Bret Bosma


    A key component in bringing ocean wave energy converters from concept to commercialization is the building and testing of scaled prototypes to provide model validation. A one-quarter-scale prototype of an autonomous two-body heaving point absorber was modeled, built, and tested for this work. Wave tank testing results are compared with two hydrodynamic and system models, implemented in ANSYS AQWA and MATLAB/Simulink respectively, and show model validation over certain regions of operation. This work will serve as a guide for future developers of wave energy converter devices, providing insight into taking their design from concept to prototype stage.

  14. The prospects of a quantitative measurement of agility: A validation study on an agile maturity model


    Gren, Lucas; Torkar, Richard; Feldt, Robert


    Agile development has now become a well-known approach to collaboration in professional work life. Both researchers and practitioners want validated tools to measure agility. This study sets out to validate an agile maturity measurement model with statistical tests and empirical data. First, a pretest was conducted as a case study including a survey and focus group. Second, the main study was conducted with 45 employees from two SAP customers in the US. We used internal consistency (by a Cron...

  15. Review and evaluation of performance measures for survival prediction models in external validation settings. (United States)

    Rahman, M Shafiqur; Ambler, Gareth; Choodari-Oskooei, Babak; Omar, Rumana Z


    When developing a prediction model for survival data it is essential to validate its performance in external validation settings using appropriate performance measures. Although a number of such measures have been proposed, there is only limited guidance regarding their use in the context of model validation. This paper reviewed and evaluated a wide range of performance measures to provide some guidelines for their use in practice. An extensive simulation study based on two clinical datasets was conducted to investigate the performance of the measures in external validation settings. Measures were selected from categories that assess the overall performance, discrimination and calibration of a survival prediction model. Some of these have been modified to allow their use with validation data, and a case study is provided to describe how these measures can be estimated in practice. The measures were evaluated with respect to their robustness to censoring and ease of interpretation. All measures are implemented, or are straightforward to implement, in statistical software. Most of the performance measures were reasonably robust to moderate levels of censoring. One exception was Harrell's concordance measure, which tended to increase as censoring increased. We recommend that Uno's concordance measure is used to quantify concordance when there are moderate levels of censoring. Alternatively, Gönen and Heller's measure could be considered, especially if censoring is very high, but we suggest that the prediction model is re-calibrated first. We also recommend that Royston's D is routinely reported to assess discrimination since it has an appealing interpretation. The calibration slope is useful for both internal and external validation settings, and we recommend reporting it routinely. We would also recommend using any of the predictive accuracy measures and providing the corresponding predictive accuracy curves. In addition, we recommend investigating the characteristics...
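
The censoring sensitivity of Harrell's concordance measure noted above is easy to probe on toy data. Below is a minimal sketch of the textbook pair-counting definition of Harrell's C (all data values are hypothetical; real analyses would use a library implementation such as the one in `lifelines`):

```python
def harrell_c(times, events, risks):
    """Harrell's concordance index: the fraction of comparable pairs in
    which the subject predicted to be at higher risk fails first. A pair
    (i, j) is comparable when subject i has an observed event strictly
    before subject j's event or censoring time."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1.0
                elif risks[i] == risks[j]:
                    concordant += 0.5  # tied risk scores count half
    return concordant / comparable

# hypothetical, perfectly ranked data: higher risk -> earlier event
times = [2, 4, 6, 8]
events = [1, 1, 1, 0]  # last subject is censored
risks = [0.9, 0.7, 0.5, 0.1]
print(harrell_c(times, events, risks))  # -> 1.0
```

One intuition for the upward drift the review reports: censored subjects can only ever appear as the later member of a comparable pair, so heavier censoring removes pairs asymmetrically.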

  16. Product Representation to support validation of simulation models in Computer aided engineering


    Kain, Andreas; Gaag, Andreas; Lindemann, Udo


    Computer aided engineering (CAE) provides proper means to support New Product Development (NPD) with simulation tools. Simulation furthers early identification of product characteristics to reduce costs and time. The applicability of simulation models in NPD strongly depends on their validity; thus, validating a simulation model is a major issue in obtaining correct experimentation results. The authors propose a matrix-based approach to combine solution-neutral system representation, solution specifi...

  17. Calibration and validation of the SWAT model for a forested watershed in coastal South Carolina (United States)

    Devendra M. Amatya; Elizabeth B. Haley; Norman S. Levine; Timothy J. Callahan; Artur Radecki-Pawlik; Manoj K. Jha


    Modeling the hydrology of low-gradient coastal watersheds on shallow, poorly drained soils is a challenging task due to the complexities in watershed delineation, runoff generation processes and pathways, flooding, and submergence caused by tropical storms. The objective of the study is to calibrate and validate a GIS-based spatially-distributed hydrologic model, SWAT...

  18. Modeling and Validation of Fluid Structure Interactions in Passive Micro Valves

    NARCIS (Netherlands)

    Oosterbroek, R.E.; Berenschot, Johan W.; Schlautmann, Stefan; Lammerink, Theodorus S.J.; van den Berg, Albert; Elwenspoek, Michael Curt


    This paper reports the modeling of the stationary flow-structure interaction of different types of micro check valves. Mathematical indirect, domain coupled and decoupled, finite element models as well as analytical approximations are made. The obtained simulation results are validated by performed...

  19. Examining the Reliability and Validity of Clinician Ratings on the Five-Factor Model Score Sheet (United States)

    Few, Lauren R.; Miller, Joshua D.; Morse, Jennifer Q.; Yaggi, Kirsten E.; Reynolds, Sarah K.; Pilkonis, Paul A.


    Despite substantial research use, measures of the five-factor model (FFM) are infrequently used in clinical settings due, in part, to issues related to administration time and a reluctance to use self-report instruments. The current study examines the reliability and validity of the Five-Factor Model Score Sheet (FFMSS), which is a 30-item…

  20. Faculty's Acceptance of Computer Based Technology: Cross-Validation of an Extended Model (United States)

    Ahmad, Tunku Badariah Tunku; Madarsha, Kamal Basha; Zainuddin, Ahmad Marzuki; Ismail, Nik Ahmad Hisham; Nordin, Mohamad Sahari


    The first aim of the present study is to validate an extended technology acceptance model (TAME) on the data derived from the faculty members of a university in an ongoing, computer mediated work setting. The study extended the original TAM model by including an intrinsic motivation component--computer self efficacy. In so doing, the study…

  1. Validation of BEHAVE fire behavior predictions in oak savannas using five fuel models (United States)

    Keith Grabner; John Dwyer; Bruce Cutter


    Prescribed fire is a valuable tool in the restoration and management of oak savannas. BEHAVE, a fire behavior prediction system developed by the United States Forest Service, can be a useful tool when managing oak savannas with prescribed fire. BEHAVE predictions of fire rate-of-spread and flame length were validated using four standardized fuel models: Fuel Model 1 (...

  2. Cross-validation of an employee safety climate model in Malaysia. (United States)

    Bahari, Siti Fatimah; Clarke, Sharon


    Whilst substantial research has investigated the nature of safety climate, and its importance as a leading indicator of organisational safety, much of this research has been conducted with Western industrial samples. The current study focuses on the cross-validation of a safety climate model in the non-Western industrial context of Malaysian manufacturing. The first-order factorial validity of Cheyne et al.'s (1998) [Cheyne, A., Cox, S., Oliver, A., Tomas, J.M., 1998. Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255-271] model was tested, using confirmatory factor analysis, in a Malaysian sample. Results showed that the model fit indices were below accepted levels, indicating that the original Cheyne et al. (1998) safety climate model was not supported. An alternative three-factor model was developed using exploratory factor analysis. Although these findings are not consistent with previously reported cross-validation studies, we argue that previous studies have focused on validation across Western samples, and that the current study demonstrates the need to take account of cultural factors in the development of safety climate models intended for use in non-Western contexts. The results have important implications for the transferability of existing safety climate models across cultures (for example, in global organisations) and highlight the need for future research to examine cross-cultural issues in relation to safety climate. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  3. Validation of regression models for nitrate concentrations in the upper groundwater in sandy soils

    NARCIS (Netherlands)

    Sonneveld, M.P.W.; Brus, D.J.; Roelsma, J.


    For Dutch sandy regions, linear regression models have been developed that predict nitrate concentrations in the upper groundwater on the basis of residual nitrate contents in the soil in autumn. The objective of our study was to validate these regression models for one particular sandy region

  4. Model-based wear measurements in total knee arthroplasty : development and validation of novel radiographic techniques

    NARCIS (Netherlands)

    IJsseldijk, van E.A.


    The primary aim of this work was to develop novel model-based mJSW measurement methods using a 3D reconstruction and compare the accuracy and precision of these methods to conventional mJSW measurement. This thesis contributed to the development, validation and clinical application of model-based

  5. Evaluation of Model Validation Techniques in Land Cover Dynamics

    Directory of Open Access Journals (Sweden)

    Xuan Zhu


    Full Text Available This paper applies different methods of map comparison to quantify the characteristics of three different land change models. The land change models used for simulation are termed the “Stochastic Markov (St_Markov)”, “Cellular Automata Markov (CA_Markov)” and “Multi Layer Perceptron Markov (MLP_Markov)” models. Various model validation techniques such as the per-category method, kappa statistics, components of agreement and disagreement, three-map comparison and fuzzy methods have then been applied. A comparative analysis of the validation techniques is also discussed. In all cases, it is found that “MLP_Markov” gives the best results among the three modeling techniques. Fuzzy set theory is the method that seems best able to distinguish areas of minor spatial errors from major spatial errors. Based on the outcome of this paper, it is recommended that scientists use the kappa, three-map comparison and fuzzy methods for model validation. This paper facilitates communication among land change modelers, because it illustrates the range of results for a variety of model validation techniques and articulates priorities for future research.
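
As a concrete illustration of the kappa comparison described above, the following sketch computes Cohen's kappa for two flattened category maps. The class labels and cell values are hypothetical; real rasters would be flattened and compared cell by cell in the same way:

```python
from collections import Counter

def cohens_kappa(map_a, map_b):
    """Cohen's kappa between two equal-length category rasters,
    flattened to 1-D sequences of class labels: observed agreement
    corrected for the agreement expected by chance."""
    assert len(map_a) == len(map_b)
    n = len(map_a)
    observed = sum(a == b for a, b in zip(map_a, map_b)) / n
    freq_a, freq_b = Counter(map_a), Counter(map_b)
    # chance agreement: sum over classes of the marginal proportions
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# hypothetical simulated vs. reference land cover, 6 cells
sim = ['urban', 'urban', 'forest', 'water', 'forest', 'urban']
ref = ['urban', 'forest', 'forest', 'water', 'forest', 'urban']
print(round(cohens_kappa(sim, ref), 3))  # -> 0.739
```

A kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance, which is why it is preferred over raw percent agreement for map comparison.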

  6. Validation of a functional model for integration of safety into process system design

    DEFF Research Database (Denmark)

    Wu, J.; Lind, M.; Zhang, X.


    ...behavior sufficiently well. With the reasoning capability provided by the MFM syntax and semantics, the validation procedure is illustrated on an MFM model of a three-phase separator system. The MFM model reasoning results compare successfully against analysis results from API RP 14C....

  7. External Validation of a Prediction Model for Successful External Cephalic Version

    NARCIS (Netherlands)

    de Hundt, Marcella; Vlemmix, Floortje; Kok, Marjolein; van der Steeg, Jan W.; Bais, Joke M.; Mol, Ben W.; van der Post, Joris A.


    We sought external validation of a prediction model for the probability of a successful external cephalic version (ECV). We evaluated the performance of the prediction model with calibration and discrimination. For clinical practice, we developed a score chart to calculate the probability of a

  8. Validation of infrared thermography in serotonin-induced itch model in rats

    DEFF Research Database (Denmark)

    Dagnæs-Hansen, Frederik; Jasemian, Yousef; Gazerani, Parisa

    The number of scratching bouts is generally used as a standard method in animal models of itch. The aim of the present study was to validate the application of infrared thermography (IR-Th) in a serotonin-induced itch model in rats. Adult Sprague-Dawley male rats (n = 24) were used in 3 consecutive...

  9. Repeated holdout Cross-Validation of Model to Estimate Risk of Lyme Disease by Landscape Attributes (United States)

    We previously modeled Lyme disease (LD) risk at the landscape scale; here we evaluate the model's overall goodness-of-fit using holdout validation. Landscapes were characterized within road-bounded analysis units (AU). Observed LD cases (obsLD) were ascertained per AU. Data were ...
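
The repeated-holdout scheme itself can be sketched in a few lines. The `fit`/`score` callables and the toy mean-predictor below are placeholders for illustration, not the landscape model from the abstract:

```python
import random

def repeated_holdout(data, fit, score, test_frac=0.3, repeats=100, seed=0):
    """Repeated holdout validation: average the test-set score over many
    random train/test splits of the same dataset."""
    rng = random.Random(seed)
    scores = []
    for _ in range(repeats):
        shuffled = data[:]
        rng.shuffle(shuffled)
        n_test = max(1, int(len(shuffled) * test_frac))
        test, train = shuffled[:n_test], shuffled[n_test:]
        model = fit(train)          # train on the held-in portion
        scores.append(score(model, test))  # evaluate on the holdout
    return sum(scores) / len(scores)

# toy example: the "model" is just the training mean, scored by MSE
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
fit = lambda train: sum(train) / len(train)
score = lambda m, test: sum((x - m) ** 2 for x in test) / len(test)
print(repeated_holdout(xs, fit, score))
```

Averaging over many splits reduces the variance that a single holdout partition would introduce, which is the point of the "repeated" qualifier in the title.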

  10. A review of current calibration and validation practices in land-change modeling

    NARCIS (Netherlands)

    Vliet, van Jasper; Bregt, Arnold K.; Brown, Daniel G.; Delden, van Hedwig; Heckbert, Scott; Verburg, Peter H.


    Land-change models are increasingly used to explore land-change dynamics, as well as for policy analyses and scenario studies. In this paper we review calibration and validation approaches adopted for recently published applications of land-change models. We found that statistical analyses and

  11. From steady-state to synchronized yeast glycolytic oscillations II: model validation.

    NARCIS (Netherlands)

    du Preez, F.B.; van Niekerk, D.D.; Snoep, J.L.


    In an accompanying paper [du Preez et al. (2012) FEBS J 279, 2810-2822], we adapt an existing kinetic model for steady-state yeast glycolysis to simulate limit-cycle oscillations. Here we validate the model by testing its capacity to simulate a wide range of experiments on the dynamics of yeast

  12. Language teacher educators’ pedagogical knowledge: Validating a proposed model

    Directory of Open Access Journals (Sweden)

    Shahab Moradkhani


    Full Text Available The aim of the current study was twofold: identifying the constituent components of language teacher educators’ pedagogical knowledge, and investigating possible differences among teachers, teacher educators, and university professors’ opinions about these components. Data were collected from 436 participants using a questionnaire. The results of factor analysis showed that teacher educators’ pedagogical knowledge comprised eleven components: teacher education, ELT-related theories, relevant disciplines, technology, context, research, social relations, language-related issues, reflection, teachers, and socio-political issues. Furthermore, the results of multiple sets of one-way ANOVA indicated significant rating differences in five of these components, with teachers registering lower scores compared to teacher educators and university professors. The components of language teacher educators’ pedagogical knowledge are discussed in light of the proposed model and the available literature. The differences between the three groups of stakeholders’ ideas are also attributed to their job descriptions. This eleven-component questionnaire can be used to assess teacher educators’ pedagogical knowledge. The discrepancy between the three groups of stakeholders’ ideas also shows that a more dialogic approach should be adopted in teacher education programs.

  13. Helicopter noise in hover: Computational modelling and experimental validation (United States)

    Kopiev, V. F.; Zaytsev, M. Yu.; Vorontsov, V. I.; Karabasov, S. A.; Anikin, V. A.


    The aeroacoustic characteristics of a helicopter rotor are calculated by a new method, to assess its applicability to predicting rotor performance in hover. Direct solution of the Euler equations in a noninertial coordinate system is used to calculate the near-field flow around the spinning rotor. The far-field noise is calculated by the Ffowcs Williams-Hawkings (FW-H) method using permeable control surfaces that include the blade. For a multiblade rotor, the signal obtained is duplicated and shifted in phase for each successive blade. By that means, the spectral characteristics of the far-field noise may be obtained. To determine the integral aerodynamic characteristics of the rotor, software is written to calculate the thrust and torque characteristics from the near-field flow solution. The results of numerical simulation are compared with experimental acoustic and aerodynamic data for a large-scale model of a helicopter main rotor in an open test facility. Two- and four-blade configurations of the rotor are considered, in different hover conditions. The proposed method satisfactorily predicts the aerodynamic characteristics of the blades in such conditions and gives good estimates for the first harmonics of the noise. That permits the practical use of the proposed method, not only for hovering but also for forward flight.
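
The duplicate-and-phase-shift step for a multiblade rotor can be sketched as follows. This assumes the one-blade signal is uniformly sampled over exactly one rotor revolution; the function name and the pulse data are illustrative, not taken from the paper:

```python
def multiblade_signal(p_single, n_blades):
    """Build the far-field signal of an n-blade rotor from a one-blade
    signal by duplicating it and shifting each copy by 1/n_blades of a
    revolution. Assumes p_single is uniformly sampled over exactly one
    revolution and its length is divisible by n_blades."""
    n = len(p_single)
    shift = n // n_blades  # samples per inter-blade phase offset
    return [sum(p_single[(i - b * shift) % n] for b in range(n_blades))
            for i in range(n)]

# hypothetical one-blade pressure pulse over one revolution, 4 samples
print(multiblade_signal([1, 0, 0, 0], 2))  # -> [1, 0, 1, 0]
```

Because the copies are periodic shifts of one signal, the summed spectrum retains only harmonics at multiples of the blade-passing frequency, which is what makes the shortcut valid for spectral predictions.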

  14. VS2DI: Model use, calibration, and validation (United States)

    Healy, Richard W.; Essaid, Hedeff I.


    VS2DI is a software package for simulating water, solute, and heat transport through soils or other porous media under conditions of variable saturation. The package contains a graphical preprocessor for constructing simulations, a postprocessor for displaying simulation results, and numerical models that solve for flow and solute transport (VS2DT) and flow and heat transport (VS2DH). Flow is described by the Richards equation, and solute and heat transport are described by advection-dispersion equations; the finite-difference method is used to solve these equations. Problems can be simulated in one, two, or three (assuming radial symmetry) dimensions. This article provides an overview of calibration techniques that have been used with VS2DI; included is a detailed description of calibration procedures used in simulating the interaction between groundwater and a stream fed by drainage from agricultural fields in central Indiana. Brief descriptions of VS2DI and the various types of problems that have been addressed with the software package are also presented.

  15. Predictive models of safety based on audit findings: Part 2: Measurement of model validity. (United States)

    Hsiao, Yu-Lin; Drury, Colin; Wu, Changxu; Paquet, Victor


    Part 1 of this study sequence developed a human factors/ergonomics (HF/E) based classification system (termed HFACS-MA) for safety audit findings and proved its measurement reliability. In Part 2, we used the human error categories of HFACS-MA as predictors of future safety performance. Audit records and monthly safety incident reports from two airlines submitted to their regulatory authority were available for analysis, covering over 6.5 years. Two participants derived consensus results of HF/E errors from the audit reports using HFACS-MA. We adopted Neural Network and Poisson regression methods to establish nonlinear and linear prediction models, respectively. These models were tested for the validity of prediction of the safety data, and only the Neural Network method resulted in substantially significant predictive ability for each airline. Alternative predictions from counting of audit findings and from time sequence of safety data produced some significant results, but of much smaller magnitude than HFACS-MA. The use of HF/E analysis of audit findings provided proactive predictors of future safety performance in the aviation maintenance field. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  16. Data-Driven Residential Load Modeling and Validation in GridLAB-D

    Energy Technology Data Exchange (ETDEWEB)

    Gotseff, Peter; Lundstrom, Blake


    Accurately characterizing the impacts of high penetrations of distributed energy resources (DER) on the electric distribution system has driven modeling methods from traditional static snapshots, often representing a critical point in time (e.g., summer peak load), to quasi-static time series (QSTS) simulations capturing all the effects of variable DER, associated controls and hence, impacts on the distribution system over a given time period. Unfortunately, the high time resolution DER source and load data required for model inputs are often scarce or non-existent. This paper presents work performed within the GridLAB-D model environment to synthesize, calibrate, and validate 1-second residential load models based on measured transformer loads and physics-based models suitable for QSTS electric distribution system modeling. The modeling and validation approach taken was to create a typical GridLAB-D model home that, when replicated to represent multiple diverse houses on a single transformer, creates a statistically similar load to a measured load for a given weather input. The model homes are constructed to represent the range of actual homes on an instrumented transformer: square footage, thermal integrity, heating and cooling system definition as well as realistic occupancy schedules. House model calibration and validation was performed using the distribution transformer load data and corresponding weather. The modeled loads were found to be similar to the measured loads for four evaluation metrics: 1) daily average energy, 2) daily average and standard deviation of power, 3) power spectral density, and 4) load shape.
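
Two of the four comparison metrics listed above (daily energy and mean/standard deviation of power) can be computed directly from a power time series. The sketch below uses a hypothetical day of hourly average load; the function name and values are illustrative, not GridLAB-D output:

```python
def load_metrics(power_w, dt_s=1.0):
    """Daily energy (kWh) plus mean and standard deviation of power (W)
    for a uniformly sampled load time series in watts."""
    n = len(power_w)
    mean = sum(power_w) / n
    var = sum((p - mean) ** 2 for p in power_w) / n
    energy_kwh = sum(p * dt_s for p in power_w) / 3.6e6  # J -> kWh
    return {'energy_kwh': energy_kwh, 'mean_w': mean, 'std_w': var ** 0.5}

# hypothetical day of hourly average residential load, in watts
hourly = [500] * 8 + [1500] * 8 + [800] * 8
m = load_metrics(hourly, dt_s=3600.0)
print(m['energy_kwh'])  # -> 22.4
```

The remaining two metrics (power spectral density and load shape) require a spectral estimate and a normalized daily profile, respectively, and would build on the same sampled series.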

  17. Development and Validation of a Mortality Prediction Model for Patients Receiving 14 Days of Mechanical Ventilation. (United States)

    Hough, Catherine L; Caldwell, Ellen S; Cox, Christopher E; Douglas, Ivor S; Kahn, Jeremy M; White, Douglas B; Seeley, Eric J; Bangdiwala, Shrikant I; Rubenfeld, Gordon D; Angus, Derek C; Carson, Shannon S


    The existing risk prediction model for patients requiring prolonged mechanical ventilation is not applicable until after 21 days of mechanical ventilation. We sought to develop and validate a mortality prediction model for patients earlier in the ICU course using data from day 14 of mechanical ventilation. Multicenter retrospective cohort study. Forty medical centers across the United States. Adult patients receiving at least 14 days of mechanical ventilation. None. Predictor variables were measured on day 14 of mechanical ventilation in the development cohort and included in a logistic regression model with 1-year mortality as the outcome. Variables were sequentially eliminated to develop the ProVent 14 model. This model was then assessed in the validation cohort. A simplified prognostic scoring rule (ProVent 14 Score) using categorical variables was created in the development cohort and then tested in the validation cohort. Model discrimination was assessed by the area under the receiver operator characteristic curve. Four hundred ninety-one patients and 245 patients were included in the development and validation cohorts, respectively. The most parsimonious model included age, platelet count, requirement for vasopressors, requirement for hemodialysis, and nontrauma admission. The area under the receiver operator characteristic curve for the ProVent 14 model using continuous variables was 0.80 (95% CI, 0.76-0.83) in the development cohort and 0.78 (95% CI, 0.72-0.83) in the validation cohort. The ProVent 14 Score categorized age at 50 and 65 years and platelet count at 100 × 10⁹/L, and had similar discrimination as the ProVent 14 model in both cohorts. Using clinical variables available on day 14 of mechanical ventilation, the ProVent 14 model can identify patients receiving prolonged mechanical ventilation with a high risk of mortality within 1 year.
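
The discrimination statistic used above, the area under the receiver operator characteristic curve, has a compact rank-based form. The sketch below computes it via the Mann-Whitney statistic on hypothetical risk scores (the data are illustrative, not from the ProVent cohorts):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive case receives a higher
    score than a randomly chosen negative case (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical model risk scores and 1-year outcomes (1 = died)
scores = [0.9, 0.8, 0.6, 0.4, 0.3, 0.1]
died = [1, 1, 0, 1, 0, 0]
print(round(auc(scores, died), 3))  # -> 0.889
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect ranking, which puts the reported 0.78-0.80 values in context.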

  18. Validation of a χ2 model of HRR target RCS variability and verification of the resulting ATR performance model (United States)

    Holt, Craig R.; Attili, Joseph B.; Schmidt, Steven L.


    A χ2 model for radar cross section (RCS) variability of High Range Resolution (HRR) measurements is validated using compact range data from the U.S. Army National Ground Intelligence Center (NGIC). It is shown that targets can be represented by a mean template and by a variance template, or in this case, an effective number of degrees of freedom for the χ2-distribution. The analysis also includes comparison of the measured tails of the RCS distribution to those predicted by the χ2-distribution. The likelihood classifier is obtained, and a Monte Carlo performance model is developed to validate the statistical model at the level of ATR performance.

  19. Prediction of the thermal decomposition of organic peroxides by validated QSPR models

    Energy Technology Data Exchange (ETDEWEB)

    Prana, Vinca [Institut de Recherche de Chimie Paris, Chimie ParisTech CNRS, 11 rue P. et M. Curie, Paris 75005 (France); Institut National de l’Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP2, Verneuil-en-Halatte 60550 (France); Rotureau, Patricia, E-mail: [Institut National de l’Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP2, Verneuil-en-Halatte 60550 (France); Fayet, Guillaume [Institut National de l’Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP2, Verneuil-en-Halatte 60550 (France); André, David; Hub, Serge [ARKEMA, rue Henri Moissan, BP63, Pierre Benite 69493 (France); Vicot, Patricia [Institut National de l’Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP2, Verneuil-en-Halatte 60550 (France); Rao, Li [Institut de Recherche de Chimie Paris, Chimie ParisTech CNRS, 11 rue P. et M. Curie, Paris 75005 (France); Adamo, Carlo [Institut de Recherche de Chimie Paris, Chimie ParisTech CNRS, 11 rue P. et M. Curie, Paris 75005 (France); Institut Universitaire de France, 103 Boulevard Saint Michel, Paris F-75005 (France)


    Highlights: • QSPR models were developed for thermal stability of organic peroxides. • Two accurate MLR models were exhibited based on quantum chemical descriptors. • Performances were evaluated by a series of internal and external validations. • The new QSPR models satisfied all OECD principles of validation for regulatory use. - Abstract: Organic peroxides are unstable chemicals which can easily decompose and may lead to explosion. Such a process can be characterized by physico-chemical parameters such as heat and temperature of decomposition, whose determination is crucial to manage the related hazards. These thermal stability properties are also required within many regulatory frameworks related to chemicals in order to assess their hazardous properties. In this work, new quantitative structure–property relationship (QSPR) models were developed to accurately predict the thermal stability of organic peroxides from their molecular structure, respecting the OECD guidelines for regulatory acceptability of QSPRs. Based on the acquisition of 38 reference experimental data points using a DSC (differential scanning calorimetry) apparatus under homogeneous experimental conditions, multi-linear models were derived for the prediction of the decomposition heat and the onset temperature using different types of molecular descriptors. Models were tested by internal and external validation tests and their applicability domains were defined and analyzed. Being rigorously validated, they presented the best performances in terms of fitting, robustness and predictive power, and the descriptors used in these models were linked to the peroxide bond, whose breaking represents the main decomposition mechanism of organic peroxides.
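
External validation of a multi-linear QSPR model of this kind is typically summarized by a predictive squared correlation. The sketch below implements one common variant, with the sum of squares taken about the training-set mean; the function name and all numbers are hypothetical, not the paper's dataset:

```python
def q2_external(y_train, y_test, y_pred):
    """One common external-validation statistic for QSPR regressions:
    Q2_ext = 1 - PRESS/SS, where PRESS is the squared prediction error
    on the external test set and SS is the test-set deviation from the
    training-set mean."""
    y_bar = sum(y_train) / len(y_train)
    press = sum((yt - yp) ** 2 for yt, yp in zip(y_test, y_pred))
    ss = sum((yt - y_bar) ** 2 for yt in y_test)
    return 1 - press / ss

# hypothetical decomposition heats: 3 training values, 2 external tests
print(q2_external([1.0, 2.0, 3.0], [2.0, 4.0], [2.5, 3.5]))  # -> 0.875
```

Values close to 1 indicate that the model predicts external compounds much better than simply guessing the training-set mean, which is the sense in which the OECD principles require demonstrated predictivity.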

  20. Predictive modeling of infrared radiative heating in tomato dry-peeling process: Part II. Model validation and sensitivity analysis (United States)

    A predictive mathematical model was developed to simulate heat transfer in a tomato undergoing double sided infrared (IR) heating in a dry-peeling process. The aims of this study were to validate the developed model using experimental data and to investigate different engineering parameters that mos...

  1. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment (United States)

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra


    The aim of the present study is to conceptualize the approaches displayed for validation of model and thought processes provided in mathematical modeling process performed in technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  2. Stream Heat Budget Modeling of Groundwater Inputs: Model Development and Validation (United States)

    Glose, A.; Lautz, L. K.


    Models of physical processes in fluvial systems are useful for improving understanding of hydrologic systems and for predicting future conditions. Process-based models of fluid flow and heat transport in fluvial systems can be used to quantify unknown spatial and temporal patterns of hydrologic fluxes, such as groundwater discharge, and to predict system response to future change. In this study, a stream heat budget model was developed and calibrated to observed stream water temperature data for Meadowbrook Creek in Syracuse, NY. The one-dimensional (longitudinal), transient stream temperature model is programmed in Matlab and solves the equations for heat and fluid transport using a Crank-Nicolson finite difference scheme. The model considers four meteorologically driven heat fluxes: shortwave solar radiation, longwave radiation, latent heat flux, and sensible heat flux. Streambed conduction is also considered. Input data for the model were collected from June 13-18, 2012 over a 500 m reach of Meadowbrook Creek, a first-order urban stream that drains a retention pond in the city of Syracuse, NY. Stream temperature data were recorded every 20 m longitudinally in the stream at 5-minute intervals using iButtons (model DS1922L, accuracy of ±0.5°C, resolution of 0.0625°C). Meteorological data, including air temperature, solar radiation, relative humidity, and wind speed, were recorded at 5-minute intervals using an on-site weather station. Groundwater temperature was measured in wells adjacent to the stream. Stream dimensions, bed temperatures, and type of bed sediments were also collected. A constant-rate tracer injection of Rhodamine WT was used to independently quantify groundwater inputs every 10 m to validate model results. Stream temperatures fluctuated diurnally by ~3-5 °C during the observation period, with temperatures peaking around 2 pm and cooling overnight, reaching a minimum between 6 and 7 am. Spatially, the stream shows a cooling trend along the

  3. Coupling the land surface model NOAHMP with the generic crop growth model GECROS: Model calibration and validation (United States)

    Ingwersen, Joachim; Högy, Petra; Wizemann, Hans-Dieter; Streck, Thilo


    Weather and climate simulations depend on an accurate description of the exchange of water, energy and momentum between land surface and atmosphere. In state-of-the-art land surface models the vegetation dynamics are "frozen", that is, prescribed in lookup tables. As a consequence, growth and development of a crop are independent of the prevailing weather conditions, and an important feedback between atmosphere and land surface is not captured. In the present study we coupled the land surface model NOAHMP with the mechanistic generic crop growth model GECROS. On the basis of a comprehensive 5-year dataset of eddy-covariance energy and water fluxes and soil water and crop data from two different climate regions of Southwest Germany, we adapted the crop growth model GECROS, integrated it with NOAHMP, calibrated the coupled model for winter wheat and silage maize, and tested its robustness in multiple-year validation runs against independent measurements. For winter wheat the model performed well in both the calibration and validation phases. Inter-annual and regional differences in crop development due to temperature anomalies were well reproduced by the model. Also the decline of evapotranspiration over the maturing phase was properly simulated. In the case of maize, the model did not perform as well as for winter wheat. We attribute this somewhat lower model performance to the pronounced differences among maize cultivars, the high sensitivity of maize development to drill and emergence date, and its higher susceptibility to early summer droughts. Moreover, the model systematically overestimated evapotranspiration during long-lasting droughts, as in June 2014, indicating that in its current state NOAHMP-GECROS has some limitations in simulating water stress. We attribute this weakness to the uniform root distribution and the hydraulic functions (Clapp-Hornberger) implemented in NOAHMP, which result in a uniform depletion of the soil water profile. The novel model...

  4. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    KAUST Repository

    Miki, Kenji


    © 2015 Elsevier Inc. The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.

  5. Learning brain aneurysm microsurgical skills in a human placenta model: predictive validity. (United States)

    de Oliveira, Marcelo Magaldi Ribeiro; Ferrarez, Carlos Eduardo; Ramos, Taise Mosso; Malheiros, Jose Augusto; Nicolato, Arthur; Machado, Carla Jorge; Ferreira, Mauro Tostes; de Oliveira, Fellype Borges; de Sousa, Cecília Félix Penido Mendes; Costa, Pollyana Helena Vieira; Gusmao, Sebastiao; Lanzino, Giuseppe; Maestro, Rolando Del


    OBJECTIVE Surgery for brain aneurysms is technically demanding. In recent years, the process to learn the technical skills necessary for these challenging procedures has been affected by a decrease in the number of surgical cases available and progressive restrictions on resident training hours. To overcome these limitations, surgical simulators such as cadaver heads and human placenta models have been developed. However, the effectiveness of these models in improving technical skills is unknown. This study assessed concurrent and predictive validity of brain aneurysm surgery simulation in a human placenta model compared with a "live" human brain cadaveric model. METHODS Two human cadaver heads and 30 human placentas were used. Twelve neurosurgeons participated in the concurrent validity part of this study, each operating on 1 human cadaver head aneurysm model and 1 human placenta model. Simulators were evaluated regarding their ability to simulate different surgical steps encountered during real surgery. The time to complete the entire aneurysm task in each simulator was analyzed. The predictive validity component of the study involved 9 neurosurgical residents divided into 3 groups to perform simulation exercises, each lasting 6 weeks. The training for the 3 groups consisted of educational video only (3 residents), human cadaver only (3 residents), and human placenta only (3 residents). All residents had equivalent microsurgical experience with superficial brain tumor surgery. After completing their practice training, residents in each of the 3 simulation groups performed surgery for an unruptured middle cerebral artery (MCA) aneurysm, and their performance was assessed by an experienced vascular neurosurgeon who watched the operative videos. RESULTS All human cadaver heads and human placentas were suitable to simulate brain aneurysm surgery. In the concurrent validity portion of the experiment, the placenta model required a longer time (p model was considered

  6. An experimentally validated simulation model for a four-stage spray dryer

    DEFF Research Database (Denmark)

    Petersen, Lars Norbert; Poulsen, Niels Kjølstad; Niemann, Hans Henrik


    is divided into four consecutive stages: a primary spray drying stage, two heated fluid bed stages, and a cooling fluid bed stage. Each of these stages in the model is assumed ideally mixed and the dynamics are described by mass and energy balances. These balance equations are coupled with constitutive...... mathematical model is an index-1 differential algebraic equation (DAE) model with 12 states, 9 inputs, 8 disturbances, and 30 parameters. The parameters in the model are identified from well-excited experimental data obtained from the industrial-type spray dryer. The simulated outputs of the model are validated...... using independent well-excited experimental data from the same spray dryer. The simulated temperatures, humidities, and residual moistures in the spray dryer compare well to the validation data. The model also provides the profit of operation, the production rate, the energy consumption, and the energy...

  7. Developing patient-specific anatomic models for validation of cardiac ablation guidance procedures (United States)

    Holmes, David, III; Rettmann, Maryam; Cameron, Bruce; Camp, Jon; Robb, Richard


    Image-guided cardiac ablation has the potential to decrease procedure times and improve clinical outcomes for patients with cardiac arrhythmias. There are several proposed methods for integrating patient-specific anatomy into the cardiac ablation procedure; however, these methods require thorough validation. One of the primary challenges in validation is determining ground truth as a standard for comparison. Some validation protocols have been developed for animal models and even in patients; however, these methods can be costly to implement and may increase the risk to patients. We have developed an approach to building realistic patient-specific anatomic models at low cost in order to validate the guidance procedure without introducing additional risk to patients. Using a pre-procedural cardiac computed tomography scan, the blood pool of the left and right atria of a patient is segmented semi-manually. In addition, several anatomical landmarks are identified in the image data. The segmented atria and landmarks are converted into a polygonalized model, which is used to build a thin-walled patient-specific blood pool model in a stereolithography system. Thumbscrews are inserted into the model at the landmarks. The entire model is embedded in a platinum silicone material which has been shown to have tissue-mimicking properties relative to ultrasound. Once the pliable mold has set, the blood pool model is extracted by dissolving the rigid material. The resulting physical model correctly mimics a specific patient anatomy with embedded fiducials which can be used for validation experiments. The patient-specific anatomic model approach may also be used for pre-surgical practice and training of new interventionalists.

  8. CFD modelling of hydrogen stratification in enclosures: Model validation and application to PAR performance

    Energy Technology Data Exchange (ETDEWEB)

    Hoyes, J.R., E-mail:; Ivings, M.J.


    Highlights: • The ability of CFD to predict hydrogen stratification phenomena is investigated. • Contrary to expectation, simulations on tetrahedral meshes under-predict mixing. • Simulations on structured meshes give good agreement with experimental data. • CFD model used to investigate the effects of stratification on PAR performance. • Results show stratification can have a significant effect on PAR performance. - Abstract: Computational Fluid Dynamics (CFD) models are maturing into useful tools for supporting safety analyses. This paper investigates the capabilities of CFD models for predicting hydrogen stratification in a containment vessel using data from the NEA/OECD SETH2 MISTRA experiments. Further simulations are then carried out to illustrate the qualitative effects of hydrogen stratification on the performance of Passive Autocatalytic Recombiner (PAR) units. The MISTRA experiments have well-defined initial and boundary conditions which makes them well suited for use in a validation study. Results are presented for the sensitivity to mesh resolution and mesh type. Whilst the predictions are shown to be largely insensitive to the mesh resolution they are surprisingly sensitive to the mesh type. In particular, tetrahedral meshes are found to induce small unphysical convection currents that result in molecular diffusion and turbulent mixing being under-predicted. This behaviour is not unique to the CFD model used here (ANSYS CFX) and furthermore, it may affect simulations run on other non-aligned meshes (meshes that are not aligned perpendicular to gravity), including non-aligned structured meshes. Following existing best practice guidelines can help to identify potential unphysical predictions, but as an additional precaution consideration should be given to using gravity-aligned meshes for modelling stratified flows. CFD simulations of hydrogen recombination in the Becker Technologies THAI facility are presented with high and low PAR positions

  9. Understanding Dynamic Model Validation of a Wind Turbine Generator and a Wind Power Plant: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard; Zhang, Ying Chen; Gevorgian, Vahan; Kosterev, Dmitry


    Regional reliability organizations require power plants to validate the dynamic models that represent them, to ensure that power system studies are performed with the best representation of the installed components. In the process of validating a wind power plant (WPP), one must be cognizant of the parameter settings of the wind turbine generators (WTGs) and the operational settings of the WPP. The dynamic model of a WPP must be validated periodically, because the control parameters of the WTGs and of other supporting components within a WPP may be modified to comply with new grid codes, or because of upgrades to the WTG controller with new capabilities developed by the turbine manufacturers or requested by the plant owners or operators. The diversity within a WPP affects the way we represent it in a model. Diversity within a WPP may be found in the way the WTGs are controlled, the wind resource, the layout of the WPP (electrical diversity), and the types of WTGs used. Each group of WTGs constitutes a significant portion of the output power of the WPP, and their unique and salient behaviors should be represented individually. The objective of this paper is to illustrate the process of dynamic model validation for WTGs and WPPs, the recorded data that must be screened before being used for dynamic validation, and the assumptions made in the dynamic models of the WTG and WPP that must be understood. Without understanding the correct process, the validation may lead to wrong representations of the modeled WTG and WPP.

  10. Community-Wide Validation of Geospace Model Local K-Index Predictions to Support Model Transition to Operations (United States)

    Glocer, A.; Rastaetter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.


    We present the latest results of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), the NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity both for building confidence in the science results produced by the models and for assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.
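
    The contingency-table metrics mentioned above can be sketched as follows. The counts are hypothetical, and the Heidke skill score formula is the standard one from forecast verification, not necessarily the exact metric set used in the study.

```python
def skill_scores(hits, misses, false_alarms, correct_negatives):
    """Standard 2x2 contingency-table metrics used in forecast validation."""
    n = hits + misses + false_alarms + correct_negatives
    pod = hits / (hits + misses)                # probability of detection
    far = false_alarms / (hits + false_alarms)  # false alarm ratio
    # Heidke skill score: fraction correct relative to random chance
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_negatives + misses)
                * (correct_negatives + false_alarms)) / n
    hss = (hits + correct_negatives - expected) / (n - expected)
    return pod, far, hss

# Hypothetical event/no-event counts for one observatory and one model
pod, far, hss = skill_scores(hits=28, misses=12, false_alarms=8,
                             correct_negatives=52)
```

    HSS = 0 means no skill beyond chance and HSS = 1 means a perfect forecast, which makes it convenient for the event-aggregated and per-station comparisons the paper describes.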

  11. Error Modelling and Experimental Validation for a Planar 3-PPR Parallel Manipulator

    DEFF Research Database (Denmark)

    Wu, Guanglei; Bai, Shaoping; Kepler, Jørgen Asbøl


    In this paper, the positioning error of a 3-PPR planar parallel manipulator is studied with an error model and experimental validation. First, the displacement and workspace are analyzed. An error model considering both configuration errors and joint clearance errors is established. Using...... this model, the maximum positioning error was estimated for a U-shape PPR planar manipulator, the results being compared with experimental measurements. It is found that the error distributions from the simulation are close to those of the measurements....

  12. Accounting for treatment use when validating a prognostic model: a simulation study

    Directory of Open Access Journals (Sweden)

    Romin Pajouheshnia


    Full Text Available Abstract Background Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance, and present the uses and limitations of available analytical methods to account for this using simulated data. Methods We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and of inverse probability weighting (IPW) on the estimated model discrimination (c-index) and calibration (observed:expected ratio and calibration plots) in scenarios with different patterns and effects of treatment use. Results Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk when the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. Conclusions When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals, and
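
    A minimal sketch of the IPW correction described above, assuming each individual's probability of receiving treatment is known: untreated individuals are reweighted by the inverse probability of remaining untreated before the observed:expected calibration ratio is computed. The records and field names are hypothetical, not from the study.

```python
def ipw_oe_ratio(records):
    """Observed:expected ratio among untreated individuals, reweighted by
    the inverse probability of remaining untreated, 1 / (1 - p_treatment)."""
    num = den = 0.0
    for r in records:
        if r["treated"]:
            continue  # treated individuals are excluded after weighting
        w = 1.0 / (1.0 - r["p_treatment"])
        num += w * r["outcome"]          # observed events (0/1)
        den += w * r["predicted_risk"]   # model-predicted risk w/o treatment
    return num / den

# Hypothetical validation cohort
cohort = [
    {"treated": False, "p_treatment": 0.2, "outcome": 1, "predicted_risk": 0.8},
    {"treated": False, "p_treatment": 0.5, "outcome": 0, "predicted_risk": 0.3},
    {"treated": True,  "p_treatment": 0.7, "outcome": 0, "predicted_risk": 0.9},
    {"treated": False, "p_treatment": 0.1, "outcome": 1, "predicted_risk": 0.7},
]
oe = ipw_oe_ratio(cohort)
```

    An O:E ratio near 1 indicates good calibration; as the abstract notes, the weights are only trustworthy when positivity holds and the treatment model has no unobserved confounders.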

  13. Development and Validation of Methodology to Model Flow in Ventilation Systems Commonly Found in Nuclear Facilities. Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Strons, Philip [Argonne National Lab. (ANL), Argonne, IL (United States); Bailey, James L. [Argonne National Lab. (ANL), Argonne, IL (United States); Davis, John [Argonne National Lab. (ANL), Argonne, IL (United States); Grudzinski, James [Argonne National Lab. (ANL), Argonne, IL (United States); Hlotke, John [Argonne National Lab. (ANL), Argonne, IL (United States)


    In this work, we apply CFD to model airflow and particulate transport. The modeling is then compared with field validation studies to both inform and validate the modeling assumptions. Based on the results of the field tests, modeling assumptions and boundary conditions are refined and the process is repeated until the results are found to be reliable with a high level of confidence.

  14. Validation of a fluid-structure interaction numerical model for predicting flow transients in arteries. (United States)

    Kanyanta, V; Ivankovic, A; Karac, A


    Fluid-structure interaction (FSI) numerical models are now widely used in predicting blood flow transients, because of the importance of the interaction between the flowing blood and the deforming arterial wall to blood flow behaviour. Unfortunately, most of these FSI models lack rigorous validation and, thus, cannot guarantee the accuracy of their predictions. This paper presents the comprehensive validation of a two-way coupled FSI numerical model developed to predict flow transients in compliant conduits such as arteries. The model is validated using analytical solutions and experiments conducted on a polyurethane mock artery. Flow parameters such as pressure and axial stress (and precursor) wave speeds, wall deformations and oscillating frequency, fluid velocity, and Poisson coupling effects were used as the basis of this validation. Results show very good agreement between numerical predictions, analytical solutions, and experimental data; the agreement between the three approaches is generally over 95%. The model also accurately predicts Poisson coupling effects in unsteady flows through flexible pipes, which until now have only been predicted analytically. Therefore, this numerical model can accurately predict flow transients in compliant vessels such as arteries.
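
    For context, pressure wave speeds in compliant tubes such as the mock artery are classically approximated by the Moens-Korteweg relation. This is a textbook formula, not the paper's FSI model, and the material values below are illustrative rather than taken from the experiments.

```python
import math

def moens_korteweg(E, h, rho, D):
    """Pressure pulse wave speed in a thin-walled elastic tube:
    c = sqrt(E * h / (rho * D)), with Young's modulus E (Pa), wall
    thickness h (m), fluid density rho (kg/m^3), inner diameter D (m)."""
    return math.sqrt(E * h / (rho * D))

# Illustrative values loosely typical of a polyurethane mock artery:
# E = 1.0 MPa, wall 1 mm thick, water-like fluid, 10 mm diameter
c = moens_korteweg(E=1.0e6, h=1.0e-3, rho=1000.0, D=10.0e-3)
```

    Such an analytical estimate gives a quick sanity check on the wave speeds predicted by a coupled FSI simulation before a full validation campaign.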

  15. Bayesian Nonparametric Model for the Validation of Peptide Identification in Shotgun Proteomics (United States)

    Zhang, Jiyang; Ma, Jie; Dou, Lei; Wu, Songfeng; Qian, Xiaohong; Xie, Hongwei; Zhu, Yunping; He, Fuchu


    Tandem mass spectrometry combined with database searching allows high-throughput identification of peptides in shotgun proteomics. However, validating database search results, a problem for which many solutions have been proposed, still has room for improvement in aspects such as the sensitivity, specificity, and generalizability of the validation algorithms. Here a Bayesian nonparametric (BNP) model for the validation of database search results was developed that incorporates several popular techniques in statistical learning, including compression of the feature space with a linear discriminant function, flexible nonparametric probability density estimation for the variable probability structure in complex problems, and the Bayesian method to calculate the posterior probability. Importantly, the BNP model is naturally compatible with the popular target-decoy database search strategy. We tested the BNP model on standard proteins and real, complex sample data sets from multiple MS platforms and compared it with PeptideProphet, the cutoff-based method, and a simple nonparametric method (proposed by us previously). The performance of the BNP model was superior for all searched data sets in terms of sensitivity and generalizability. Some high-quality matches that had been filtered out by other methods were detected and assigned high probability by the BNP model. Thus, the BNP model can validate database search results effectively and extract more information from MS/MS data. PMID:19005226

  16. Unsuccessful validation of 2004 model for predicting academic or behavioural limitations after childhood bacterial meningitis. (United States)

    de Jonge, R C J; Sanders, M S; Terwee, C B; Heymans, M W; Gemke, R J B J; Koomen, I; Spanjaard, L; van Furth, A M


    In 2004, a model identifying children at risk of academic or behavioural limitations after bacterial meningitis (BM) was presented. Risk factors were male gender, low birthweight, lower educational level of the father, Streptococcus pneumoniae, lower cerebrospinal fluid (CSF) leucocyte count, delay between admission and start of antibiotics, dexamethasone validate that prediction model in an independent cohort. Academic or behavioural limitations were determined in 93 Dutch school-age BM survivors. Risk factors for limitations were obtained from medical files. Validation was performed by applying the model in the cohort, then assessing discrimination and goodness of fit. Multiple imputation techniques were used to deal with missing values. Although the fit of the model appeared good in terms of similarity of expected and observed cases (p-value of the Hosmer-Lemeshow test 0.24-0.57), discrimination was poor. The area under the curve (AUC) of the receiver operating characteristic (ROC) curve of the model was 0.83 (95% CI: 0.77-0.89) in the development cohort and 0.53 (95% CI: 0.41-0.65) in the validation cohort. External validation of the model was unsuccessful; it is not suitable for implementation in practice. ©2013 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
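
    The discrimination measure reported above, the AUC, can be computed directly as a rank statistic: the probability that a randomly chosen case scores above a randomly chosen non-case, with ties counting one half. The risk scores below are made up for illustration, not taken from either cohort.

```python
def auc(scores_pos, scores_neg):
    """Mann-Whitney estimate of the area under the ROC curve."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5  # ties count as half a win
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted risks for children with / without limitations
a = auc(scores_pos=[0.9, 0.6, 0.55], scores_neg=[0.5, 0.6, 0.2, 0.4])
```

    An AUC near 0.5, as found in the validation cohort here, means the model discriminates no better than chance.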

  17. Development and Validation of a Materials Preparation Model from the Perspective of Transformative Pedagogy

    Directory of Open Access Journals (Sweden)

    Hamed Barjesteh


    Full Text Available This study is a report on the design, development, and validation of a model within the main tenets of critical pedagogy (CP), intended for implementation in education in general and in applied linguistics in particular. To develop a transformative L2 materials preparation (TLMP) model, the researchers drew on Crawford's (1978) principles of CP as a springboard. These principles provide the theoretical framework of the ELT program in general, but they need to be adapted to the specific features of L2 materials development. To this end, Nation and Macalister's (2010) model of materials development was utilized as the basis for different aspects of materials preparation. The newly developed model comprises 22 principles, which were validated through a stepwise process. It was administered to 110 participants in 15 cities of Iran. Exploratory and confirmatory factor analyses were performed. The results indicated a high level of internal consistency and satisfactory construct validity. The TLMP model could be of use to language policy makers, ELT professionals, and materials and curriculum developers. Keywords: Critical pedagogy, materials development, transformative model, ELT community, development, validation

  18. Validations and improvements of airfoil trailing-edge noise prediction models using detailed experimental data

    DEFF Research Database (Denmark)

    Kamruzzaman, M.; Lutz, Th.; Würz, W.


    This paper describes an extensive assessment and a step by step validation of different turbulent boundary-layer trailing-edge noise prediction schemes developed within the European Union funded wind energy project UpWind. To validate prediction models, measurements of turbulent boundary...... and far-field radiated noise models capture well the measured peak amplitude level as well as the peak position if the turbulence noise source parameters are estimated properly including turbulence anisotropy effects. Large eddy simulation based computational aeroacoustic computations show good agreements...

  19. Experimental validation of a mathematical model for seabed liquefaction in waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen


    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d50 = 0.070 mm) in a wave flume with a soil pit. Waves with heights in the range 7.7-18 cm, a water depth of 55 cm, and a wave period of 1.6 s enabled us to study pore water pressure buildup in both the liquefaction and the no-liquefaction regime. The experimental data was used to validate the model. A numerical

  20. Validation of intercultural sensitivity three-factor model in Malaysian context

    Directory of Open Access Journals (Sweden)

    Yunus Norzita


    Full Text Available A plethora of studies have indicated the importance of intercultural sensitivity (IS) in today's highly interconnected, global world. Despite that, in the Malaysian context, few reliable instruments are available to measure intercultural sensitivity. A study using a Malaysian dataset by Tamam (2010) found a three-factor model of Chen and Starosta's five-factor Intercultural Sensitivity Scale (ISS); however, the model was exploratory and had yet to be validated. Therefore, the purpose of this study was to further validate the intercultural sensitivity three-factor model within the Malaysian collectivistic and multicultural context. Using a survey as the means of data collection, 1000 undergraduate students at three higher education institutions completed self-administered questionnaires. The three-factor model found earlier was subjected to confirmatory factor analysis using the Analysis of Moment Structures (AMOS) software. The results showed that the three-factor model had a good fit, indicating that it is a viable alternative to the original model. In conclusion, the three-factor model is a valid and reliable instrument to measure intercultural sensitivity within the Malaysian collectivistic and multicultural context.

  1. Validation of road vehicle and traffic emission models - A review and meta-analysis (United States)

    Smit, Robin; Ntziachristos, Leonidas; Boulter, Paul


    Road transport is often the main source of air pollution in urban areas, and there is an increasing need to estimate its contribution precisely so that pollution-reduction measures (e.g. emission standards, scrappage programs, traffic management, ITS) are designed and implemented appropriately. This paper presents a meta-analysis of 50 studies dealing with the validation of various types of traffic emission model, including 'average speed', 'traffic situation', 'traffic variable', 'cycle variable', and 'modal' models. The validation studies employ measurements in tunnels, ambient concentration measurements, remote sensing, laboratory tests, and mass-balance techniques. One major finding of the analysis is that several models are only partially validated or not validated at all. The mean prediction errors are generally within a factor of 1.3 of the observed values for CO2, within a factor of 2 for HC and NOx, and within a factor of 3 for CO and PM, although differences as high as a factor of 5 have been reported. A positive mean prediction error for NOx (i.e. overestimation) was established for all model types and practically all validation techniques. In the case of HC, model predictions have been moving from underestimation to overestimation since the 1980s. The large prediction error for PM may be associated with different PM definitions between models and observations (e.g. size, measurement principle, exhaust/non-exhaust contribution). Statistical analyses show that the mean prediction error is generally not significantly different (p vital elements currently lacking in traffic emissions modelling: 1) guidance on the allowable error margins for different applications/scales, and 2) estimates of prediction errors. It is recommended that current and future emission models incorporate the capability to quantify prediction errors, and that clear guidelines are developed internationally with respect to expected accuracy.
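
    The "factor of N" prediction errors quoted above can be summarized as a geometric-mean ratio of predicted to observed values, a common convention when model errors are multiplicative. The emission numbers below are hypothetical, not drawn from the meta-analysis.

```python
import math

def mean_prediction_factor(predicted, observed):
    """Geometric mean of predicted/observed ratios: 1.0 means no bias,
    2.0 means the model overestimates by a factor of two on average."""
    logs = [math.log(p / o) for p, o in zip(predicted, observed)]
    return math.exp(sum(logs) / len(logs))

# Hypothetical NOx emission rates (g/km): model prediction vs. tunnel study
factor = mean_prediction_factor(predicted=[1.2, 0.8, 2.0],
                                observed=[1.0, 1.0, 1.0])
```

    Averaging in log space keeps a factor-of-2 overestimate and a factor-of-2 underestimate from cancelling to an apparently unbiased model.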

  2. Developing and Validating a Survival Prediction Model for NSCLC Patients Through Distributed Learning Across 3 Countries. (United States)

    Jochems, Arthur; Deist, Timo M; El Naqa, Issam; Kessler, Marc; Mayo, Chuck; Reeves, Jackson; Jolly, Shruti; Matuszak, Martha; Ten Haken, Randall; van Soest, Johan; Oberije, Cary; Faivre-Finn, Corinne; Price, Gareth; de Ruysscher, Dirk; Lambin, Philippe; Dekker, Andre


    Tools for survival prediction for non-small cell lung cancer (NSCLC) patients treated with chemoradiation or radiation therapy are of limited quality. In this work, we developed a predictive model of survival at 2 years. The model is based on a large volume of historical patient data and serves as a proof of concept to demonstrate the distributed learning approach. Clinical data from 698 lung cancer patients, treated with curative intent with chemoradiation or radiation therapy alone, were collected and stored at 2 different cancer institutes (559 patients at the Maastro clinic [Netherlands] and 139 at the University of Michigan [United States]). The model was further validated on 196 patients originating from The Christie (United Kingdom). A Bayesian network model was adapted for distributed learning (the animation can be viewed at Two-year posttreatment survival was chosen as the endpoint. The Maastro clinic cohort data are publicly available at, and the developed models can be found at Variables included in the final model were T and N category, age, performance status, and total tumor dose. The model has an area under the curve (AUC) of 0.66 on the external validation set and an AUC of 0.62 on 5-fold cross-validation. A model based on the T and N category alone performed with an AUC of 0.47 on the validation set, significantly worse than our model (P<.001). Learning the model in a centralized or distributed fashion yields a minor difference in the probabilities of the conditional probability tables (0.6%); the discriminative performance of the models on the validation set is similar (P=.26). Distributed learning from federated databases allows learning of predictive models on data originating from multiple institutions while avoiding many of the data-sharing barriers. We believe

  3. Refining and validating a conceptual model of Clinical Nurse Leader integrated care delivery. (United States)

    Bender, Miriam; Williams, Marjory; Su, Wei; Hites, Lisle


    To empirically validate a conceptual model of Clinical Nurse Leader integrated care delivery. There is limited evidence of frontline care delivery models that consistently achieve quality patient outcomes. Clinical Nurse Leader integrated care delivery is a promising nursing model with a growing record of success. However, theoretical clarity is necessary to generate causal evidence of effectiveness. Sequential mixed methods. A preliminary Clinical Nurse Leader practice model was refined and survey items developed to correspond with model domains, using focus groups and a Delphi process with a multi-professional expert panel. The survey was administered in 2015 to clinicians and administrators involved in Clinical Nurse Leader initiatives. Confirmatory factor analysis and structural equation modelling were used to validate the measurement and model structure. Final sample n = 518. The model incorporates 13 components organized into five conceptual domains: 'Readiness for Clinical Nurse Leader integrated care delivery'; 'Structuring Clinical Nurse Leader integrated care delivery'; 'Clinical Nurse Leader Practice: Continuous Clinical Leadership'; 'Outcomes of Clinical Nurse Leader integrated care delivery'; and 'Value'. Sample data had good fit with the specified model and two-level measurement structure. All hypothesized pathways were significant, with strong coefficients suggesting good fit between theorized and observed path relationships. The validated model articulates an explanatory pathway of Clinical Nurse Leader integrated care delivery, including Clinical Nurse Leader practices that result in improved care dynamics and patient outcomes. The validated model provides a basis for testing in practice to generate evidence that can be deployed across the healthcare spectrum. © 2016 John Wiley & Sons Ltd.

  4. Finite Element Model of the Knee for Investigation of Injury Mechanisms: Development and Validation (United States)

    Kiapour, Ali; Kiapour, Ata M.; Kaul, Vikas; Quatman, Carmen E.; Wordeman, Samuel C.; Hewett, Timothy E.; Demetropoulos, Constantine K.; Goel, Vijay K.


    Multiple computational models have been developed to study knee biomechanics. However, the majority of these models are mainly validated against a limited range of loading conditions and/or do not include sufficient details of the critical anatomical structures within the joint. Due to the multifactorial dynamic nature of knee injuries, anatomic finite element (FE) models validated against multiple factors under a broad range of loading conditions are necessary. This study presents a validated FE model of the lower extremity with an anatomically accurate representation of the knee joint. The model was validated against tibiofemoral kinematics, ligaments strain/force, and articular cartilage pressure data measured directly from static, quasi-static, and dynamic cadaveric experiments. Strong correlations were observed between model predictions and experimental data (r > 0.8 and p ligament (ACL) and medial collateral ligament (MCL) strains, 17 N of ACL load, and 1 mm of tibiofemoral center of pressure. Similarly, the FE model was able to accurately predict tibiofemoral kinematics and ACL and MCL strains during simulated bipedal landings (dynamic loading). In addition to minimal deviation from direct cadaveric measurements, all model predictions fell within 95% confidence intervals of the average experimental data. Agreement between model predictions and experimental data demonstrates the ability of the developed model to predict the kinematics of the human knee joint as well as the complex, nonuniform stress and strain fields that occur in biological soft tissue. Such a model will facilitate the in-depth understanding of a multitude of potential knee injury mechanisms with special emphasis on ACL injury. PMID:24763546

  5. Is my model good enough? Best practices for verification and validation of musculoskeletal models and simulations of movement. (United States)

    Hicks, Jennifer L; Uchida, Thomas K; Seth, Ajay; Rajagopal, Apoorva; Delp, Scott L


    Computational modeling and simulation of neuromusculoskeletal (NMS) systems enables researchers and clinicians to study the complex dynamics underlying human and animal movement. NMS models use equations derived from physical laws and biology to help solve challenging real-world problems, from designing prosthetics that maximize running speed to developing exoskeletal devices that enable walking after a stroke. NMS modeling and simulation has proliferated in the biomechanics research community over the past 25 years, but the lack of verification and validation standards remains a major barrier to wider adoption and impact. The goal of this paper is to establish practical guidelines for verification and validation of NMS models and simulations that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies. In particular, we review a general process for verification and validation applied to NMS models and simulations, including careful formulation of a research question and methods, traditional verification and validation steps, and documentation and sharing of results for use and testing by other researchers. Modeling the NMS system and simulating its motion involves methods to represent neural control, musculoskeletal geometry, muscle-tendon dynamics, contact forces, and multibody dynamics. For each of these components, we review modeling choices and software verification guidelines; discuss variability, errors, uncertainty, and sensitivity relationships; and provide recommendations for verification and validation by comparing experimental data and testing robustness. We present a series of case studies to illustrate key principles. In closing, we discuss challenges the community must overcome to ensure that modeling and simulation are successfully used to solve the broad spectrum of problems that limit human mobility.
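One step the paper's guidelines emphasize is checking how sensitive a simulation output is to its inputs. As a toy illustration only (the model function below is a hypothetical stand-in, not an actual musculoskeletal simulation), a one-at-a-time sensitivity check might look like:

```python
# One-at-a-time sensitivity sketch: perturb each parameter by +10% and record
# the relative change in a toy model output. The "model" here is a hypothetical
# stand-in for a simulation, chosen only to keep the sketch self-contained.
def model(params):
    a, b, c = params
    return a * b + c

base = [10.0, 2.0, 5.0]
y0 = model(base)

sensitivity = {}
for i, name in enumerate(["a", "b", "c"]):
    perturbed = list(base)
    perturbed[i] *= 1.1  # +10% perturbation of one parameter at a time
    sensitivity[name] = (model(perturbed) - y0) / y0

print(sensitivity)
```

In a real study the loop body would call the simulation, and parameters with large relative effects would be the ones to characterize most carefully against experimental data.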

  6. Development and validation of a multi-body model of the canine stifle joint. (United States)

    Stylianou, Antonis P; Guess, Trent M; Cook, James L


    Multi-body musculoskeletal models that can be used concurrently to predict joint contact pressures and muscle forces would be extremely valuable in studying the mechanics of joint injury. The purpose of this study was to develop an anatomically correct canine stifle joint model and validate it against experimental data. A cadaver pelvic limb from one adult dog was used in this study. The femoral head was subjected to axial motion in a mechanical tester. Kinematic and force data were used to validate the computational model. The maximum RMS error between the predicted and measured kinematics during the complete testing cycle was 11.9 mm translational motion between the tibia and the femur and 4.3° rotation between patella and femur. This model is the first step in the development of a musculoskeletal model of the hind limb with anatomically correct joints to study cartilage loading under dynamic conditions.
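The maximum RMS error quoted above is a standard way to compare predicted and measured kinematics. A minimal sketch of the computation, with made-up numbers rather than the study's data:

```python
import numpy as np

# Hypothetical predicted vs. measured tibia-femur translation (mm) over a
# test cycle; the values are illustrative, not from the canine stifle study.
measured = np.array([0.0, 2.1, 4.0, 5.8, 7.2, 8.1])
predicted = np.array([0.3, 2.5, 4.6, 6.9, 9.0, 10.2])

rms_error = np.sqrt(np.mean((predicted - measured) ** 2))
max_abs_error = np.max(np.abs(predicted - measured))
print(round(float(rms_error), 2), round(float(max_abs_error), 2))
```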

  7. [Validation of the integration of health belief model and planned behavior theory]. (United States)

    Sun, Xin-ying; Guo, Yan; Sun, Jing


To establish and validate a new model integrating the health belief model (HBM) with the theory of planned behavior (TPB). Path analysis was applied to build a model predicting iron-fortified soy sauce consumption behavior using baseline survey data from women in rural and urban areas of Beijing, and the model was validated against follow-up survey data. Health values had a powerful direct effect on behavior identity and a relatively strong direct effect on attitudes towards the behavior; behavior identity had a strong, mostly direct effect on behavior barriers and a direct or indirect effect on behavior intention; control belief was an important external factor influencing behavior intention; behavior intention was the most direct and most important factor influencing actual behavior; and convenience of purchase was an important external factor influencing actual behavior. The integrated TPB/HBM model explains behavior better and may be applied in similar research.

  8. DMFC performance and methanol cross-over: Experimental analysis and model validation

    Energy Technology Data Exchange (ETDEWEB)

    Casalegno, A.; Marchesi, R. [Dipartimento di Energia, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)


A combined experimental and modelling approach is proposed to analyze methanol cross-over and its effect on DMFC performance. The experimental analysis is designed to allow an accurate investigation of the influence of methanol cross-over on DMFC performance; hence, measurements were characterized in terms of uncertainty and reproducibility. The findings suggest that methanol cross-over is mainly determined by diffusion transport and affects cell performance partly via methanol electro-oxidation at the cathode. A modelling analysis is carried out to further investigate the methanol cross-over phenomenon: a simple model evaluates the effectiveness of two proposed interpretations of methanol cross-over and its effects, and is validated using the experimental data gathered. Together, the experimental analysis and the validated model represent a substantial step forward in understanding the main phenomena associated with methanol cross-over. The findings confirm the possibility of reducing methanol cross-over by optimizing anode feeding. (author)

  9. Modeling and Validation across Scales: Parametrizing the effect of the forested landscape

    DEFF Research Database (Denmark)

    Dellwik, Ebba; Badger, Merete; Angelou, Nikolas

    When validating the performance of a flow model in forested areas, it is important that the model accurately represents the forest effects. This presentation concerns the use of remote-sensing technology for describing forest effects and, more specifically, how positioning lidar data can...... be transferred into a parametrization of forests in wind models. The presentation covers three scales: the single tree; forest edges and clearings; and the large-scale forested landscape, in which the forest effects are parameterized with a roughness length. Flow modeling results and validation against...... observations are presented along with the different forest representations for each of the cases. In a new research project called InnoWind, the use of satellite-based alternatives to airborne lidar campaigns is investigated, and examples of satellite products in wind power modeling are discussed....

  10. Validation of a Hot Water Distribution Model Using Laboratory and Field Data

    Energy Technology Data Exchange (ETDEWEB)

    Backman, C.; Hoeschele, M.


    Characterizing the performance of hot water distribution systems is a critical step in developing best practice guidelines for the design and installation of high performance hot water systems. Developing and validating simulation models is critical to this effort, as well as collecting accurate input data to drive the models. In this project, the ARBI team validated the newly developed TRNSYS Type 604 pipe model against both detailed laboratory and field distribution system performance data. Validation efforts indicate that the model performs very well in handling different pipe materials, insulation cases, and varying hot water load conditions. Limitations of the model include the complexity of setting up the input file and long simulation run times. In addition to completing validation activities, this project looked at recent field hot water studies to better understand use patterns and potential behavioral changes as homeowners convert from conventional storage water heaters to gas tankless units. Based on these datasets, we conclude that the current Energy Factor test procedure overestimates typical use and underestimates the number of hot water draws. This has implications for both equipment and distribution system performance. Gas tankless water heaters were found to impact how people use hot water, but the data does not necessarily suggest an increase in usage. Further study in hot water usage and patterns is needed to better define these characteristics in different climates and home vintages.

  11. Development and validation of a predictive model for excessive postpartum blood loss: A retrospective, cohort study. (United States)

    Rubio-Álvarez, Ana; Molina-Alarcón, Milagros; Arias-Arias, Ángel; Hernández-Martínez, Antonio


Postpartum haemorrhage is one of the leading causes of maternal morbidity and mortality worldwide. Despite the use of uterotonic agents as a preventive measure, it remains a challenge to identify women who are at increased risk of postpartum bleeding. Objective: to develop and validate a predictive model to assess the risk of excessive bleeding in women with vaginal birth. Design: retrospective cohort study. Setting: "Mancha-Centro Hospital" (Spain). Participants: the predictive model was built on a derivation cohort of 2336 women between 2009 and 2011; for validation, a prospective cohort of 953 women between 2013 and 2014 was employed. Women with antenatal fetal demise, multiple pregnancies, and gestations under 35 weeks were excluded. Methods: we used multivariate analysis with binary logistic regression, ridge regression, and areas under the receiver operating characteristic (ROC) curves to determine the predictive ability of the proposed model. Findings: there were 197 (8.43%) women with excessive bleeding in the derivation cohort and 63 (6.61%) in the validation cohort. Predictive factors in the final model were maternal age, primiparity, duration of the first and second stages of labour, neonatal birth weight, and antepartum haemoglobin levels. The predictive ability of the model was 0.90 (95% CI: 0.85-0.93) in the derivation cohort and 0.83 (95% CI: 0.74-0.92) in the validation cohort. Conclusion: the model showed excellent predictive ability in the derivation cohort, and validation in a later population showed similarly good predictive ability. This model can be employed to identify women at higher risk of postpartum haemorrhage. Copyright © 2017 Elsevier Ltd. All rights reserved.
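The area under the ROC curve reported above can be computed directly from predicted risks and observed outcomes via the Mann-Whitney formulation. A minimal sketch with fabricated toy data, not the study's cohort:

```python
def roc_auc(scores, labels):
    # Mann-Whitney formulation of the ROC AUC: the probability that a randomly
    # chosen positive case receives a higher predicted risk than a randomly
    # chosen negative case (ties count one half).
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predicted risks and outcomes (1 = excessive bleeding occurred).
risks = [0.9, 0.8, 0.4, 0.35, 0.2]
outcomes = [1, 1, 0, 1, 0]
print(roc_auc(risks, outcomes))  # -> 0.8333...
```

A value of 0.90, as in the derivation cohort, would mean a 90% chance that a randomly chosen case with excessive bleeding is ranked above a randomly chosen case without it.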

  12. Validation of a common data model for active safety surveillance research (United States)

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E


Objective: Systematic analysis of observational medical databases for active safety surveillance is hindered by variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to supporting their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes on large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before adopting the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example: To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion: There was acceptable representation of the data from the 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893
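The core idea of translating idiosyncratic source schemas into one common schema can be sketched in a few lines. The field names below are hypothetical illustrations, not the actual OMOP CDM tables or columns:

```python
# Toy illustration of mapping two differently shaped source records into one
# common schema. All field names here are invented for the sketch; the real
# OMOP CDM defines its own tables and standardized vocabularies.
source_a = {"pid": 1, "drug_code": "N02BE01", "start": "2020-01-01"}
source_b = {"patient": 2, "med": "N02BE01", "dispensed_on": "2020-02-01"}

def to_cdm_a(rec):
    return {"person_id": rec["pid"], "concept": rec["drug_code"],
            "start_date": rec["start"]}

def to_cdm_b(rec):
    return {"person_id": rec["patient"], "concept": rec["med"],
            "start_date": rec["dispensed_on"]}

cdm = [to_cdm_a(source_a), to_cdm_b(source_b)]
print(cdm)
```

Once every source is expressed in the common shape, a single analytic method can run unchanged across all databases, which is the property the study set out to validate.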

  13. Biomarker-based prognosis in hepatocellular carcinoma: validation and extension of the BALAD model. (United States)

    Fox, R; Berhane, S; Teng, M; Cox, T; Tada, T; Toyoda, H; Kumada, T; Kagebayashi, C; Satomura, S; Johnson, P J


The Japanese 'BALAD' model offers the first objective, biomarker-based tool for assessing prognosis in hepatocellular carcinoma, but it relies on dichotomisation of the constituent data, has not been externally validated, and cannot be applied to individual patients. In this Japanese/UK collaboration, we replicated the original BALAD model on a UK cohort and then built a new model, BALAD-2, on the original raw Japanese data using the variables in their continuous form. Regression analyses using flexible parametric models with fractional polynomials enabled fitting of appropriate baseline hazard functions and functional forms of the covariates. The resulting models were validated in the respective cohorts to measure predictive performance. The key prognostic features were confirmed to be bilirubin and albumin together with the serological cancer biomarkers AFP-L3, AFP, and DCP. With appropriate recalibration, the model offered clinically relevant discrimination of prognosis in both the Japanese and UK data sets and accurately predicted patient-level survival. The original BALAD model has thus been validated in an international setting, and the refined BALAD-2 model permits estimation of patient-level survival in UK and Japanese cohorts.

  14. Validation of Microscopic Traffic Models Based on GPS Precise Measurement of Vehicle Dynamics

    Directory of Open Access Journals (Sweden)

    Tomas Apeltauer


    Full Text Available A necessary stage in the development of traffic models is model validation, in which the developed model is verified by comparing its outputs with observed data. The most frequently used variables are the average speed, flow intensity, and flow density during a selected period. These values can be used for the calibration of macroscopic models, but one cannot always obtain a relevant microscopic dynamic model this way. A typical use of microsimulation models is capacity assessment, where this sort of data (flow, speed and queues) is considered standard and sufficient. However, microsimulation is also increasingly used for other assessments (e.g. noise and emissions) where the correct representation of each vehicle's acceleration and deceleration plays a crucial role. Another emerging area is the use of microsimulation to predict near-miss situations and conflicts in order to identify dangerous and accident-prone locations. In such assessments the vehicle trajectory, the distance from other vehicles, and the velocity and acceleration are very important. An additional source of data that can be used to validate vehicle dynamics in microsimulation models is the Global Positioning System (GPS), which can determine vehicle position with centimetre accuracy. In this article we discuss the validation of selected microscopic traffic models based on a comparison of simulated vehicle dynamics with the observed dynamic characteristics of vehicles recorded by precise geodetic GPS equipment.

  15. Multi-criteria validation of artificial neural network rainfall-runoff modeling

    Directory of Open Access Journals (Sweden)

    R. Modarres


    Full Text Available In this study we propose a comprehensive multi-criteria validation test for rainfall-runoff modelling by artificial neural networks (ANNs). The study applies 17 global statistics and 3 additional non-parametric tests to evaluate the ANNs. The weakness of global statistics for ANN validation is demonstrated by rainfall-runoff modelling of the Plasjan Basin in the western region of the Zayandehrud watershed, Iran. Although the global statistics showed that the multi-layer perceptron with 4 hidden layers (MLP4) is the best ANN for the basin compared with the other MLP networks and an empirical regression model, the non-parametric tests illustrate that neither the ANNs nor the regression model is able to reproduce the probability distribution of observed runoff in the validation phase. However, the MLP4 network is the best network at reproducing the mean and variance of the observed runoff according to the non-parametric tests. The performance of the ANNs and the empirical model was also examined for low, medium and high flows. Although the MLP4 network gives the best performance among the ANNs for low, medium and high flows based on different statistics, the empirical model shows better results. However, none of the models is able to simulate the frequency distribution of low, medium and high flows according to the non-parametric tests. This study illustrates that modellers should select appropriate and relevant evaluation measures from the set of existing metrics based on the particular requirements of each individual application.
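A typical non-parametric check of the kind described is a two-sample Kolmogorov-Smirnov comparison of the empirical distributions of observed and simulated runoff. A minimal sketch of the D statistic with synthetic numbers, not the basin data:

```python
import numpy as np

def ks_statistic(a, b):
    # Two-sample Kolmogorov-Smirnov D: the maximum vertical distance between
    # the two empirical cumulative distribution functions.
    a = np.sort(np.asarray(a, dtype=float))
    b = np.sort(np.asarray(b, dtype=float))
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

# Synthetic observed vs. simulated runoff values for illustration only.
observed = [1.0, 2.0, 3.0, 4.0]
simulated = [1.1, 2.2, 2.9, 4.3]
print(ks_statistic(observed, simulated))  # -> 0.25
```

A model can score well on global statistics such as RMSE yet still yield a large D, which is exactly the kind of distributional mismatch the study's non-parametric tests are designed to catch.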

  16. On the Validation of a Numerical Model for the Analysis of Soil-Structure Interaction Problems

    Directory of Open Access Journals (Sweden)

    Jorge Luis Palomino Tamayo

    Full Text Available Abstract Modeling and simulation of the mechanical response of structures relies on the use of computational models. Therefore, verification and validation procedures are the primary means of assessing accuracy, confidence and credibility in modeling. This paper is concerned with the validation of a three-dimensional numerical model based on the finite element method suitable for the dynamic analysis of soil-structure interaction problems. The soil mass, the structure, the structure's foundation and the appropriate boundary conditions can be represented together in a single model by using a direct approach. Biot's theory of porous media is used to represent the soil mass as a two-phase material, considered to be fully saturated with water; the other parts of the system are treated as one-phase materials. Plasticity of the soil mass is the main source of non-linearity in the problem, and therefore an iterative-incremental algorithm based on the Newton-Raphson procedure is used to solve the nonlinear equilibrium equations. For discretization in time, the Generalized Newmark-β method is used. The soil is represented by a plasticity-based, effective-stress constitutive model suitable for liquefaction. The numerical model is validated by comparing analytical and centrifuge test results for soil and soil-pile systems with the results obtained with the present numerical model. A soil-pile-structure interaction problem is also presented in order to show the potential of the numerical tool.

  17. Coloured Petri Nets and CPN Tools for Modelling and Validation of Concurrent Systems

    DEFF Research Database (Denmark)

    Jensen, Kurt; Kristensen, Lars Michael; Wells, Lisa Marie


    Coloured Petri Nets (CPNs) is a language for the modelling and validation of systems in which concurrency, communication, and synchronisation play a major role. Coloured Petri Nets is a discrete-event modelling language combining Petri Nets with the functional programming language Standard ML. Petri...... taken to execute events in the modelled system. CPN Tools is an industrial-strength computer tool for constructing and analysing CPN models. Using CPN Tools, it is possible to investigate the behaviour of the modelled system using simulation, and to verify properties by means of state space methods...

  18. Validation and refinement of an Australian customised birthweight model using routinely collected data. (United States)

    Gibbons, Kristen; Chang, Allan; Flenady, Vicki; Mahomed, Kassam; Gardener, Glenn; Gray, Peter H


    Published customised birthweight models designed to account for individual constitutional variation have not been validated in an independent population to verify the results. To validate our previously reported customised birthweight model with additional data from the same hospital and to revise this model using a larger, more refined dataset. With the accumulation of further data, a set of coefficients was derived based on the 12-year dataset. Using shrinkage statistics, records between July 2005 and December 2008 were used to validate the model. Stepwise multiple regression using a more refined dataset of births between January 1997 and December 2008 was used to derive updated coefficients. Performance of the model was assessed using individualised birthweight ratios and the absolute difference between customised and actual birthweight. Previous coefficients were validated, with shrinkage of less than 1%, indicating that the model is stable over time. An updated set of coefficients based on a dataset of 61,630 births, including refined ethnicity categories and the addition of a smoking term, is presented, which resulted in improved model statistics (primarily an improved multiple correlation coefficient of 0.51). The customised birthweight model appears to be stable over time in the same hospital. Initial comparisons to literature indicate that models from different geographic locations may lead to similar coefficients; but, there remains a need to formally assess this aspect of birthweight models. The updated coefficients differ slightly from those previously published and are considered superior because of refinement in the dataset. © 2010 The Authors. Australian and New Zealand Journal of Obstetrics and Gynaecology © 2010 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.
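The individualised birthweight ratio used to assess the model is simply the actual weight divided by the customised (model-predicted) weight. A minimal sketch with invented weights, not the study's dataset:

```python
# Hypothetical actual vs. customised (model-predicted) birthweights in grams;
# the numbers are illustrative only.
actual = [3400, 2900, 3800]
predicted = [3300, 3100, 3600]

# Individualised birthweight ratio: actual / customised prediction. A ratio
# well below 1 flags a baby smaller than its customised expectation.
ratios = [a / p for a, p in zip(actual, predicted)]
abs_diffs = [abs(a - p) for a, p in zip(actual, predicted)]
print([round(r, 3) for r in ratios], abs_diffs)
```

The study's two performance measures correspond to the distributions of `ratios` (how far individualised ratios sit from 1) and `abs_diffs` (absolute customised-vs-actual differences) over the whole cohort.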

  19. Experimental validation of finite element modelling of a modular metal-on-polyethylene total hip replacement. (United States)

    Hua, Xijin; Wang, Ling; Al-Hajjar, Mazen; Jin, Zhongmin; Wilcox, Ruth K; Fisher, John


    Finite element models are becoming increasingly useful tools to conduct parametric analysis, design optimisation and pre-clinical testing for hip joint replacements. However, the verification of the finite element model is critically important. The purposes of this study were to develop a three-dimensional anatomic finite element model for a modular metal-on-polyethylene total hip replacement for predicting its contact mechanics and to conduct experimental validation for a simple finite element model which was simplified from the anatomic finite element model. An anatomic modular metal-on-polyethylene total hip replacement model (anatomic model) was first developed and then simplified with reasonable accuracy to a simple modular total hip replacement model (simplified model) for validation. The contact areas on the articulating surface of three polyethylene liners of modular metal-on-polyethylene total hip replacement bearings with different clearances were measured experimentally in the Leeds ProSim hip joint simulator under a series of loading conditions and different cup inclination angles. The contact areas predicted from the simplified model were then compared with that measured experimentally under the same conditions. The results showed that the simplification made for the anatomic model did not change the predictions of contact mechanics of the modular metal-on-polyethylene total hip replacement substantially (less than 12% for contact stresses and contact areas). Good agreements of contact areas between the finite element predictions from the simplified model and experimental measurements were obtained, with maximum difference of 14% across all conditions considered. This indicated that the simplification and assumptions made in the anatomic model were reasonable and the finite element predictions from the simplified model were valid. © IMechE 2014.

  20. Validation of ASTEC v2.0 corium jet fragmentation model using FARO experiments

    Energy Technology Data Exchange (ETDEWEB)

    Hermsmeyer, S., E-mail:; Pla, P.; Sangiorgi, M.


    Highlights: • Model validation base extended to six FARO experiments. • Focus on the calculation of the fragmented particle diameter. • Capability and limits of the ASTEC fragmentation model. • Sensitivity analysis of model outputs. - Abstract: ASTEC is an integral code for the prediction of severe accidents in nuclear power plants. As such, it needs to cover all physical processes that could occur during accident progression, while keeping its models simple enough for the ensemble to stay manageable and produce results within an acceptable time. The present paper is concerned with the validation of the corium jet fragmentation model of ASTEC v2.0 rev3 against a selection of six experiments carried out in the FARO facility. The different conditions applied in these six experiments help to analyse the model behaviour in different situations and to expose the model's limits. In addition to comparing model outputs with experimental measurements, sensitivity analyses are applied to investigate the model. The results of the paper are (i) validation runs, accompanied by an identification of situations where the implemented fragmentation model does not match the experiments well, and a discussion of the results; (ii) special attention to the models calculating the diameter of the fragmented particles, including the identification of a fault in one implemented model and a discussion of simplifications and ad hoc modifications to improve the model fit; and (iii) an investigation of the sensitivity of the predictions to inputs and parameters. In this way, the paper offers a thorough investigation of the merits and limitations of the fragmentation model used in ASTEC.

  1. Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software (United States)

    Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.


    This paper presents a case study on simulation, modelling and analysis for the X1 motorcycle model. A motorcycle assembly plant was selected as the site of the research. Simulation techniques using the WITNESS software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and identify the significant impacts on the overall performance of the system for future improvement. The validation process starts with identification of the assembly line layout. All components are evaluated to determine whether the data are significant for future improvement. Machine and labour statistics are among the parameters evaluated for process improvement. The average total cycle time for given workstations is used as the criterion for comparing possible variants. The simulation shows that the data used are appropriate and meet the criteria for two-sided assembly line problems.

  2. Projecting global tropical cyclone economic damages with validation of tropical cyclone economic damage model (United States)

    Iseri, Y.; Iwasaki, A.; Miyazaki, C.; Kanae, S.


    Tropical cyclones (TCs) sometimes cause serious damage to human society, and possible future changes in TC properties are therefore a concern. The Fifth Assessment Report (AR5) of the IPCC (Intergovernmental Panel on Climate Change) notes a likely increase in the intensity and rain rate of TCs. In addition, future changes in socioeconomic conditions (e.g. population growth) might worsen TC impacts. In this study, we therefore developed regression models to estimate the economic damage caused by TCs (hereafter, TC damage models) and employed those models to project TC economic damage under several future climate and socioeconomic scenarios. We developed TC damage models for each of four regions: the western North Pacific, North America, the North Indian Ocean, and the Southern Hemisphere. The inputs to each TC damage model are the tropical cyclone central pressure, the population in the area exposed to tropical cyclone winds, and GDP (gross domestic product) per capita. The TC damage models we first developed tended to overestimate very low damages and underestimate very high damages, so we modified the model structure to improve performance and then carried out an extensive validation. The modified models performed better in estimating both very low and very high TC damages. After this modification and validation, we fixed the structure of the TC damage models and projected TC economic damage. The results indicate an increase in TC economic damage at the global scale, while TC economic damage as a fraction of world GDP would decrease in the future, a result consistent with previous studies.
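A regression model of this general shape, damage as a function of central pressure, exposed population, and GDP per capita, can be sketched with ordinary least squares on log-transformed damage. Everything below, including the functional form and the coefficients, is an illustrative assumption rather than the authors' actual model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
pressure = rng.uniform(900.0, 1000.0, n)  # TC central pressure (hPa)
log_pop = rng.uniform(10.0, 16.0, n)      # log of exposed population
log_gdp = rng.uniform(7.0, 11.0, n)       # log of GDP per capita

# Synthetic "true" coefficients, used only to generate data for the sketch.
true_coef = np.array([5.0, -0.05, 0.8, 0.3])
X = np.column_stack([np.ones(n), pressure, log_pop, log_gdp])
log_damage = X @ true_coef  # noise-free response so the fit is exact

# Ordinary least squares recovers the coefficients from the design matrix.
fitted, *_ = np.linalg.lstsq(X, log_damage, rcond=None)
print(fitted)
```

With real, noisy damage records the fit would not be exact, and the over/underestimation at the tails that the authors describe would show up as systematic residuals at very low and very high damages.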

  3. Development and validation of logistic prognostic models by predefined SAS-macros

    Directory of Open Access Journals (Sweden)

    Ziegler, Christoph


    Full Text Available In medical decision making about therapies or diagnostic procedures, prognoses of the course or severity of a disease play a relevant role. Besides the clinician's subjective judgment, mathematical models can help provide such prognoses. These models are mostly multivariate regression models; in the case of a dichotomous outcome, the logistic model is applied as the standard. In this paper we describe SAS macros for the development of such a model, for examination of its prognostic performance, and for model validation. The rationale for this approach to prognostic modelling and the description of the macros can only be given briefly in this paper; many more details are given in the accompanying reference. These 14 SAS macros are a tool for setting up the whole process of deriving a prognostic model. In particular, the possibility of validating the model with a standardized software tool provides an opportunity that is not generally exploited in published prognostic models. This can therefore help in developing new models with good prognostic performance for use in medical applications.

  4. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data. (United States)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L; Al, Maiwenn J


    The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of graphs and point estimates. To discuss the applicability of existing validation techniques and to present a new method for quantifying the degrees of validity statistically, which is useful for decision makers. A new Bayesian method is proposed to determine how well HE model outcomes compare with empirical data. Validity is based on a pre-established accuracy interval in which the model outcomes should fall. The method uses the outcomes of a probabilistic sensitivity analysis and results in a posterior distribution around the probability that HE model outcomes can be regarded as valid. We use a published diabetes model (Modelling Integrated Care for Diabetes based on Observational data) to validate the outcome "number of patients who are on dialysis or with end-stage renal disease." Results indicate that a high probability of a valid outcome is associated with relatively wide accuracy intervals. In particular, 25% deviation from the observed outcome implied approximately 60% expected validity. Current practice in HE model validation can be improved by using an alternative method based on assessing whether the model outcomes fit to empirical data at a predefined level of accuracy. This method has the advantage of assessing both model bias and parameter uncertainty and resulting in a quantitative measure of the degree of validity that penalizes models predicting the mean of an outcome correctly but with overly wide credible intervals. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
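The core computation in such a method can be sketched as counting how many probabilistic sensitivity analysis (PSA) samples fall inside a pre-established accuracy interval and forming a Beta posterior over the probability of a valid outcome. This is a schematic reconstruction under stated assumptions (normal stand-in PSA samples, a Beta(1, 1) prior, a ±25% interval), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

observed = 100.0  # empirical outcome, hypothetical units
# Stand-in for the HE model's PSA outcome samples (fabricated distribution).
psa_samples = rng.normal(110.0, 20.0, size=1000)

# Accuracy interval: a model outcome counts as "valid" if it lies within
# +/-25% of the observed value, mirroring the 25%-deviation case discussed.
lo, hi = 0.75 * observed, 1.25 * observed
k = int(np.sum((psa_samples >= lo) & (psa_samples <= hi)))
n = len(psa_samples)

# Beta(1, 1) prior on the validity probability -> Beta(1 + k, 1 + n - k)
# posterior; its mean summarizes the degree of validity.
posterior_mean = (1 + k) / (2 + n)
print(k, n, posterior_mean)
```

Because validity is judged against the whole PSA cloud rather than a point estimate, a model that centers on the observed value but with overly wide credible intervals is penalized, which is the property the abstract highlights.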


    Energy Technology Data Exchange (ETDEWEB)

    Jin, M.; Manchester, W. B.; Van der Holst, B.; Gruesbeck, J. R.; Frazin, R. A.; Landi, E.; Toth, G.; Gombosi, T. I. [Atmospheric Oceanic and Space Sciences, University of Michigan, Ann Arbor, MI 48109 (United States); Vasquez, A. M. [Instituto de Astronomia y Fisica del Espacio (CONICET-UBA) and FCEN (UBA), CC 67, Suc 28, Ciudad de Buenos Aires (Argentina); Lamy, P. L.; Llebaria, A.; Fedorov, A. [Laboratoire d'Astrophysique de Marseille, Universite de Provence, Marseille (France)


    The recent solar minimum with very low activity provides a unique opportunity for validating solar wind models. During CR2077 (2008 November 20 through December 17), the number of sunspots was near the absolute minimum of solar cycle 23. For this solar rotation, we perform a multi-spacecraft validation study for the recently developed three-dimensional, two-temperature, Alfven-wave-driven global solar wind model (a component of the Space Weather Modeling Framework). Using in situ observations from the Solar Terrestrial Relations Observatory (STEREO) A and B, the Advanced Composition Explorer (ACE), and Venus Express, we compare the observed proton state (density, temperature, and velocity) and magnetic field of the heliosphere with those predicted by the model. Near the Sun, we validate the numerical model against the electron density obtained from solar rotational tomography of Solar and Heliospheric Observatory/Large Angle and Spectrometric Coronagraph C2 data in the range of 2.4 to 6 solar radii. Electron temperature and density are determined from differential emission measure tomography (DEMT) of STEREO A and B Extreme Ultraviolet Imager data in the range of 1.035 to 1.225 solar radii. The electron density and temperature derived from Hinode/Extreme Ultraviolet Imaging Spectrometer data are also compared with the DEMT results as well as the model output. Moreover, for the first time, we compare ionic charge states of carbon, oxygen, silicon, and iron observed in situ by the ACE/Solar Wind Ion Composition Spectrometer against those predicted by our model. The validation results suggest that most of the model outputs for CR2077 fit the observations very well. Based on this encouraging result, we expect substantial improvement in future modeling of coronal mass ejections (CMEs) and CME-driven shocks.

  6. Construct and criterion-related validation of nutrient profiling models: A systematic review of the literature. (United States)

    Cooper, Sheri L; Pelly, Fiona E; Lowe, John B


    Nutrient profiling (NP) is defined as the science of ranking foods according to their nutritional composition for the purpose of preventing disease or promoting health. The application of NP is ultimately to assist consumers to make healthier food choices, and thus provide a cost-effective public health strategy to reduce the incidence of diet-related chronic disease. To our knowledge, no review has assessed the evidence to confirm the validity of NP models. We conducted a systematic review to investigate the construct and criterion-related validity of NP models in ranking food according to its nutritional composition for the purpose of preventing disease and promoting health. We searched peer-reviewed research published to 30 June 2015, using the PUBMED, Global Health (CABI), and SCOPUS databases. Within-study bias was assessed using an adapted version of the QUADAS-2 (Quality Assessment of Diagnostic Accuracy Studies-2) tool for all diagnostic studies and the Cochrane Collaboration's Risk of Bias tool for all non-diagnostic studies. The GRADE (Grades of Recommendation, Assessment, Development, and Evaluation) approach was used to guide our judgement of the quality of the body of evidence for each outcome measure. From a total of 83 studies, 69 confirmed the construct validity of NP models; however, most of these studies contained methodological weaknesses. Six studies used objective external measures to confirm the criterion-related validity of NP models, which inherently improved quality. The overall quality of evidence on the accuracy of NP models was judged to be very low to moderate using the GRADE approach. Many carefully designed studies to establish both construct and criterion-related validity are necessary to authenticate the application of NP models and provide the evidence to support the current definition of NP. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Water balance at an arid site: a model validation study of bare soil evaporation

    Energy Technology Data Exchange (ETDEWEB)

    Jones, T.L.; Campbell, G.S.; Gee, G.W.


    This report contains results of model validation studies conducted by Pacific Northwest Laboratory (PNL) for the Department of Energy's (DOE) National Low-Level Waste Management Program (NLLWMP). The model validation tests consisted of using unsaturated water flow models to simulate water balance experiments conducted at the Buried Waste Test Facility (BWTF) at DOE's Hanford site near Richland, Washington. The BWTF is a lysimeter facility designed to collect field data on long-term water balance and radionuclide tracer movement; it has been operated by PNL for the NLLWMP since 1978. An experimental test case, developed from data collected at the BWTF, was used to evaluate predictions from different water flow models. The major focus of the validation study was to evaluate how the choice of evaporation submodel affected the accuracy of the whole model's predictions of evaporation, storage, and drainage. Four evaporation models were tested: two empirical and two mechanistic. The empirical models estimate actual evaporation from potential evaporation; the mechanistic models describe water vapor diffusion within the soil profile and between the soil and the atmosphere in terms of fundamental soil properties and transport processes. The water flow models that included the diffusion-type evaporation submodels performed best overall. The empirical models performed poorly in their description of evaporation and profile water storage during summer months. The predictions of drainage were supported quite well by the experimental data, indicating that the method used to estimate the hydraulic conductivity needed for the Darcian submodel was adequate. This important result supports recommendations for these procedures made previously on the basis of laboratory results.
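The bookkeeping underlying such lysimeter comparisons is the water balance P = E + D + ΔS (precipitation partitioned into evaporation, drainage, and change in profile storage). A minimal closure check, with all quantities hypothetical, looks like:

```python
def water_balance_residual(precip, evaporation, drainage, d_storage):
    """Residual of the water balance P = E + D + dS (all terms in mm).
    A residual near zero means the simulated (or measured) components
    close the balance; a large residual flags a model or data problem."""
    return precip - (evaporation + drainage + d_storage)

# Hypothetical monthly totals (mm) for an arid site
residual = water_balance_residual(precip=16.0, evaporation=12.5,
                                  drainage=2.0, d_storage=1.5)
print(f"balance residual: {residual:.1f} mm")  # prints: balance residual: 0.0 mm
```

In a validation study like the one above, each simulated component (E, D, ΔS) is additionally compared term by term against the lysimeter measurements, not just checked for closure.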

  8. Validation and calibration of a computer simulation model of pediatric HIV infection.

    Directory of Open Access Journals (Sweden)

    Andrea L Ciaranello

    Full Text Available Computer simulation models can project long-term patient outcomes and inform health policy. We internally validated and then calibrated a model of HIV disease in children before initiation of antiretroviral therapy to provide a framework against which to compare the impact of pediatric HIV treatment strategies. We developed a patient-level (Monte Carlo) model of HIV progression among untreated children, drawing on data from more than 1,300 untreated, HIV-infected African children. In internal validation analyses, model-generated survival curves fit IeDEA data well; modeled and observed survival at 16 months of age were 91.2% and 91.1%, respectively. RMSE varied widely with variations in CD4% parameters; the best-fitting parameter set (RMSE = 0.00423) resulted when CD4% was 45% at birth and declined by 6%/month (ages 0-3 months) and 0.3%/month (ages >3 months). In calibration analyses, increases in IeDEA-derived mortality risks were necessary to fit UNAIDS survival data. The CEPAC-Pediatric model performed well in internal validation analyses. Increases in modeled mortality risks required to match UNAIDS data highlight the importance of pre-enrollment mortality in many pediatric cohort studies.
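The parameter search described above — scoring candidate CD4% trajectories by the root-mean-square error (RMSE) between modeled and observed survival — can be illustrated with a small sketch. The survival values below are made up for illustration; they are not the IeDEA data:

```python
import math

def rmse(modeled, observed):
    """Root-mean-square error between modeled and observed survival
    fractions evaluated at the same ages."""
    assert len(modeled) == len(observed)
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(modeled, observed))
                     / len(modeled))

# Hypothetical survival fractions at ages 4, 8, 12, 16 months
observed    = [0.97, 0.95, 0.93, 0.911]
candidate_a = [0.96, 0.94, 0.92, 0.912]  # one CD4%-decline parameter set
candidate_b = [0.99, 0.90, 0.88, 0.850]  # another parameter set

# Pick the parameter set whose survival curve best fits the observations
best = min([candidate_a, candidate_b], key=lambda m: rmse(m, observed))
print(best is candidate_a)  # candidate_a has the lower RMSE here
```

In the study this comparison is run over a grid of CD4%-at-birth and CD4%-decline parameters, and the set minimizing RMSE is retained.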

  9. Validation Assessment of a Glass-to-Metal Seal Finite-Element Model

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, Ryan Dale [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Buchheit, Thomas E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Emery, John M [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stavig, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Newton, Clay S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brown, Arthur [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    Sealing glasses are ubiquitous in high-pressure and high-temperature engineering applications, such as hermetic feed-through electrical connectors. A common connector technology is the glass-to-metal seal, in which a metal shell compresses a sealing glass to create a hermetic seal. Though finite-element analysis has been used to understand and design glass-to-metal seals for many years, there has been little validation of these models. An indentation technique was employed to measure the residual stress on the surface of a simple glass-to-metal seal. Recently developed rate-dependent material models of both Schott 8061 glass and 304L VAR stainless steel were applied to a finite-element model of the simple glass-to-metal seal. Model predictions of residual stress based on the evolution of the material models are shown and compared to measured data, and the validity of the finite-element predictions is discussed. It is shown that the finite-element model of the glass-to-metal seal accurately predicts the mean residual stress in the glass near the glass-to-metal interface and is valid for this quantity of interest.

  10. Validation Testing of a Peridynamic Impact Damage Model Using NASA's Micro-Particle Gun (United States)

    Baber, Forrest E.; Zelinski, Brian J.; Guven, Ibrahim; Gray, Perry


    Through a collaborative effort between the Virginia Commonwealth University and Raytheon, a peridynamic model for sand impact damage has been developed [1-3]. Model development has focused on simulating impacts of sand particles on ZnS traveling at velocities consistent with aircraft take-off and landing speeds. The model reproduces common features of impact damage including pit and radial cracks, and, under some conditions, lateral cracks. This study focuses on a preliminary validation exercise in which simulation results from the peridynamic model are compared to a limited experimental data set generated by NASA's recently developed micro-particle gun (MPG). The MPG facility measures the dimensions and incoming and rebound velocities of the impact particles. It also links each particle to a specific impact site and its associated damage. In this validation exercise, parameters of the peridynamic model are adjusted to fit the experimentally observed pit diameter, average length of radial cracks, and rebound velocities for 4 impacts of 300 µm glass beads on ZnS. Results indicate that a reasonable fit of these impact characteristics can be obtained by suitable adjustment of the peridynamic input parameters, demonstrating that the MPG can be used effectively as a validation tool for impact modeling and that the peridynamic sand impact model described herein possesses not only a qualitative but also a quantitative ability to simulate sand impact events.

  11. Validation of ASTEC V2 models for the behaviour of corium in the vessel lower head

    Energy Technology Data Exchange (ETDEWEB)

    Carénini, L.; Fleurot, J.; Fichot, F.


    The paper is devoted to the presentation of validation cases carried out for the models describing corium behaviour in the “lower plenum” of the reactor vessel, implemented in the V2.0 version of the ASTEC integral code, jointly developed by IRSN (France) and GRS (Germany). In the ASTEC architecture, these models are grouped within the single ICARE module and are all activated in typical accident scenarios. It is therefore important to check the validity of each individual model wherever experiments are available in which a single physical process is involved. The results of ASTEC applications to the following experiments are presented: FARO (corium jet fragmentation), LIVE (heat transfer between a molten pool and the vessel), MASCA (separation and stratification of immiscible corium phases) and OLHF (mechanical failure of the vessel). Compared to the previous ASTEC V1.3 version, the validation matrix is extended. This work allows recommended values to be determined for some model parameters (e.g. debris particle size in the fragmentation model and the criterion for debris bed liquefaction). Almost all the processes governing corium behaviour, its thermal interaction with the vessel wall and the vessel failure are modelled in ASTEC, and these models have been assessed individually with satisfactory results. The main uncertainties appear to be related to the calculation of transient evolutions.

  12. Validation of risk stratification models in acute myeloid leukemia using sequencing-based molecular profiling. (United States)

    Wang, M; Lindberg, J; Klevebring, D; Nilsson, C; Mer, A S; Rantalainen, M; Lehmann, S; Grönberg, H


    Risk stratification of acute myeloid leukemia (AML) patients needs improvement. Several AML risk classification models based on somatic mutations or gene-expression profiling have been proposed. However, systematic and independent validation of these models is required for future clinical implementation. We performed whole-transcriptome RNA-sequencing and panel-based deep DNA sequencing of 23 genes in 274 intensively treated AML patients (Clinseq-AML). We also utilized The Cancer Genome Atlas (TCGA)-AML study (N=142) as a second validation cohort. We evaluated six previously proposed molecular-based models for AML risk stratification and two revised risk classification systems combining molecular and clinical data. Risk groups stratified by five out of six models showed different overall survival in cytogenetically normal AML patients in the Clinseq-AML cohort (P-value < 0.05). Risk classification systems integrating mutational or gene-expression data were found to add prognostic value to the current European LeukemiaNet (ELN) risk classification. The prognostic value varied between models and across cohorts, highlighting the importance of independent validation to establish evidence of efficacy and general applicability. All but one model replicated in the Clinseq-AML cohort, indicating the potential for molecular-based AML risk models. Risk classification based on a combination of molecular and clinical data holds promise for improved AML patient stratification in the future.
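One generic way to quantify what "independent validation" of a risk model means — do patients with higher predicted risk actually have shorter observed survival? — is Harrell's concordance index. This is offered as a standard illustration, not necessarily the statistic used in the study above; the cohort data are invented:

```python
def concordance_index(risk, time, event):
    """Fraction of usable patient pairs in which the patient with the
    higher predicted risk has the shorter observed survival time.
    A pair is usable only if the shorter follow-up ends in an event
    (a censored shorter time tells us nothing about ordering)."""
    concordant = usable = 0.0
    n = len(risk)
    for i in range(n):
        for j in range(i + 1, n):
            # order so patient a has the shorter follow-up time
            a, b = (i, j) if time[i] < time[j] else (j, i)
            if not event[a]:          # shorter time censored: skip pair
                continue
            usable += 1
            if risk[a] > risk[b]:
                concordant += 1
            elif risk[a] == risk[b]:
                concordant += 0.5     # ties get half credit
    return concordant / usable

# Hypothetical validation cohort: higher risk score, shorter survival
risk  = [3, 2, 1, 0]          # model-predicted risk scores
time  = [5, 12, 30, 60]       # observed survival, months
event = [1, 1, 0, 1]          # 1 = death observed, 0 = censored
print(concordance_index(risk, time, event))  # perfectly concordant: 1.0
```

A c-index of 0.5 corresponds to random ordering; a model that replicates in an external cohort keeps a c-index well above 0.5 there too.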

  13. Mathematical modeling and validation in physiology applications to the cardiovascular and respiratory systems

    CERN Document Server

    Bachar, Mostafa; Kappel, Franz


    This volume synthesizes theoretical and practical aspects of both the mathematical and life science viewpoints needed for modeling of the cardiovascular-respiratory system specifically and physiological systems generally. Theoretical points include model design, model complexity and validation in the light of available data, as well as control theory approaches to feedback delay and Kalman filter applications to parameter identification. State-of-the-art approaches using parameter sensitivity are discussed for enhancing model identifiability through joint analysis of model structure and data. Practical examples illustrate model development at various levels of complexity based on given physiological information. The sensitivity-based approaches for examining model identifiability are illustrated by means of specific modeling examples. The themes presented address the current problem of patient-specific model adaptation in the clinical setting, where data is typically limited.

  14. Validation of standardized and structured EHR data in the dual model approach. (United States)

    Rinner, Christoph; Duftschmid, Georg


    We present a W3C XML Schema-based method to validate standardized EHR data against semantic constraints that build the knowledge layer within the dual model approach. The approach was tested with three EN/ISO13606 archetypes and an HL7 CDA implementation guide for diabetes therapies.

  15. Measurement and data analysis methods for field-scale wind erosion studies and model validation

    NARCIS (Netherlands)

    Zobeck, T.M.; Sterk, G.; Funk, R.F.; Rajot, J.L.; Stout, J.E.; Scott Van Pelt, R.


    Accurate and reliable methods of measuring windblown sediment are needed to confirm, validate, and improve erosion models, assess the intensity of aeolian processes and related damage, determine the source of pollutants, and for other applications. This paper outlines important principles to

  16. Validating a model of effective teaching behaviour of pre-service teachers

    NARCIS (Netherlands)

    Maulana, Ridwan; Lorenz, Michelle; van de Grift, Wim


    Although effective teaching behaviour is central for pupil outcomes, the extent to which pre-service teachers behave effectively in the classroom and how their behaviour relates to pupils’ engagement remain unanswered. The present study aims to validate a theoretical model linking effective

  17. Experimental Validation of a Mathematical Model for Seabed Liquefaction Under Waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen


    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d(50) = 0...

  18. A microvascular compartment model validated using 11C-methylglucose liver PET in pigs

    DEFF Research Database (Denmark)

    Munk, Ole Lajord; Keiding, Susanne; Baker, Charles


    The standard compartment model (CM) is widely used to analyze dynamic PET data. The CM is fitted to time-activity curves to estimate rate constants that describe the transport of tracer between well-mixed compartments. The aim of this study was to develop and validate a more realistic microvascul...

  19. Validating a Model of Effective Teaching Behaviour of Pre-Service Teachers (United States)

    Maulana, Ridwan; Helms-Lorenz, Michelle; Van de Grift, Wim


    Although effective teaching behaviour is central for pupil outcomes, the extent to which pre-service teachers behave effectively in the classroom and how their behaviour relates to pupils' engagement remain unanswered. The present study aims to validate a theoretical model linking effective pre-service teaching behaviour and pupil's engagement,…

  20. Experimental validation of a dynamic waste heat recovery system model for control purposes

    NARCIS (Netherlands)

    Feru, E.; Kupper, F.; Rojer, C.; Seykens, X.L.J.; Scappin, F.; Willems, F.P.T.; Smits, J.; Jager, B. de; Steinbuch, M.


    This paper presents the identification and validation of a dynamic Waste Heat Recovery (WHR) system model. Driven by upcoming CO2 emission targets and increasing fuel costs, engine exhaust gas heat utilization has recently attracted much attention to improve fuel efficiency, especially for

  1. Validation of Hydrodynamic Numerical Model of a Pitching Wave Energy Converter

    DEFF Research Database (Denmark)

    López, Maria del Pilar Heras; Thomas, Sarah; Kramer, Morten Mejlhede


    Validation of numerical models is essential in the development of new technologies. Commercial software and codes available for simulating wave energy converters (WECs) have not yet been proven to work for all available and upcoming technologies. The present paper presents the first stages of the ...

  2. Validation of an integral conceptual model of frailty in older residents of assisted living facilities

    NARCIS (Netherlands)

    Gobbens, Robbert J J; Krans, Anita; van Assen, Marcel A L M


    Objective: The aim of this cross-sectional study was to examine the validity of an integral model of the associations between life-course determinants, disease(s), frailty, and adverse outcomes in older persons who are resident in assisted living facilities. Methods: Between June 2013 and May 2014

  3. Understanding and Measuring Evaluation Capacity: A Model and Instrument Validation Study (United States)

    Taylor-Ritzler, Tina; Suarez-Balcazar, Yolanda; Garcia-Iriarte, Edurne; Henry, David B.; Balcazar, Fabricio E.


    This study describes the development and validation of the Evaluation Capacity Assessment Instrument (ECAI), a measure designed to assess evaluation capacity among staff of nonprofit organizations that is based on a synthesis model of evaluation capacity. One hundred and sixty-nine staff of nonprofit organizations completed the ECAI. The 68-item…

  4. Validation of an ANN Flow Prediction Model Using a Multi-Station Cluster Analysis

    NARCIS (Netherlands)

    Demirel, M.C.; Booij, Martijn J.; Kahya, E.


    The objective of this study is to validate a flow prediction model for a hydrometric station using a multistation criterion in addition to standard single-station performance criteria. In this contribution we used cluster analysis to identify the regional flow height, i.e., water-level patterns and

  5. Flexible Programmes in Higher Professional Education: Expert Validation of a Flexible Educational Model (United States)

    Schellekens, Ad; Paas, Fred; Verbraeck, Alexander; van Merrienboer, Jeroen J. G.


    In a preceding case study, a process-focused demand-driven approach for organising flexible educational programmes in higher professional education (HPE) was developed. Operations management and instructional design contributed to designing a flexible educational model by means of discrete-event simulation. Educational experts validated the model…

  6. The impact of school leadership on school level factors: validation of a causal model

    NARCIS (Netherlands)

    Krüger, M.L.; Witziers, B.; Sleegers, P.


    This study aims to contribute to a better understanding of the antecedents and effects of educational leadership, and of the influence of the principal's leadership on intervening and outcome variables. A path analysis was conducted to test and validate a causal model. The results show no direct or

  7. Development and internal validation of a multivariable model to predict perinatal death in pregnancy hypertension

    NARCIS (Netherlands)

    Payne, Beth A.; Groen, Henk; Ukah, U. Vivian; Ansermino, J. Mark; Bhutta, Zulfiqar; Grobman, William; Hall, David R.; Hutcheon, Jennifer A.; Magee, Laura A.; von Dadelszen, Peter


    Objective: To develop and internally validate a prognostic model for perinatal death that could guide community-based antenatal care of women with a hypertensive disorder of pregnancy (HDP) in low-resourced settings as part of a mobile health application. Study design: Using data from 1688 women

  8. Validation of an integral conceptual model of frailty in older residents of assisted living facilities

    NARCIS (Netherlands)

    Gobbens, R.J.J.; Krans, A.; van Assen, M.A.L.M.


    Objective: The aim of this cross-sectional study was to examine the validity of an integral model of the associations between life-course determinants, disease(s), frailty, and adverse outcomes in older persons who are resident in assisted living facilities. Methods: Between June 2013 and May 2014

  9. Validation of DSM-IV Model of Psychiatric Syndromes in Children with Autism Spectrum Disorders (United States)

    Lecavalier, Luc; Gadow, Kenneth D.; DeVincent, Carla J.; Edwards, Michael C.


    The objective of this study was to assess the internal construct validity of the DSM-IV as a conceptual model for characterizing behavioral syndromes in children with ASD. Parents and teachers completed the Child Symptom Inventory-4, a DSM-IV-referenced rating scale, for 6- to 12-year-old clinic referrals with an ASD (N = 498). Ratings were…

  10. Analysis and classification of data sets for calibration and validation of agro-ecosystem models

    DEFF Research Database (Denmark)

    Kersebaum, K C; Boote, K J; Jorgenson, J S


    Experimental field data are used at different levels of complexity to calibrate, validate and improve agro-ecosystem models to enhance their reliability for regional impact assessment. A methodological framework and software are presented to evaluate and classify data sets into four classes...

  11. Comprehension of Multiple Documents with Conflicting Information: A Two-Step Model of Validation (United States)

    Richter, Tobias; Maier, Johanna


    In this article, we examine the cognitive processes that are involved when readers comprehend conflicting information in multiple texts. Starting from the notion of routine validation during comprehension, we argue that readers' prior beliefs may lead to a biased processing of conflicting information and a one-sided mental model of controversial…

  12. The PRKC's Value as a Professional Development Model Validated (United States)

    Larson, Dale


    After a brief review of the 4-H professional development standards, a new model for determining the value of continuing professional development is introduced and applied to the 4-H standards. The validity of the 4-H standards is affirmed. 4-H Extension professionals are encouraged to celebrate the strength of their standards and to engage the…

  13. Mathematical Capture of Human Data for Computer Model Building and Validation (United States)


    Gladstone Reid, MSBME; Gordon Cooke, MEME; Robert Demarco, MSBME; Caitlin Weaver, MSBME, MSME; John Riedener, MSSE

  14. A new criterion for assessing discriminant validity in variance-based structural equation modeling

    NARCIS (Netherlands)

    Henseler, Jörg; Ringle, Christian M.; Sarstedt, Marko


    Discriminant validity assessment has become a generally accepted prerequisite for analyzing relationships between latent variables. For variance-based structural equation modeling, such as partial least squares, the Fornell-Larcker criterion and the examination of cross-loadings are the dominant
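The criterion this paper introduces, the heterotrait-monotrait ratio of correlations (HTMT), compares the average correlations between indicators of different constructs with those among indicators of the same construct; values well below 1 (common cutoffs are 0.85 or 0.90) indicate discriminant validity. A minimal sketch for two constructs, using a made-up indicator correlation matrix:

```python
import math

def htmt(corr, block1, block2):
    """Heterotrait-monotrait ratio for two constructs, given an
    indicator correlation matrix (list of lists) and the indicator
    index sets (blocks) of the two constructs."""
    # correlations across constructs (heterotrait-heteromethod)
    hetero = [abs(corr[i][j]) for i in block1 for j in block2]
    # correlations within each construct (monotrait-heteromethod)
    mono1 = [abs(corr[i][j]) for i in block1 for j in block1 if i < j]
    mono2 = [abs(corr[i][j]) for i in block2 for j in block2 if i < j]
    mean = lambda xs: sum(xs) / len(xs)
    return mean(hetero) / math.sqrt(mean(mono1) * mean(mono2))

# Hypothetical 4-indicator matrix: x1, x2 load on construct A; x3, x4 on B
R = [[1.00, 0.70, 0.30, 0.25],
     [0.70, 1.00, 0.35, 0.30],
     [0.30, 0.35, 1.00, 0.65],
     [0.25, 0.30, 0.65, 1.00]]
value = htmt(R, block1=[0, 1], block2=[2, 3])
print(value < 0.85)  # below the conservative HTMT.85 cutoff
```

In practice the paper also recommends bootstrapping a confidence interval for HTMT and checking that it excludes 1.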

  15. Dynamics of childhood growth and obesity development and validation of a quantitative mathematical model (United States)

    Clinicians and policy makers need the ability to predict quantitatively how childhood bodyweight will respond to obesity interventions. We developed and validated a mathematical model of childhood energy balance that accounts for healthy growth and development of obesity, and that makes quantitative...

  16. Creating a benchmark of vertical axis wind turbines in dynamic stall for validating numerical models

    DEFF Research Database (Denmark)

    Castelein, D.; Ragni, D.; Tescione, G.


    An experimental campaign using the two-component Particle Image Velocimetry (2C-PIV) technique has been conducted on an H-type Vertical Axis Wind Turbine (VAWT) to create a benchmark for validating and comparing numerical models. The turbine is operated at tip speed ratios (TSR) of 4.5 and 2, at an average chord...

  17. Modelling and validation approach for the development of a yaw controlled passenger car

    NARCIS (Netherlands)

    Lupker, H.A.; Jansen, S.T.H.


    The objective of this paper is to present a systematic modelling approach to vehicle handling simulation, validation and optimisation. Moreover, special attention will be paid to the use of mathematical tools for the development of a passenger car with active rear wheel steering for yaw motion

  18. Validation of a Theoretical Model of Diagnostic Classroom Assessment: A Mixed Methods Study (United States)

    Koh, Nancy


    The purpose of the study was to validate a theoretical model of diagnostic, formative classroom assessment called "Proximal Assessment for Learner Diagnosis" (PALD). To achieve its purpose, the study employed a two-stage, mixed-methods design. The study utilized multiple data sources from 11 elementary-level mathematics teachers who…

  19. Validation Of Naval Platform Electromagnetic Tools Via Model And Full-Scale Measurements

    NARCIS (Netherlands)

    van der Graaff, Jasper; Leferink, Frank Bernardus Johannes


    Reliable EMC predictions are very important in the design of a naval platform's topside. Currently, EMC predictions for a Navy ship are verified by scale-model and full-scale measurements. In the near future, the validation of software tools will lead to increased confidence in EMC predictions and

  20. Towards a Generic Information Data Model for Verification, Validation & Accreditation VV&A

    NARCIS (Netherlands)

    Roza, Z.C.; Voogd, J.M.; Giannoulis, C.


    The Generic Methodology for Verification, Validation and Acceptance (GM-VV) is intended to provide a common generic framework for making formal and well-balanced acceptance decisions on a specific usage of models, simulations and data. GM-VV will provide the international M&S community with a

  1. Validation study of the magnetically self-consistent inner magnetosphere model RAM-SCB (United States)

    Yu, Yiqun; Jordanova, Vania; Zaharia, Sorin; Koller, Josef; Zhang, Jichun; Kistler, Lynn M.


    The validation of the magnetically self-consistent inner magnetospheric model RAM-SCB developed at Los Alamos National Laboratory is presented here. The model consists of two codes: a kinetic ring current-atmosphere interaction model (RAM) and a 3-D equilibrium magnetic field code (SCB). The validation is conducted by simulating two magnetic storm events and then comparing the model results against a variety of satellite in situ observations, including the magnetic field from the Cluster and Polar spacecraft, ion differential flux from the Cluster/CODIF (Composition and Distribution Function) analyzer, and the ground-based SYM-H index. The model prediction of the magnetic field is in good agreement with observations, indicating that the model represents the inner magnetospheric field configuration well. This provides confidence that the RAM-SCB model can be utilized for field line and drift shell tracing, which are needed in radiation belt studies. While the SYM-H index, which reflects the total ring current energy content, is generally reproduced reasonably well by the model using the Weimer electric field model, the modeled ion differential flux clearly depends on the electric field strength, local time, and magnetic activity level. A self-consistent electric field approach may be needed to improve the model performance in this regard.

  2. Transient validation of a RELAP5 model with the DISS facility in once-through operation mode (United States)

    Serrano-Aguilera, J. J.; Valenzuela, L.


    The thermal-hydraulic code RELAP5 has been used to model a solar Direct Steam Generation (DSG) system. Experimental data from the DISS facility located at the Plataforma Solar de Almería are compared to the numerical results of the RELAP5 model in order to validate it. Both the model and the experimental set-up are in once-through operation mode, where no injection or active control is considered. Time-dependent boundary conditions are taken into account. This work is a preliminary study for further research that will be carried out to achieve a thorough validation of RELAP5 models in the context of DSG in line-focus solar collectors.

  3. Groundwater Model Validation for the Project Shoal Area, Corrective Action Unit 447

    Energy Technology Data Exchange (ETDEWEB)

    Hassan, Ahmed [Desert Research Inst. (DRI), Las Vegas, NV (United States). Division of Hydrologic Sciences; Chapman, Jenny [Desert Research Inst. (DRI), Las Vegas, NV (United States). Division of Hydrologic Sciences; Lyles, Brad [Desert Research Inst. (DRI), Las Vegas, NV (United States). Division of Hydrologic Sciences


    Stoller has examined newly collected water level data in multiple wells at the Shoal site. On the basis of these data and information presented in the report, we are currently unable to confirm that the model is successfully validated. Most of our concerns regarding the model stem from two findings: (1) measured water level data do not provide clear evidence of a prevailing lateral flow direction; and (2) the groundwater flow system has been and continues to be in a transient state, which contrasts with assumed steady-state conditions in the model. The results of DRI's model validation efforts and observations made regarding water level behavior are discussed in the following sections. A summary of our conclusions and recommendations for a path forward are also provided in this letter report.

  4. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice


    Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  5. Validity of the Janssen Model for Layered Structures in a Silo

    Directory of Open Access Journals (Sweden)

    Abdul Qadir


    Granular materials are found everywhere, yet they are poorly understood at the microscopic level. The main hindrance is how to connect microscopic properties with macroscopic behavior and propose a rigorous unified theory. One approach is to test existing theoretical models in various configurations. To this end, we have performed experiments with different configurations of granules in a silo to determine the validity of the Janssen model under such arrangements. Two- and four-layered structures of different bead diameters were prepared, and the effective mass at the bottom of the container was measured in each case. The investigation reveals that such layered configurations also follow the Janssen model well. An interesting percolation phenomenon was observed when smaller beads were stacked on larger ones, although the model remained valid. Furthermore, it is demonstrated that the Janssen law holds for larger bead diameters.
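The Janssen screening effect tested above has a standard one-parameter form: the apparent mass at the silo bottom saturates as m_app = m_sat(1 − exp(−m_fill/m_sat)). A minimal numerical sketch of that law (the saturation mass m_sat is a free parameter here, not a value fitted in this study):

```python
import math

def janssen_apparent_mass(m_fill: float, m_sat: float) -> float:
    """Apparent mass at the silo bottom for a given fill mass.

    One-parameter Janssen form: wall friction screens part of the
    weight, so the apparent mass saturates at m_sat for deep fills.
    """
    return m_sat * (1.0 - math.exp(-m_fill / m_sat))

# Shallow fill: the walls carry almost nothing, so m_app ~ m_fill.
print(janssen_apparent_mass(0.1, 50.0))     # close to 0.1
# Deep fill: the apparent mass saturates near m_sat.
print(janssen_apparent_mass(1000.0, 50.0))  # close to 50.0
```

For layered packings such as those in the experiment, the same functional form is fitted per configuration, with m_sat absorbing the bead-size and wall-friction dependence.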

  6. Heat Transfer Modeling and Validation for Optically Thick Alumina Fibrous Insulation (United States)

    Daryabeigi, Kamran


    Combined radiation/conduction heat transfer through unbonded alumina fibrous insulation was modeled using the diffusion approximation for the radiation component of heat transfer in the optically thick insulation. The validity of the heat transfer model was investigated by comparison to previously reported experimental effective thermal conductivity data over the insulation density range of 24 to 96 kg/m³, a pressure range of 0.001 to 750 torr (0.1 to 101.3 × 10³ Pa), and a test-sample hot-side temperature range of 530 to 1360 K. The model was further validated by comparison to thermal conductivity measurements using the transient step heating technique on an insulation sample with a density of 144 kg/m³ over a pressure range of 0.001 to 760 torr and a temperature range of 290 to 1090 K.
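The diffusion approximation used for the radiation component above has a closed form for the radiative conductivity in optically thick media, k_rad = 16 n² σ T³ / (3 β_R), with β_R the Rosseland-mean extinction coefficient. A minimal sketch, with the extinction coefficient and conduction terms chosen only for illustration (not the paper's fitted values):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiative_conductivity(T, beta_R, n=1.0):
    """Rosseland diffusion approximation for optically thick media:
    k_rad = 16 n^2 sigma T^3 / (3 beta_R), beta_R in 1/m."""
    return 16.0 * n**2 * SIGMA * T**3 / (3.0 * beta_R)

def effective_conductivity(T, k_solid, k_gas, beta_R):
    # Simple additive combination of solid conduction, gas conduction and
    # radiation, as in combined-mode insulation models (illustrative only).
    return k_solid + k_gas + radiative_conductivity(T, beta_R)

# Radiation grows as T^3, so it dominates at the hot end of the 530-1360 K range.
print(radiative_conductivity(530.0, beta_R=5000.0))
print(radiative_conductivity(1360.0, beta_R=5000.0))
```

The T³ scaling is why the radiation term matters most at the hot-side temperatures quoted in the abstract, while the gas-conduction term carries the pressure dependence.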

  7. A model validation framework for climate change projection and impact assessment

    DEFF Research Database (Denmark)

    Madsen, Henrik; Refsgaard, Jens C.; Andréassian, Vazken


    a differential split‐sample test using best available proxy data that reflect the expected future conditions at the site being considered. Such proxy data may be obtained from long historical records comprising nonstationarity, paleo data, or controlled experiments. The test can be applied with different...... using proxies of future conditions. In general, a model that has been setup for solving a specific problem at a particular site should be tested in order to document its predictive capability and credibility. In a climate change context such tests, often referred to as model validations tests...... predictions performed under stationary (split‐sample tests) or non‐ stationary conditions (differential split‐ sample test), and if the model is applied at the site where it was calibrated or at a different site (proxy site tests). This model validation scheme has been assessed in relation to use of different...
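The differential split-sample test described above calibrates and validates on climatically contrasting subsets rather than on a chronological split. A minimal sketch of such a partition, using median annual precipitation as a hypothetical splitting covariate (the covariate and values are illustrative, not from the study):

```python
def differential_split_sample(years, precip, calibrate_on="wet"):
    """Partition a record into calibration/validation sets by a climate
    covariate instead of chronologically: calibrate on one regime and
    validate on the other, as a proxy for changed future conditions."""
    median_p = sorted(precip)[len(precip) // 2]
    wet = [y for y, p in zip(years, precip) if p >= median_p]
    dry = [y for y, p in zip(years, precip) if p < median_p]
    return (wet, dry) if calibrate_on == "wet" else (dry, wet)

years = list(range(2000, 2010))
precip = [820, 610, 905, 700, 560, 990, 640, 875, 530, 760]
cal, val = differential_split_sample(years, precip, calibrate_on="dry")
print(cal)  # the drier half of the record, used for calibration
print(val)  # the wetter half, used as a proxy for changed conditions
```

An ordinary split-sample test would instead cut the record in two chronologically; the differential variant deliberately makes the validation period dissimilar to the calibration period.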

  8. Experimental Validation of the Butyl-Rubber Finite Element (FE) Material Model for the Blast-Mitigating Floor Mat (United States)


    Experimental Validation of the Butyl-Rubber Finite Element (FE) Material Model for the Blast-Mitigating Floor Mat, by Masayuki Sakamoto … MD 20783-1138, ARL-SR-0329, August 2015.

  9. Charge Reduction Potentials of Several Refrigerants Based on Experimentally Validated Micro-Channel Heat Exchangers Performance and Charge Model


    Padilla Fuentes, Yadira; Hrnjak, Predrag S.


    This paper presents an experimentally validated simulation model developed to obtain accurate prediction of evaporator microchannel heat exchanger performance and charge. Effects of using various correlations are presented and discussed with focus on serpentine microchannel evaporators. Experiments with propane are used to validate the model. The experimentally validated model is used to compare the charge reduction potential of various refrigerants. The procedure for charge reduction analysis …

  10. Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model

    Energy Technology Data Exchange (ETDEWEB)

    Berk, Alexander [Spectral Sciences, Inc., Burlington, MA (United States); Hawes, Frederick [Spectral Sciences, Inc., Burlington, MA (United States); Fox, Marsha [Spectral Sciences, Inc., Burlington, MA (United States)


    Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, models do not exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. The Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to be developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation field

  11. Validation of Loss and Continuation Rate Models to Support Navy Community Management (United States)


    Statistics and results are discussed for each model. A discussion of the forecast results is limited to Aviation Structural Mechanic (Safety … cumbersome. As a result, NPRST opted to use the EMF as the source data. Data contain observations from 1/1999-9/2009, with observations … SC_IND is a variable in the EMF that denotes whether an individual is considered active duty, reserve, or inactive. The data used for model validation …

  12. Validation of Nonlinear Bipolar Transistor Model by Small-Signal Measurements

    DEFF Research Database (Denmark)

    Vidkjær, Jens; Porra, V.; Zhu, J.


    A new method for the validity analysis of nonlinear transistor models is presented based on DC-and small-signal S-parameter measurements and realistic consideration of the measurement and de-embedding errors and singularities of the small-signal equivalent circuit. As an example, some analysis...... results for an extended Gummel Poon model are presented in the case of a UHF bipolar power transistor....

  13. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)


    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  14. Recent Progress Validating the HADES Model of LLNL's HEAF MicroCT Measurements

    Energy Technology Data Exchange (ETDEWEB)

    White, W. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bond, K. C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lennox, K. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Aufderheide, M. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Seetho, I. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Roberson, G. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    This report compares recent HADES calculations of x-ray linear attenuation coefficients to previous MicroCT measurements made at Lawrence Livermore National Laboratory’s High Energy Applications Facility (HEAF). The chief objective is to investigate what impact recent changes in HADES modeling have on validation results. We find that these changes have no obvious effect on the overall accuracy of the model. Detailed comparisons between recent and previous results are presented.

  15. Stochastic Bus Traffic Modelling and Validation Using Smart Card Fare Collection Data


    Ivanchev, Jordan; Aydt, Heiko; Knoll, Alois


    This paper examines some of the most important contributions regarding bus transit control and optimisation from a modelling perspective. It provides guidelines for designing simulations of this type and points to the need of a qualitatively and quantitatively correct model that is validated and can serve as a benchmark testing platform for the numerous optimisation strategies that researchers present. It examines a case study about a bus line in Singapore and uses smart card data to extract ...

  16. Multivariable prediction model for suspected giant cell arteritis: development and validation (United States)

    Ing, Edsel B; Lahaie Luna, Gabriela; Toren, Andrew; Ing, Royce; Chen, John J; Arora, Nitika; Torun, Nurhan; Jakpor, Otana A; Fraser, J Alexander; Tyndel, Felix J; Sundaram, Arun NE; Liu, Xinyang; Lam, Cindy TY; Patel, Vivek; Weis, Ezekiel; Jordan, David; Gilberg, Steven; Pagnoux, Christian; ten Hove, Martin


    Purpose To develop and validate a diagnostic prediction model for patients with suspected giant cell arteritis (GCA). Methods A retrospective review of records of consecutive adult patients undergoing temporal artery biopsy (TABx) for suspected GCA was conducted at seven university centers. The pathologic diagnosis was considered the final diagnosis. The predictor variables were age, gender, new onset headache, clinical temporal artery abnormality, jaw claudication, ischemic vision loss (VL), diplopia, erythrocyte sedimentation rate (ESR), C-reactive protein (CRP), and platelet level. Multiple imputation was performed for missing data. Logistic regression was used to compare our models with the non-histologic American College of Rheumatology (ACR) GCA classification criteria. Internal validation was performed with 10-fold cross validation and bootstrap techniques. External validation was performed by geographic site. Results There were 530 complete TABx records: 397 were negative and 133 positive for GCA. Age, jaw claudication, VL, platelets, and log CRP were statistically significant predictors of positive TABx, whereas ESR, gender, headache, and temporal artery abnormality were not. The parsimonious model had a cross-validated bootstrap area under the receiver operating characteristic curve (AUROC) of 0.810 (95% CI 0.766–0.854), geographic external validation AUROCs in the range of 0.75–0.85, a Hosmer–Lemeshow calibration p-value of 0.812, sensitivity of 43.6%, and specificity of 95.2%, which outperformed the ACR criteria. Conclusion Our prediction rule with calculator and nomogram aids in the triage of patients with suspected GCA and may decrease the need for TABx in select low-score at-risk subjects. However, misclassification remains a concern. PMID:29200816
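The cross-validated bootstrap AUROC reported above can be illustrated with the rank-based (Mann–Whitney) estimator and a percentile bootstrap. This is a generic sketch of the technique, not the study's code, and the toy labels/scores are invented for demonstration:

```python
import random

def auroc(labels, scores):
    """AUROC via the rank (Mann-Whitney) formulation: the probability
    that a randomly chosen positive case outscores a negative one."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_auroc_ci(labels, scores, n_boot=1000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for the AUROC,
    resampling cases with replacement (generic internal validation)."""
    rng = random.Random(seed)
    n = len(labels)
    stats = []
    while len(stats) < n_boot:
        idx = [rng.randrange(n) for _ in range(n)]
        ys = [labels[i] for i in idx]
        if len(set(ys)) < 2:          # an AUROC needs both classes present
            continue
        stats.append(auroc(ys, [scores[i] for i in idx]))
    stats.sort()
    return stats[int(alpha / 2 * n_boot)], stats[int((1 - alpha / 2) * n_boot) - 1]

y = [1, 1, 1, 0, 0, 0]
s = [0.9, 0.7, 0.4, 0.6, 0.3, 0.2]
print(auroc(y, s))                    # point estimate
print(bootstrap_auroc_ci(y, s, n_boot=200))
```

The resample-skipping guard matters with small cohorts: a bootstrap draw containing only positives (or only negatives) has no defined AUROC.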

  17. Validity of "Hi_Science" as instructional media based-android refer to experiential learning model (United States)

    Qamariah, Jumadi, Senam, Wilujeng, Insih


    Hi_Science is an Android-based instructional medium for learning science on the topics of environmental pollution and global warming. This study aims: (a) to show the display of Hi_Science as it will be applied in junior high school, and (b) to describe the validity of Hi_Science. Hi_Science was created by combining an innovative learning model with current technology: the medium is Android-based and is paired with the experiential learning model as the innovative learning model. Hi_Science adapted the student worksheet by Taufiq (2015), which was rated very good by two expert lecturers and two science teachers (Taufiq, 2015). This worksheet was refined and redeveloped as an Android-based instructional medium that students can use for learning science not only in the classroom but also at home. Therefore, the worksheet turned instructional medium had to be validated again. Hi_Science was validated by two experts, based on assessment of material aspects and media aspects; the data were collected with a media assessment instrument. The results showed that the material aspects obtained an average value of 4.72 with a percentage of agreement of 96.47%, which places Hi_Science in the excellent (very valid) category on the material aspects. The media aspects obtained an average value of 4.53 with a percentage of agreement of 98.70%, which places Hi_Science in the excellent (very valid) category on the media aspects. It was concluded that Hi_Science can be applied as an instructional medium in junior high school.

  18. Psychometric validation of the consensus five-factor model of the Positive and Negative Syndrome Scale. (United States)

    Fong, Ted C T; Ho, Rainbow T H; Wan, Adrian H Y; Siu, Pantha Joey C Y; Au-Yeung, Friendly S W


    The Positive and Negative Syndrome Scale (PANSS) is widely used for clinical assessment of symptoms in schizophrenia. Instead of the traditional pyramidal model, recent literature supports the pentagonal model for the dimensionality of the PANSS. The present study aimed to validate the consensus five-factor model of the PANSS and evaluate its convergent validity. Participants were 146 Chinese chronic schizophrenic patients who completed diagnostic interviews and cognitive assessments. Exploratory structural equation modeling (ESEM) was performed to investigate the dimensionality of the PANSS. Covariates (age, sex, and education level) and concurrent outcomes (perceived stress, memory, daily living functions, and motor deficits) were added in the ESEM model. The results supported the consensus 5-factor underlying structure, which comprised 20 items categorized into positive, negative, excitement, depression, and cognitive factors with acceptable reliability (α=.69-.85) and strong factor loadings (λ=.41-.93). The five factors, especially the cognitive factor, showed evident convergent validity with the covariates and concurrent outcomes. The results support the consensus five-factor structure of the PANSS as a robust measure of symptoms in schizophrenia. Future studies could explore the clinical and practical utility of the consensus five-factor model. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Validation of a FAST Model of the SWAY Prototype Floating Wind Turbine

    Energy Technology Data Exchange (ETDEWEB)

    Koh, J. H. [Nanyang Technological Univ. (Singapore); Ng, E. Y. K. [Nanyang Technological Univ. (Singapore); Robertson, Amy [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jonkman, Jason [National Renewable Energy Lab. (NREL), Golden, CO (United States); Driscoll, Frederick [National Renewable Energy Lab. (NREL), Golden, CO (United States)


    As part of a collaboration of the National Renewable Energy Laboratory (NREL) and SWAY AS, NREL installed scientific wind, wave, and motion measurement equipment on the spar-type 1/6.5th-scale prototype SWAY floating offshore wind system. The equipment enhanced SWAY's data collection and allowed SWAY to verify the concept and NREL to validate a FAST model of the SWAY design in an open-water condition. Nanyang Technological University (NTU), in collaboration with NREL, assisted with the validation. This final report gives an overview of the SWAY prototype and NREL and NTU's efforts to validate a model of the system. The report provides a summary of the different software tools used in the study, the modeling strategies, and the development of a FAST model of the SWAY prototype wind turbine, including justification of the modeling assumptions. Because of uncertainty in system parameters and modeling assumptions due to the complexity of the design, several system properties were tuned to better represent the system and improve the accuracy of the simulations. Calibration was performed using data from a static equilibrium test and free-decay tests.

  20. Validation of periodontitis screening model using sociodemographic, systemic, and molecular information in a Korean population. (United States)

    Kim, Hyun-Duck; Sukhbaatar, Munkhzaya; Shin, Myungseop; Ahn, Yoo-Been; Yoo, Wook-Sung


    This study aims to evaluate and validate a periodontitis screening model that includes sociodemographic, metabolic syndrome (MetS), and molecular information, including gingival crevicular fluid (GCF) matrix metalloproteinases (MMPs) and blood cytokines. The authors selected 506 participants from the Shiwha-Banwol cohort: 322 participants from the 2005 cohort for deriving the screening model and 184 participants from the 2007 cohort for its validation. Periodontitis was assessed by dentists using the community periodontal index. Interleukin (IL)-6, IL-8, and tumor necrosis factor-α in blood and MMP-8, -9, and -13 in GCF were assayed using enzyme-linked immunosorbent assay. MetS was assessed by physicians using physical examination and blood laboratory data. Information about age, sex, income, smoking, and drinking was obtained by interview. Logistic regression analysis was applied to finalize the best-fitting model and validate it using sensitivity, specificity, and c-statistics. The derived model for periodontitis screening had a sensitivity of 0.73, specificity of 0.85, and c-statistic of 0.86 (P … Smoking, drinking, and blood and GCF biomarkers could be useful in screening for periodontitis. A future prospective study is indicated to evaluate this model's ability to predict the occurrence of periodontitis.

  1. Validation of a Novel Traditional Chinese Medicine Pulse Diagnostic Model Using an Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Anson Chui Yan Tang


    In view of the lack of a quantifiable traditional Chinese medicine (TCM) pulse diagnostic model, a novel TCM pulse diagnostic model was introduced to quantify pulse diagnosis. Content validation was performed with a panel of TCM doctors. Criterion validation was tested with essential hypertension; the gold standard was brachial blood pressure measured by a sphygmomanometer. Two hundred and sixty subjects were recruited (139 in the normotensive group and 121 in the hypertensive group). A TCM doctor palpated pulses at the left and right cun, guan, and chi points, and quantified pulse qualities according to eight elements (depth, rate, regularity, width, length, smoothness, stiffness, and strength) on a visual analog scale. An artificial neural network was used to develop a pulse diagnostic model differentiating essential hypertension from normotension. Accuracy, specificity, and sensitivity were compared among various diagnostic models. About 80% accuracy was attained by all models; their specificity and sensitivity varied, ranging from 70% to nearly 90%. This suggests that the novel TCM pulse diagnostic model is valid in terms of both its content and its diagnostic ability.
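The accuracy, specificity, and sensitivity compared among the diagnostic models above follow directly from confusion-matrix counts; a minimal sketch (labels here are hypothetical: 1 = hypertensive, 0 = normotensive):

```python
def diagnostic_metrics(y_true, y_pred):
    """Accuracy, sensitivity and specificity from binary labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
    }

# Toy example: 3 hypertensive and 2 normotensive subjects.
print(diagnostic_metrics([1, 1, 1, 0, 0], [1, 1, 0, 0, 1]))
```

Reporting all three together, as the study does, guards against a classifier that trades one error type for the other while keeping overall accuracy near 80%.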

  2. Development and internal validation of a multivariable model to predict perinatal death in pregnancy hypertension. (United States)

    Payne, Beth A; Groen, Henk; Ukah, U Vivian; Ansermino, J Mark; Bhutta, Zulfiqar; Grobman, William; Hall, David R; Hutcheon, Jennifer A; Magee, Laura A; von Dadelszen, Peter


    To develop and internally validate a prognostic model for perinatal death that could guide community-based antenatal care of women with a hypertensive disorder of pregnancy (HDP) in low-resourced settings as part of a mobile health application. Using data from 1688 women (110 (6.5%) perinatal deaths) admitted to hospital after 32 weeks gestation with an HDP from five low-resourced countries in the miniPIERS prospective cohort, a logistic regression model to predict perinatal death was developed and internally validated. Model discrimination, calibration, and classification accuracy were assessed and compared with use of gestational age alone to determine prognosis. The outcome was stillbirth or neonatal death before hospital discharge. The final model included maternal age; a count of symptoms (0, 1 or ⩾2); and dipstick proteinuria. The area under the receiver operating characteristic curve was 0.75 [95% CI 0.71-0.80]. The model correctly identified 42/110 (38.2%) additional cases as high-risk (probability >15%) of perinatal death compared with use of gestational age alone … and who would benefit from transfer to facility-based care. This model requires external validation and assessment in an implementation study to confirm performance. Copyright © 2015 International Society for the Study of Hypertension in Pregnancy. Published by Elsevier B.V. All rights reserved.
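The final model above combines maternal age, a capped symptom count, and dipstick proteinuria in a logistic regression with a 15% high-risk threshold. A sketch of that structure follows; the coefficients are placeholders invented for illustration, NOT the fitted miniPIERS values:

```python
import math

# Placeholder coefficients -- purely illustrative, not the published model.
COEFS = {"intercept": -4.0, "age": 0.03, "symptom_count": 0.6, "proteinuria": 0.5}

def perinatal_death_risk(age, symptom_count, proteinuria_dipstick):
    """Logistic-model risk: probability = 1 / (1 + exp(-linear_predictor))."""
    lp = (COEFS["intercept"]
          + COEFS["age"] * age
          + COEFS["symptom_count"] * min(symptom_count, 2)  # coded 0, 1, >=2
          + COEFS["proteinuria"] * proteinuria_dipstick)
    return 1.0 / (1.0 + math.exp(-lp))

def is_high_risk(prob, threshold=0.15):
    # The study flags predicted probabilities above 15% as high risk.
    return prob > threshold

p = perinatal_death_risk(age=30, symptom_count=2, proteinuria_dipstick=3)
print(p, is_high_risk(p))
```

The capped symptom count mirrors the 0/1/⩾2 coding in the abstract; the simplicity of the three-variable linear predictor is what makes the rule feasible inside a mobile health application.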

  3. Validation and comparison of geostatistical and spline models for spatial stream networks. (United States)

    Rushworth, A M; Peterson, E E; Ver Hoef, J M; Bowman, A W


    Scientists need appropriate spatial-statistical models to account for the unique features of stream network data. Recent advances provide a growing methodological toolbox for modelling these data, but general-purpose statistical software has only recently emerged, with little information about when to use different approaches. We implemented a simulation study to evaluate and validate geostatistical models that use continuous distances, and penalised spline models that use a finite discrete approximation for stream networks. Data were simulated from the geostatistical model, with performance measured by empirical prediction and fixed effects estimation. We found that both models were comparable in terms of squared error, with a slight advantage for the geostatistical models. Generally, both methods were unbiased and had valid confidence intervals. The most marked differences were found for confidence intervals on fixed-effect parameter estimates, where, for small sample sizes, the spline models underestimated variance. However, the penalised spline models were always more computationally efficient, which may be important for real-time prediction and estimation. Thus, decisions about which method to use must be influenced by the size and format of the data set, in addition to the characteristics of the environmental process and the modelling goals. ©2015 The Authors. Environmetrics published by John Wiley & Sons, Ltd.

  4. Model validation lessons learned: A case study at Oak Ridge National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Ketelle, R.H.; Lee, R.R.; Bownds, J.M. [Oak Ridge National Lab., TN (United States); Rizk, T.A. [North Carolina State Univ., Raleigh, NC (United States)


    A groundwater flow and contaminant transport model validation study was performed to determine the applicability of typical groundwater flow models for performance assessment of proposed waste disposal facilities at Oak Ridge, Tennessee. Standard practice site interpretation and groundwater modeling resulted in inaccurate predictions of contaminant transport at a proposed waste disposal site. The site's complex and heterogeneous geology, the presence of flow dominated by fractured and weathered zones, and the strongly transient character of shallow aquifer recharge and discharge combined to render assumptions of steady-state, homogeneous groundwater flow invalid. The study involved iterative phases of site field investigation and modeling. Subsequent modeling activities focused on generation of a model grid incorporating the observed site geologic heterogeneity, and on establishing and using model boundary conditions based on site data. Time-dependent water-table configurations and fixed-head boundary conditions were used as input to the refined model in simulating groundwater flow at the site.

  5. Pore Level Modeling of Immiscible Drainage: Validation in the Invasion Percolation and DLA Limits

    Energy Technology Data Exchange (ETDEWEB)

    Ferer, M.V.; Bromhal, G.S.; Smith, D.H.


    Motivated by a wide range of applications, from groundwater remediation to carbon dioxide sequestration, and by difficulties in reconciling experiments with previous modeling, we have developed a pore-level model of two-phase flow in porous media. We have attempted to make our model as physical and as reliable as possible, incorporating both capillary effects and viscous effects. After a detailed discussion of the model, we validate it in the very different limits of zero capillary number and zero viscosity ratio. Invasion percolation (IP) models the flow in the limit of zero capillary number; results from our model show detailed agreement with results from IP for small capillary numbers. Diffusion-limited aggregation (DLA) models the flow in the limit of zero viscosity ratio; flow patterns from our model have the same fractal dimension as patterns from DLA for small viscosity ratios.
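Invasion percolation, used above as the zero-capillary-number limit, invades at each step the accessible pore with the lowest capillary entry threshold. A minimal lattice sketch of that rule (random thresholds stand in for pore-scale capillary pressures; grid size and injection point are arbitrary choices, not from the study):

```python
import heapq
import random

def invasion_percolation(nx, ny, steps, seed=0):
    """Minimal invasion-percolation sketch (zero capillary number limit):
    at every step the invader enters the accessible site with the lowest
    random capillary entry threshold, selected via a priority queue."""
    rng = random.Random(seed)
    threshold = {(i, j): rng.random() for i in range(nx) for j in range(ny)}
    start = (0, ny // 2)              # inject at the middle of the left edge
    invaded = {start}
    frontier = []                     # min-heap of (threshold, cell)

    def push_neighbors(cell):
        i, j = cell
        for nb in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if nb in threshold and nb not in invaded:
                heapq.heappush(frontier, (threshold[nb], nb))

    push_neighbors(start)
    while frontier and len(invaded) < steps:
        _, cell = heapq.heappop(frontier)
        if cell in invaded:           # skip stale heap entries
            continue
        invaded.add(cell)
        push_neighbors(cell)
    return invaded

cluster = invasion_percolation(50, 50, steps=200)
print(len(cluster))
```

A full pore-level model like the one in the abstract adds viscous pressure drops on top of this capillary rule; in the zero-capillary-number limit the two should produce statistically equivalent clusters.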

  6. Verification and Validation of a Three-Dimensional Orthotropic Plasticity Constitutive Model Using a Unidirectional Composite

    Directory of Open Access Journals (Sweden)

    Canio Hoffarth


    A three-dimensional constitutive model has been developed for modeling orthotropic composites subject to impact loads. It has three distinct components—a deformation model involving elastic and plastic deformations; a damage model; and a failure model. The model is driven by tabular data that is generated either using laboratory tests or via virtual testing. A unidirectional composite—T800/F3900, commonly used in the aerospace industry, is used in the verification and validation tests. While the failure model is under development, these tests indicate that the implementation of the deformation and damage models in a commercial finite element program, LS-DYNA, is efficient, robust and accurate.

  7. Development of a coupled physical-biological ecosystem model ECOSMO - Part I: Model description and validation for the North Sea

    DEFF Research Database (Denmark)

    Schrum, Corinna; Alekseeva, I.; St. John, Michael


    forcing at temporal scale of 6 h using atmospheric re-analysis data. The model was integrated for 1984 and 1986. The simulated fields for 1984 were used to investigate the annual spatial distribution of phytoplankton and zooplankton biomass and their production in the North Sea. A detailed validation...

  8. A physiological Intensive Control Insulin-Nutrition-Glucose (ICING) model validated in critically ill patients. (United States)

    Lin, Jessica; Razak, Normy N; Pretty, Christopher G; Le Compte, Aaron; Docherty, Paul; Parente, Jacquelyn D; Shaw, Geoffrey M; Hann, Christopher E; Geoffrey Chase, J


    Intensive insulin therapy (IIT) and tight glycaemic control (TGC), particularly in the intensive care unit (ICU), have been the subject of increasing and controversial debate in recent years. Model-based TGC has shown potential in delivering safe and tight glycaemic management, all the while limiting hypoglycaemia. A comprehensive, more physiologically relevant Intensive Control Insulin-Nutrition-Glucose (ICING) model is presented and validated using data from critically ill patients. Two existing glucose-insulin models are reviewed and formed the basis for the ICING model. Model limitations are discussed with respect to relevant physiology, pharmacodynamics and TGC practicality. Model identifiability issues are carefully considered for clinical settings. This article also contains significant reference to relevant physiology and clinical literature, as well as some references to the modeling efforts in this field. Identification of critical constant population parameters was performed in two stages, thus addressing model identifiability issues. Model predictive performance is the primary factor for optimizing population parameter values. The use of population values is necessary due to the limited clinical data available at the bedside in the clinical control scenario. Insulin sensitivity, S(I), the only dynamic, time-varying parameter, is identified hourly for each individual. All population parameters are justified physiologically and with respect to values reported in the clinical literature. A parameter sensitivity study confirms the validity of limiting time-varying parameters to S(I) only, as well as the choices for the population parameters. The ICING model achieves a median fitting error of … The ICING model is suitable for developing model-based insulin therapies, and is capable of delivering real-time model-based TGC with a very tight prediction error range. Finally, a detailed examination and discussion of issues surrounding model-based TGC and existing glucose-insulin models is provided.

  9. Modeling Root Growth, Crop Growth and N Uptake of Winter Wheat Based on SWMS_2D: Model and Validation

    Directory of Open Access Journals (Sweden)

    Dejun Yang

    Full Text Available ABSTRACT Simulations of root growth, crop growth, and N uptake in agro-hydrological models are of significant concern to researchers. SWMS_2D is one of the most widely used physically based hydrological models. It solves the equations governing soil-water movement by the finite element method and has a publicly accessible source code. Incorporating key agricultural components into SWMS_2D is of practical importance, especially for modeling critical cereal crops such as winter wheat. We added root growth, crop growth, and N uptake modules to SWMS_2D. The root growth model had two sub-models, one for root penetration and the other for root length distribution. The crop growth model was adapted from EU-ROTATE_N and linked to the N uptake model. Soil-water limitation, nitrogen limitation, and temperature effects were all considered in dry-weight modeling. Field experiments on winter wheat in Bouwing, the Netherlands, in 1983-1984 were selected for validation. Good agreement was achieved between simulations and measurements, including soil water content at different depths, normalized root length distribution, dry weight, and nitrogen uptake. This indicates that the new modules added to SWMS_2D are robust and reliable. In the future, more rigorous validation should be carried out, ideally under 2D conditions, and attention should be paid to improving some modules, including the one simulating soil N mineralization.
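
    A root penetration sub-model of the kind described above can be sketched minimally; the linear thermal-time form, rate, and depth cap here are illustrative assumptions, not the EU-ROTATE_N equations.

    ```python
    # Root penetration sketch: rooting depth advances with accumulated
    # thermal time (degree-days) until a maximum depth is reached.
    # Constants are illustrative, not calibrated winter-wheat values.
    def root_depth(thermal_time, rate=0.0015, z0=0.05, z_max=1.2):
        """Rooting depth [m]: sowing depth z0 plus linear growth with
        thermal time [degree-days], capped at z_max."""
        return min(z0 + rate * thermal_time, z_max)
    ```

    A full implementation would additionally scale the rate by soil-water and temperature stress factors, as the coupled modules do.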

  10. Signatures of Subacute Potentially Catastrophic Illness in the ICU: Model Development and Validation. (United States)

    Moss, Travis J; Lake, Douglas E; Calland, J Forrest; Enfield, Kyle B; Delos, John B; Fairchild, Karen D; Moorman, J Randall


    Patients in ICUs are susceptible to subacute potentially catastrophic illnesses such as respiratory failure, sepsis, and hemorrhage that present as severe derangements of vital signs. More subtle physiologic signatures may be present before clinical deterioration, when treatment might be more effective. We performed multivariate statistical analyses of bedside physiologic monitoring data to identify such early subclinical signatures of incipient life-threatening illness. We report a study of model development and validation of a retrospective observational cohort using resampling (Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis type 1b internal validation) and a study of model validation using separate data (type 2b internal/external validation). University of Virginia Health System (Charlottesville), a tertiary-care, academic medical center. Critically ill patients consecutively admitted between January 2009 and June 2015 to either the neonatal, surgical/trauma/burn, or medical ICUs with available physiologic monitoring data. None. We analyzed 146 patient-years of vital sign and electrocardiography waveform time series from the bedside monitors of 9,232 ICU admissions. Calculations from 30-minute windows of the physiologic monitoring data were made every 15 minutes. Clinicians identified 1,206 episodes of respiratory failure leading to urgent unplanned intubation, sepsis, or hemorrhage leading to multi-unit transfusions from systematic individual chart reviews. Multivariate models to predict events up to 24 hours prior had internally validated C-statistics of 0.61-0.88. In adults, physiologic signatures of respiratory failure and hemorrhage were distinct from each other but externally consistent across ICUs. Sepsis, on the other hand, demonstrated less distinct and inconsistent signatures. Physiologic signatures of all neonatal illnesses were similar. Subacute potentially catastrophic illnesses in three diverse ICU
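
    The internally validated C-statistics reported above can be illustrated with a minimal sketch of the resampling idea: compute an apparent C-statistic for a risk score and a bootstrap interval. The data are synthetic, and the refitting step of a full TRIPOD 1b optimism correction is elided, so this shows only the resampling loop.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def c_statistic(y, score):
        """C-statistic (AUC): probability a random event case scores
        higher than a random non-event case (ties count half)."""
        pos, neg = score[y == 1], score[y == 0]
        diff = pos[:, None] - neg[None, :]
        return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / (len(pos) * len(neg))

    # Synthetic stand-in for a physiologic risk score: event cases tend
    # to score higher, mimicking a pre-event vital-sign signature.
    n = 500
    y = (rng.random(n) < 0.2).astype(int)
    score = rng.normal(loc=y.astype(float), scale=1.0)

    apparent = c_statistic(y, score)

    # Bootstrap resampling; a full internal validation would refit the
    # model on each resample to estimate optimism.
    boots = []
    for _ in range(200):
        idx = rng.integers(0, n, n)
        if len(set(y[idx])) == 2:          # need both classes present
            boots.append(c_statistic(y[idx], score[idx]))
    ci = (np.percentile(boots, 2.5), np.percentile(boots, 97.5))
    ```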

  11. Methods for validation of the mass distribution of a full body finite element model - biomed 2011. (United States)

    Thompson, A Bradley; Rhyne, Ashley C; Moreno, Daniel P; Gayzik, F Scott; Stitzel, Joel D


    Accurate mass distribution in computational human body models is essential for kinematic and kinetic validation. The purpose of this study was to validate the mass distribution of the 50th percentile male model (M50) developed as part of the Global Human Body Models Consortium (GHBMC) project. The body segment centers of gravity (CGs) of M50 were compared against published data in two ways: using a homogeneous body-surface CAD model, and using a finite element model (FEM). Both the CAD and FEM models were generated from image data collected from the same 50th percentile male subject. Each model was partitioned into 11 segments, using segment planes constructed from bony landmarks acquired from the subject. CGs of the CAD and FEM models were computed using commercially available software packages. Deviations between the literature CGs and the CGs of the FEM and CAD models were 5.8% and 5.6%, respectively, when normalized by a regional characteristic length. Deviation between the FEM and CAD CGs averaged 2.4% when normalized in the same fashion. Unlike the CAD model and the literature, which both assume homogeneous mass distribution, the FEM CG data account for the varying densities of anatomical structures by virtue of the assigned material properties. This analysis validates the CGs determined from each model by comparing them directly to well-known literature studies that rely only on anthropometric landmarks to determine CG locations. The results of this study will help enhance the biofidelity of the GHBMC M50 model.
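
    The CG comparison can be sketched in a few lines: a density-weighted CG (the FEM case, with varying material densities) versus a homogeneous CG (the CAD case), with the deviation normalized by a characteristic length as in the study. All numbers are illustrative, not GHBMC data.

    ```python
    import numpy as np

    # Toy "segment" made of point-mass elements standing in for finite
    # elements with assigned material densities.
    centroids = np.array([[0.0, 0, 0], [0.1, 0, 0], [0.2, 0, 0], [0.3, 0, 0]])
    volumes = np.array([1.0, 1.0, 1.0, 1.0])
    densities = np.array([1000.0, 1000.0, 1800.0, 1800.0])  # soft tissue vs bone (illustrative)

    def center_of_gravity(centroids, volumes, densities):
        m = volumes * densities                       # element masses
        return (centroids * m[:, None]).sum(axis=0) / m.sum()

    cg_fem = center_of_gravity(centroids, volumes, densities)   # heterogeneous (FEM-like)
    cg_cad = center_of_gravity(centroids, volumes, np.ones(4))  # homogeneous (CAD-like)

    # Deviation normalized by a characteristic segment length.
    char_len = 0.3
    deviation_pct = 100 * np.linalg.norm(cg_fem - cg_cad) / char_len
    ```

    Here the denser "bone" elements pull the heterogeneous CG toward one end of the segment, which is exactly the effect the FEM captures and the homogeneous CAD model cannot.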

  12. Dimensional reductions of a cardiac model for effective validation and calibration. (United States)

    Caruel, M; Chabiniok, R; Moireau, P; Lecarpentier, Y; Chapelle, D


    Complex 3D beating heart models are now available, but their complexity makes calibration and validation very difficult tasks. We thus propose a systematic approach of deriving simplified reduced-dimensional models, in "0D"-typically, to represent a cardiac cavity, or several coupled cavities-and in "1D"-to model elongated structures such as muscle samples or myocytes. We apply this approach with an earlier-proposed 3D cardiac model designed to capture length-dependence effects in contraction, which we here complement by an additional modeling component devised to represent length-dependent relaxation. We then present experimental data produced with rat papillary muscle samples when varying preload and afterload conditions, and we achieve some detailed validations of the 1D model with these data, including for the length-dependence effects that are accurately captured. Finally, when running simulations of the 0D model pre-calibrated with the 1D model parameters, we obtain pressure-volume indicators of the left ventricle in good agreement with some important features of cardiac physiology, including the so-called Frank-Starling mechanism, the End-Systolic Pressure-Volume Relationship, as well as varying elastance properties. This integrated multi-dimensional modeling approach thus sheds new light on the relations between the phenomena observed at different scales and at the local versus organ levels.

  13. Empirical Model and Validation of Bar Formation in Sand Bed Channel

    Directory of Open Access Journals (Sweden)

    Tholibon Duratul Ain


    Full Text Available Both present experimental data and previous historical data were used to develop the empirical model. The aim of this study was to establish an empirical model of bar formation from the present experimental data and selected historical data. Statistical techniques using multiple linear regression analysis were employed for empirical model development, and the bar formation parameters were selected based on the works of previous investigators. Model development involved selection of parameters through review of established models, dimensional analysis to check the homogeneity of the model, and statistical analysis. The newly developed empirical model was validated using a different set of historical data from selected laboratory studies. The analysis confirmed that the empirical model derived using the linear regression technique achieves the highest discrepancy ratio accuracy of 90%, with $D/d_s$ and $B/D$ as the most significant parameters promoting bar height formation.
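
    The model-building procedure (multiple linear regression followed by discrepancy-ratio validation) can be sketched generically; the predictors below are synthetic stand-ins for dimensionless groups such as $D/d_s$ and $B/D$, not the paper's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic stand-ins for dimensionless bar-formation predictors.
    n = 80
    X = np.column_stack([np.ones(n),            # intercept
                         rng.uniform(1, 5, n),  # e.g. a D/d_s-like group
                         rng.uniform(2, 10, n)])  # e.g. a B/D-like group
    beta_true = np.array([0.5, 0.8, 0.3])
    y = X @ beta_true + rng.normal(0, 0.2, n)   # "bar height" response

    # Multiple linear regression via ordinary least squares.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    pred = X @ beta

    # Discrepancy ratio: fraction of predictions within a factor band
    # (here 0.5x-2x) of the observed values, a common validation metric
    # in sediment-transport studies.
    ratio = pred / y
    accuracy = np.mean((ratio > 0.5) & (ratio < 2.0))
    ```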

  14. A method for landing gear modeling and simulation with experimental validation (United States)

    Daniels, James N.


    This document presents an approach for modeling and simulating landing gear systems. Specifically, a nonlinear model of an A-6 Intruder main gear is developed, simulated, and validated against static and dynamic test data. The model includes nonlinear effects such as a polytropic gas model, velocity-squared damping, a geometry-governed model for the discharge coefficients, stick-slip friction effects, and a nonlinear tire spring and damping model. An Adams-Moulton predictor-corrector was used to integrate the equations of motion until a discontinuity caused by the stick-slip friction model was reached, at which point a Runge-Kutta routine integrated past the discontinuity and returned the solution to the predictor-corrector. Run times of this software are around 2 minutes per 1 second of simulation under dynamic conditions. To validate the model, engineers at the Aircraft Landing Dynamics facility at NASA Langley Research Center installed one A-6 main gear on a drop carriage and used a hydraulic shaker table to provide simulated runway inputs to the gear. Model parameters were tuned to produce excellent agreement for many cases.
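
    Two of the nonlinear elements named above, the polytropic gas spring and velocity-squared damping, can be sketched in a one-degree-of-freedom drop simulation. For brevity a single classical Runge-Kutta (RK4) integrator is used throughout rather than the Adams-Moulton/Runge-Kutta switching scheme, and all constants are illustrative, not A-6 data.

    ```python
    # One-DOF oleo-pneumatic strut sketch. x is compression stroke [m]
    # (positive = strut closing), v its rate [m/s].
    M = 500.0               # supported mass [kg] (illustrative)
    P0, V0 = 8.0e5, 0.002   # initial gas pressure [Pa] and gas volume [m^3]
    A = 0.005               # pneumatic piston area [m^2]
    N_POLY = 1.3            # polytropic exponent
    C_DAMP = 2.0e4          # v^2 damping coefficient [N s^2/m^2]
    G = 9.81

    def accel(x, v):
        gas = P0 * (V0 / (V0 - A * x)) ** N_POLY * A  # polytropic gas force
        damp = C_DAMP * v * abs(v)                    # v^2 damping, opposes motion
        return G - (gas + damp) / M

    def rk4_step(x, v, dt):
        """Classical RK4 for the second-order system x'' = accel(x, x')."""
        k1x, k1v = v, accel(x, v)
        k2x, k2v = v + 0.5*dt*k1v, accel(x + 0.5*dt*k1x, v + 0.5*dt*k1v)
        k3x, k3v = v + 0.5*dt*k2v, accel(x + 0.5*dt*k2x, v + 0.5*dt*k2v)
        k4x, k4v = v + dt*k3v, accel(x + dt*k3x, v + dt*k3v)
        x += dt/6 * (k1x + 2*k2x + 2*k3x + k4x)
        v += dt/6 * (k1v + 2*k2v + 2*k3v + k4v)
        return x, v

    # Simulated drop: 2 m/s closure speed at touchdown, 2 s of motion.
    x, v, dt = 0.0, 2.0, 1e-3
    xs, vs = [x], [v]
    for _ in range(2000):
        x, v = rk4_step(x, v, dt)
        xs.append(x)
        vs.append(v)
    ```

    The stroke compresses, the v² damping dissipates most of the impact energy, and the strut settles toward the static equilibrium where the gas force balances the weight. A stick-slip friction term would add the force discontinuity that motivated the integrator switching in the original work.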

  15. Modelling of PEM Fuel Cell Performance: Steady-State and Dynamic Experimental Validation

    Directory of Open Access Journals (Sweden)

    Idoia San Martín


    Full Text Available This paper reports on the modelling of a commercial 1.2 kW proton exchange membrane fuel cell (PEMFC), based on interrelated electrical and thermal models. The electrical model proposed is based on the integration of the thermodynamic and electrochemical phenomena taking place in the FC, whilst the thermal model is established from the FC thermal energy balance. The combination of both models makes it possible to predict the FC voltage, based on the current demanded and the ambient temperature. Furthermore, an experimental characterization is conducted and the parameters for the models associated with the FC electrical and thermal performance are obtained. The models are implemented in Matlab Simulink and validated in a number of operating environments, for steady-state and dynamic modes alike. In turn, the FC models are validated in an actual microgrid operating environment, through the series connection of four PEMFCs. The simulations of the models precisely and accurately reproduce the FC electrical and thermal performance.
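
    The structure of such an electrical model, stack voltage predicted from the demanded current, can be illustrated with a textbook steady-state polarization sketch: the Nernst potential minus activation, ohmic, and concentration losses. The loss terms and constants below are generic assumptions, not the parameters identified for the commercial 1.2 kW unit.

    ```python
    import math

    # Steady-state PEMFC polarization sketch (illustrative constants).
    E_NERNST = 1.23   # reversible cell potential [V]
    A_TAFEL = 0.06    # activation (Tafel) slope [V]
    I0 = 0.05         # exchange current [A]
    R_OHM = 0.01      # ohmic resistance [ohm]
    I_LIM = 60.0      # limiting current [A]
    B_CONC = 0.05     # concentration-loss coefficient [V]
    N_CELLS = 47      # cells in series (illustrative stack size)

    def cell_voltage(i):
        """Cell voltage = Nernst potential - activation - ohmic
        - concentration losses, valid for I0 < i < I_LIM."""
        act = A_TAFEL * math.log(i / I0)
        ohm = R_OHM * i
        conc = -B_CONC * math.log(1.0 - i / I_LIM)
        return E_NERNST - act - ohm - conc

    def stack_voltage(i):
        return N_CELLS * cell_voltage(i)
    ```

    A dynamic model would additionally couple the cell temperature (from the thermal energy balance) into these terms, which is what links the paper's two sub-models.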

  16. External validation of a prognostic model for predicting survival of cirrhotic patients with refractory ascites. (United States)

    Guardiola, Jordi; Baliellas, Carme; Xiol, Xavier; Fernandez Esparrach, Glòria; Ginès, Pere; Ventura, Pere; Vazquez, Santiago


    Cirrhotic patients with refractory ascites (RA) have a poor prognosis, although individual survival varies greatly. A model that could predict survival for patients with RA would be helpful in planning treatment. Moreover, in cases of potential liver transplantation, a model with these characteristics would provide the basis for establishing priorities of organ allocation and for selecting patients for a living donor graft. Recently, we developed a model to predict survival of patients with RA. The aim of this study was to establish its generalizability for predicting the survival of patients with RA. The model was validated by assessing its performance in an external cohort of patients with RA included in a multicenter, randomized, controlled trial that compared large-volume paracentesis and peritoneovenous shunt. The actual and model-predicted survival of three risk groups of patients, established according to the model, were compared graphically and by means of the one-sample log-rank test. The model provided a very good fit to the survival data of the three risk groups in the validation cohort. We also found good agreement between the survival predicted from the model and the observed survival when patients treated with peritoneovenous shunt and with paracentesis were considered separately. Our survival model can be used to predict the survival of patients with RA and may be a useful tool in clinical decision making, especially in deciding priority for liver transplantation.
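
    The one-sample log-rank comparison of observed versus model-predicted survival can be sketched as follows: expected events are accrued from the model's cumulative hazard over each patient's follow-up, and compared with the observed count. The constant-hazard predicted survival and all numbers here are illustrative assumptions.

    ```python
    # One-sample log-rank sketch: compare observed deaths in a risk
    # group with the number expected under the model's prediction.
    def one_sample_logrank(observed_events, followup_times, hazard_rate):
        """Expected events E = sum of cumulative hazard H(t_i) = hazard_rate * t_i
        over patients (constant-hazard assumption, purely illustrative).
        Returns (E, chi-square statistic with 1 df)."""
        expected = sum(hazard_rate * t for t in followup_times)
        chi2 = (observed_events - expected) ** 2 / expected
        return expected, chi2

    times = [6, 12, 3, 24, 9, 18, 12, 6]   # months of follow-up (illustrative)
    obs = 5                                 # deaths observed in the group
    expected, chi2 = one_sample_logrank(obs, times, hazard_rate=0.05)
    # chi2 below the 3.84 critical value => no evidence of miscalibration
    ```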

  17. Cultural consensus modeling to measure transactional sex in Swaziland: Scale building and validation. (United States)

    Fielding-Miller, Rebecca; Dunkle, Kristin L; Cooper, Hannah L F; Windle, Michael; Hadley, Craig


    Transactional sex is associated with increased risk of HIV and gender-based violence in southern Africa and around the world. However, the typical quantitative operationalization, "the exchange of gifts or money for sex," can be at odds with the wide array of relationship types and motivations described in qualitative explorations. To build on the strengths of both qualitative and quantitative research streams, we used cultural consensus models to identify distinct models of transactional sex in Swaziland. The process allowed us to build and validate emic scales of transactional sex, while identifying key informants for qualitative interviews within each model to contextualize women's experiences and risk perceptions. We used logistic and multinomial logistic regression models to measure associations with condom use and social status outcomes. Fieldwork was conducted between November 2013 and December 2014 in the Hhohho and Manzini regions. We identified three distinct models of transactional sex in Swaziland based on 124 Swazi women's emic valuation of what they hoped to receive in exchange for sex with their partners. In a clinic-based survey (n = 406), consensus model scales were more sensitive to condom use than the etic definition. Model consonance had distinct effects on social status across the three models. Transactional sex is better measured as an emic spectrum of expectations within a relationship than as an etic binary relationship type. Cultural consensus models allowed us to blend qualitative and quantitative approaches to create an emically valid quantitative scale grounded in qualitative context.
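
    The consensus-model machinery can be illustrated with an informal sketch: factor the respondent-by-respondent agreement matrix and check for a dominant first eigenvalue, the classic evidence of a single shared cultural model. The data are synthetic and this is not the formal Romney-Weller-Batchelder estimation, only the intuition behind it.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Respondents answer binary items; "competence" is each respondent's
    # probability of answering according to the shared model (illustrative).
    n_resp, n_items = 30, 40
    truth = rng.random(n_items) < 0.5              # latent shared answer key
    competence = rng.uniform(0.6, 0.95, n_resp)

    answers = np.empty((n_resp, n_items), dtype=bool)
    for i, c in enumerate(competence):
        follow = rng.random(n_items) < c           # follow the shared model?
        answers[i] = np.where(follow, truth, rng.random(n_items) < 0.5)

    # Respondent-by-respondent agreement (proportion of matching answers).
    match = (answers[:, None, :] == answers[None, :, :]).mean(axis=2)
    np.fill_diagonal(match, np.nan)

    # Eigen-decomposition of the diagonal-smoothed agreement matrix:
    # a first eigenvalue several times the second suggests one culture.
    m = match.copy()
    np.fill_diagonal(m, np.nanmean(match, axis=1))
    evals = np.sort(np.linalg.eigvalsh(m))[::-1]
    ratio = evals[0] / abs(evals[1])

    # Simple aggregate answer key and how well it recovers the truth.
    key = answers.mean(axis=0) > 0.5
    recovered = (key == truth).mean()
    ```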

  18. Development and Validation of the Faceted Inventory of the Five-Factor Model (FI-FFM). (United States)

    Watson, David; Nus, Ericka; Wu, Kevin D


    The Faceted Inventory of the Five-Factor Model (FI-FFM) is a comprehensive hierarchical measure of personality. The FI-FFM was created across five phases of scale development. It includes five facets apiece for neuroticism, extraversion, and conscientiousness; four facets within agreeableness; and three facets for openness. We present reliability and validity data obtained from three samples. The FI-FFM scales are internally consistent and highly stable over 2 weeks (retest rs ranged from .64 to .82, median r = .77). They show strong convergent and discriminant validity vis-à-vis the NEO, the Big Five Inventory, and the Personality Inventory for DSM-5. Moreover, self-ratings on the scales show moderate to strong agreement with corresponding ratings made by informants (rs ranged from .26 to .66, median r = .42). Finally, in joint analyses with the NEO Personality Inventory-3, the FI-FFM neuroticism facet scales display significant incremental validity in predicting indicators of internalizing psychopathology.

  19. Calibration and validation of a model describing complete autotrophic nitrogen removal in a granular SBR system

    DEFF Research Database (Denmark)

    Vangsgaard, Anna Katrine; Mutlu, Ayten Gizem; Gernaey, Krist


    BACKGROUND: A validated model describing the nitritation-anammox process in a granular sequencing batch reactor (SBR) system is an important tool for: a) design of future experiments and b) prediction of process performance during optimization, while applying process control, or during system scale-up. RESULTS: A model was calibrated using a step-wise procedure customized for the specific needs of the system. The important steps in the procedure were initialization, steady-state and dynamic calibration, and validation. A fast and effective initialization approach was developed to approximate pseudo steady-state in the biofilm system. For oxygen mass transfer coefficient (kLa) estimation, long-term data, removal efficiencies, and the stoichiometry of the reactions were used. For the dynamic calibration a pragmatic model fitting approach was used - in this case an iterative Monte Carlo based...
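
    The Monte Carlo based fitting mentioned above can be sketched generically: draw candidate parameter values from a prior range, simulate, and keep the best fit by RMSE. The toy dissolved-oxygen model below (with kLa as the fitted parameter) and all constants are illustrative assumptions, not the calibrated SBR model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy dissolved-oxygen dynamics: dC/dt = kLa*(C_sat - C) - OUR,
    # chosen because kLa estimation figures in the calibration procedure.
    C_SAT, OUR, DT, STEPS = 8.0, 1.5, 0.01, 300

    def simulate(kla, c0=2.0):
        c, out = c0, []
        for _ in range(STEPS):
            c += DT * (kla * (C_SAT - c) - OUR)   # forward Euler
            out.append(c)
        return np.array(out)

    true_kla = 2.0
    data = simulate(true_kla) + rng.normal(0, 0.05, STEPS)  # synthetic "measurements"

    # Monte Carlo search: sample the prior range, keep the best by RMSE.
    candidates = rng.uniform(0.5, 5.0, 500)
    rmse = [np.sqrt(np.mean((simulate(k) - data) ** 2)) for k in candidates]
    best = candidates[int(np.argmin(rmse))]
    ```

    An iterative version would shrink the sampling range around the current best estimate at each round, trading exploration for precision.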

  20. Experimental Validation and Model Verification for a Novel Geometry ICPC Solar Collector

    DEFF Research Database (Denmark)

    Perers, Bengt; Duff, William S.; Daosukho, Jirachote

    A novel geometry ICPC solar collector was developed at the University of Chicago and Colorado State University. A ray tracing model has been designed to investigate the optical performance of both the horizontal and vertical fin versions of this collector. Solar radiation is modeled as discrete... to the desired incident angle of the sun's rays, performance of the novel ICPC solar collector at various specified angles along the transverse and longitudinal evacuated tube directions was experimentally determined. To validate the ray tracing model, transverse and longitudinal performance predictions at the corresponding specified incident angles are compared to the Sandia results. A 100 m2 336 Novel ICPC evacuated tube solar collector array has been in continuous operation at a demonstration project in Sacramento, California since 1998. Data from the initial operation of the array are used to further validate...
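
    The core operation of a ray tracing model like the one described is specular reflection of discrete rays at the reflector surface; a minimal sketch of that single step (not the full collector geometry):

    ```python
    import numpy as np

    # Specular reflection of a ray direction d about a unit surface
    # normal n: r = d - 2*(d.n)*n. A full ray tracer repeats this at
    # each reflector intersection until the ray hits the absorber.
    def reflect(d, n):
        d = np.asarray(d, dtype=float)
        n = np.asarray(n, dtype=float)
        return d - 2.0 * np.dot(d, n) * n

    # A downward-angled ray hitting a horizontal surface bounces upward.
    r = reflect([1.0, -1.0, 0.0], [0.0, 1.0, 0.0])
    ```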