WorldWideScience

Sample records for validated model developed

  1. Developing rural palliative care: validating a conceptual model.

    Science.gov (United States)

    Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary

    2011-01-01

The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services according to four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in seven rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model, which was able to theoretically predict and explain the experiences of the seven rural communities that participated in the study. New themes emerging from the data elaborated existing elements of the model and informed minor revisions. The model was thus validated and slightly revised, as suggested by the data, and confirmed as a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.

  2. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA. [DRAFT] DETC2015-46982: DEVELOPMENT OF A CONSERVATIVE MODEL VALIDATION APPROACH FOR RELIABLE ... obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the ... In Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account ...

  3. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort toward ER process optimization is made using computer models. A number of models have been developed for this purpose, but as validation of these models is incomplete and often problematic, the simulation results from these models are inherently uncertain.

  4. Development and validation of a mass casualty conceptual model.

    Science.gov (United States)

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
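The Delphi stopping rules quoted above (interquartile range of no more than 1 scale point, less than a 15% shift in the response distribution between rounds, and at least 70% agreement) can be expressed as a compact check. The Python sketch below is illustrative only: it assumes 7-point Likert ratings and counts ratings of 5-7 as agreement, a convention the abstract does not specify.

```python
import numpy as np

def delphi_consensus(prev_round, curr_round, agree_threshold=0.70):
    """Check modified-Delphi stopping criteria for one 7-point Likert item.

    Criteria, as in the abstract: IQR of no more than 1 scale point,
    <15% shift in the response distribution between rounds, and >=70%
    agreement (here: ratings of 5-7 counted as agreement, an assumption).
    """
    prev = np.asarray(prev_round)
    curr = np.asarray(curr_round)

    iqr = np.percentile(curr, 75) - np.percentile(curr, 25)

    # Stability: total variation distance between the two rounds'
    # response distributions over the 7 scale points.
    bins = np.arange(1, 9)
    n_prev, _ = np.histogram(prev, bins=bins)
    n_curr, _ = np.histogram(curr, bins=bins)
    shift = 0.5 * np.abs(n_prev / len(prev) - n_curr / len(curr)).sum()

    agreement = np.mean(curr >= 5)

    return iqr <= 1 and shift < 0.15 and agreement >= agree_threshold
```

In an actual Delphi workflow this test would be applied per item after each round, and questioning would continue until every retained item passed.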

  5. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

In this thesis a validation methodology for assessing vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between experimental test data and validity analysis of the simulation model is still lacking. The developed validation paradigm takes a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, even though they are ultimately intended for real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, the simulation is deemed "not invalid". If the simulation model fails to meet the criteria, the model is deemed invalid and model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model and defined its limits of application. The tested simulation model was found to be acceptable, but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis, but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time and cost efficient simulation projects with
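The maneuver-by-maneuver pass/fail logic, with its deliberately cautious "not invalid" verdict, can be sketched as follows. The normalized-RMS-error metric and the 10% threshold are assumed stand-ins for whatever metrics and criteria a given project declares per maneuver.

```python
import numpy as np

def nrmse(measured, simulated):
    """Normalized RMS error between two signals (an illustrative metric)."""
    measured = np.asarray(measured, float)
    simulated = np.asarray(simulated, float)
    return np.sqrt(np.mean((measured - simulated) ** 2)) / np.ptp(measured)

def assess_model(maneuvers, threshold=0.1):
    """Declare the model 'not invalid' only if every test maneuver meets
    its criterion; otherwise it is invalid and iteration is required."""
    verdicts = {name: nrmse(m, s) <= threshold
                for name, (m, s) in maneuvers.items()}
    return ("not invalid" if all(verdicts.values()) else "invalid"), verdicts
```

Keeping the per-maneuver verdicts alongside the overall result supports the follow-up question of conditional validity, i.e. in which dynamical range the model may still be used.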

  6. Validation of HEDR models

    International Nuclear Information System (INIS)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid; rather, the collection of tests together provides a level of confidence that the HEDR models are valid.
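A typical summary statistic for such model-to-measurement comparisons is the geometric-mean predicted-to-observed ratio, which is near 1 for an unbiased model. The sketch below is illustrative only, not one of the HEDR Project's own test metrics.

```python
import numpy as np

def predicted_to_observed(pred, obs):
    """Geometric-mean predicted/observed ratio over paired values.

    A value near 1 indicates no systematic bias; values well above or
    below 1 indicate over- or under-prediction on average.
    """
    pred = np.asarray(pred, float)
    obs = np.asarray(obs, float)
    return float(np.exp(np.mean(np.log(pred / obs))))
```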

  7. Monte Carlo Modelling of Mammograms : Development and Validation

    International Nuclear Information System (INIS)

    Spyrou, G.; Panayiotakis, G.; Bakas, A.; Tzanakos, G.

    1998-01-01

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors)

  8. Monte Carlo Modelling of Mammograms : Development and Validation

    Energy Technology Data Exchange (ETDEWEB)

Spyrou, G; Panayiotakis, G [University of Patras, School of Medicine, Medical Physics Department, 265 00 Patras (Greece); Bakas, A [Technological Educational Institution of Athens, Department of Radiography, 122 10 Athens (Greece); Tzanakos, G [University of Athens, Department of Physics, Division of Nuclear and Particle Physics, 157 71 Athens (Greece)

    1999-12-31

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors) 16 refs, 4 figs
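The core of such a simulation, primary photons attenuated along a path through a compressed-breast phantom containing an inhomogeneity, can be sketched as a Beer-Lambert Monte Carlo. All attenuation coefficients and dimensions below are assumed placeholder values, not the authors' data, and scatter is ignored.

```python
import math
import random

def transmit_fraction(mu_tissue, mu_lesion, thickness_cm, lesion_cm,
                      n=100_000, seed=1):
    """Monte Carlo estimate of primary-photon transmission through a 1-D
    phantom column: (thickness_cm - lesion_cm) of normal tissue plus
    lesion_cm of inhomogeneity, with linear attenuation coefficients
    mu_tissue and mu_lesion (per cm)."""
    rng = random.Random(seed)
    # Beer-Lambert survival probability along the beam axis.
    mu_path = mu_tissue * (thickness_cm - lesion_cm) + mu_lesion * lesion_cm
    p_survive = math.exp(-mu_path)
    survived = sum(1 for _ in range(n) if rng.random() < p_survive)
    return survived / n
```

Comparing columns with and without the lesion term gives the simulated image contrast at that pixel.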

  9. Development and validation of logistic prognostic models by predefined SAS-macros

    Directory of Open Access Journals (Sweden)

    Ziegler, Christoph

    2006-02-01

In medical decision making about therapies or diagnostic procedures, prognosis of the course or severity of a disease plays a relevant role. Besides the clinician's subjective judgment, mathematical models can help provide such prognoses. These models are mostly multivariate regression models; in the case of a dichotomous outcome, the logistic model is the standard. In this paper we describe SAS macros for the development of such a model, for examination of its prognostic performance, and for model validation. The rationale for this approach to prognostic modelling and the description of the macros can only be given briefly in this paper; much more detail is given in. These 14 SAS macros are a tool for setting up the whole process of deriving a prognostic model. In particular, the ability to validate the model with a standardized software tool offers an opportunity that published prognostic models generally do not exploit. This can therefore help in developing new models with good prognostic performance for use in medical applications.
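In the absence of the SAS macros themselves, the two halves of the workflow can be sketched in Python: develop the logistic model on a derivation set, then validate its discrimination (the c-statistic, i.e. area under the ROC curve) on held-out data. This is a minimal illustration of the process, not the macros' implementation.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Development step: fit a logistic model by full-batch gradient
    ascent on the log-likelihood."""
    X = np.column_stack([np.ones(len(X)), X])   # add intercept
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

def c_statistic(w, X, y):
    """Validation step: c-statistic (AUC) of the fitted scores, i.e. the
    probability that a random event case outscores a random non-event."""
    X = np.column_stack([np.ones(len(X)), X])
    s = X @ w
    pos, neg = s[y == 1], s[y == 0]
    return float(np.mean([(p > n) + 0.5 * (p == n) for p in pos for n in neg]))
```

A full validation would also examine calibration and, ideally, use a cohort not involved in model fitting, as the macros described here support.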

  10. Development and validation of a viscoelastic and nonlinear liver model for needle insertion

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, Yo [Waseda University, Consolidated Research Institute for Advanced Science and Medical Care, Shinjuku, Tokyo (Japan); Onishi, Akinori; Hoshi, Takeharu; Kawamura, Kazuya [Waseda University, Graduate School of Science and Engineering, Shinjuku (Japan); Hashizume, Makoto [Kyushu University Hospital, Center for the Integration of Advanced Medicine and Innovative Technology, Fukuoka (Japan); Fujie, Masakatsu G. [Waseda University, Graduate School of Science and Engineering, Faculty of Science and Engineering, Shinjuku (Japan)

    2009-01-15

    The objective of our work is to develop and validate a viscoelastic and nonlinear physical liver model for organ model-based needle insertion, in which the deformation of an organ is estimated and predicted, and the needle path is determined with organ deformation taken into consideration. First, an overview is given of the development of the physical liver model. The material properties of the liver considering viscoelasticity and nonlinearity are modeled based on the measured data collected from a pig's liver. The method to develop the liver model using FEM is also shown. Second, the experimental method to validate the model is explained. Both in vitro and in vivo experiments that made use of a pig's liver were conducted for comparison with the simulation using the model. Results of the in vitro experiment showed that the model reproduces nonlinear and viscoelastic response of displacement at an internally located point with high accuracy. For a force up to 0.45 N, the maximum error is below 1 mm. Results of the in vivo experiment showed that the model reproduces the nonlinear increase of load upon the needle during insertion. Based on these results, the liver model developed and validated in this work reproduces the physical response of a liver in both in vitro and in vivo situations. (orig.)
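As a minimal stand-in for the viscoelastic behavior such a model must capture, consider the creep response of a standard linear solid: a spring of modulus E1 in series with a Kelvin-Voigt element (spring E2 in parallel with a dashpot eta) under constant force. The closed form and the parameter values below are assumptions for illustration; the paper's model is a nonlinear FEM fitted to porcine liver data, not this expression.

```python
import math

def sls_creep(force, e1, e2, eta, t):
    """Displacement of a standard linear solid under constant force:
    instantaneous elastic step force/e1, plus a delayed term that
    relaxes toward force/e2 with time constant eta/e2."""
    tau = eta / e2
    return force / e1 + (force / e2) * (1.0 - math.exp(-t / tau))
```

The instantaneous response is force/e1 and the long-time response is force/e1 + force/e2, so the delayed term is what a purely elastic model would miss.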

  11. Development of Learning Models Based on Problem Solving and Meaningful Learning Standards by Expert Validity for Animal Development Course

    Science.gov (United States)

    Lufri, L.; Fitri, R.; Yogica, R.

    2018-04-01

The purpose of this study is to produce a learning model based on problem solving and meaningful learning standards, validated by expert assessment, for the Animal Development course. This is development research producing a product in the form of a learning model, which consists of two sub-products: the syntax of the learning model and student worksheets. All of these products were standardized through expert validation. The research data are the validity levels of the sub-products, obtained using a questionnaire completed by validators from various fields of expertise (subject matter, learning strategy, and language). Data were analysed using descriptive statistics. The results show that the problem-solving and meaningful-learning model has been produced, and the sub-products declared appropriate by the experts include the syntax of the learning model and the student worksheet.

  12. Research program to develop and validate conceptual models for flow and transport through unsaturated, fractured rock

    International Nuclear Information System (INIS)

    Glass, R.J.; Tidwell, V.C.

    1991-01-01

    As part of the Yucca Mountain Project, our research program to develop and validate conceptual models for flow and transport through unsaturated fractured rock integrates fundamental physical experimentation with conceptual model formulation and mathematical modeling. Our research is directed toward developing and validating macroscopic, continuum-based models and supporting effective property models because of their widespread utility within the context of this project. Success relative to the development and validation of effective property models is predicated on a firm understanding of the basic physics governing flow through fractured media, specifically in the areas of unsaturated flow and transport in a single fracture and fracture-matrix interaction. 43 refs

  13. Research program to develop and validate conceptual models for flow and transport through unsaturated, fractured rock

    International Nuclear Information System (INIS)

    Glass, R.J.; Tidwell, V.C.

    1991-09-01

As part of the Yucca Mountain Project, our research program to develop and validate conceptual models for flow and transport through unsaturated fractured rock integrates fundamental physical experimentation with conceptual model formulation and mathematical modeling. Our research is directed toward developing and validating macroscopic, continuum-based models and supporting effective property models because of their widespread utility within the context of this project. Success relative to the development and validation of effective property models is predicated on a firm understanding of the basic physics governing flow through fractured media, specifically in the areas of unsaturated flow and transport in a single fracture and fracture-matrix interaction

  15. Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model

    Energy Technology Data Exchange (ETDEWEB)

    Berk, Alexander [Spectral Sciences, Inc., Burlington, MA (United States); Hawes, Frederick [Spectral Sciences, Inc., Burlington, MA (United States); Fox, Marsha [Spectral Sciences, Inc., Burlington, MA (United States)

    2016-03-15

    Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, models do not exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. The Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to be developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation field

  16. U-tube steam generator empirical model development and validation using neural networks

    International Nuclear Information System (INIS)

    Parlos, A.G.; Chong, K.T.; Atiya, A.

    1992-01-01

Empirical modeling techniques that use model structures motivated by neural networks research have proven effective in identifying complex process dynamics. A recurrent multilayer perceptron (RMLP) network was developed as a nonlinear state-space model structure, along with a static learning algorithm for estimating the parameters associated with it. The methods developed were demonstrated by identifying two submodels of a U-tube steam generator (UTSG), each valid around an operating power level. A significant drawback of this approach is the long off-line training times required for the development of even a simplified model of a UTSG. Subsequently, a dynamic gradient descent-based learning algorithm was developed as an accelerated alternative for training an RMLP network for use in empirical modeling of power plants. The two main advantages of this learning algorithm are its ability to retain past error gradient information for future use and the two forward passes associated with its implementation. The enhanced learning capabilities provided by the dynamic gradient descent-based learning algorithm were demonstrated via the case study of a simple steam boiler power plant. In this paper, the dynamic gradient descent-based learning algorithm is used for the development and validation of a complete UTSG empirical model.
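The idea of a recurrent network acting as a nonlinear state-space model, x[k+1] = tanh(Wx x[k] + Wu u[k]) with output yhat = Wy x[k+1], can be sketched with a truncated one-step gradient update. This is an illustration of the model structure only; it is not the paper's accelerated dynamic gradient descent algorithm.

```python
import numpy as np

def train_rmlp(u, y, n_hidden=4, lr=0.02, epochs=100, seed=0):
    """Train a tiny recurrent network as a nonlinear state-space model
    with truncated (one-step) gradients; returns weights and the
    per-epoch mean squared prediction error."""
    rng = np.random.default_rng(seed)
    Wx = rng.normal(0.0, 0.3, (n_hidden, n_hidden))
    Wu = rng.normal(0.0, 0.3, (n_hidden, 1))
    Wy = rng.normal(0.0, 0.3, (1, n_hidden))
    mse_history = []
    for _ in range(epochs):
        x = np.zeros((n_hidden, 1))
        sq_err = 0.0
        for k in range(len(u)):
            x_new = np.tanh(Wx @ x + Wu * u[k])   # state update
            e = (Wy @ x_new).item() - y[k]        # prediction error
            sq_err += e * e
            g = (Wy.T * e) * (1.0 - x_new ** 2)   # backprop through tanh
            Wy -= lr * e * x_new.T
            Wx -= lr * g @ x.T
            Wu -= lr * g * u[k]
            x = x_new
        mse_history.append(sq_err / len(u))
    return (Wx, Wu, Wy), mse_history
```

The truncation is exactly the limitation the paper's dynamic algorithm addresses: here the gradient ignores how past states depended on the weights.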

  17. High Turbidity Solis Clear Sky Model: Development and Validation

    Directory of Open Access Journals (Sweden)

    Pierre Ineichen

    2018-03-01

The Solis clear sky model is a spectral scheme based on radiative transfer calculations and the Lambert–Beer relation. Its broadband version is a simplified, fast analytical version; it is limited to broadband aerosol optical depths lower than 0.45, which is a weakness when applied in countries with very high turbidity such as China or India. In order to extend the use of the original simplified version of the model to high turbidity values, we developed a new version of the broadband Solis model based on radiative transfer calculations, valid for turbidity values up to 7, for the three components (global, beam, and diffuse) and for the four aerosol types defined by Shettle and Fenn. A validation against low-turbidity data acquired in Geneva shows slightly better results than the previous version. On data acquired at sites with higher turbidity, the bias stays within ±4% for the beam and global irradiances, with a standard deviation around 5% for clean and stable conditions and around 12% for questionable data and variable sky conditions.
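The functional core of the broadband Solis scheme is a Lambert-Beer attenuation of the form I = I0 * exp(-tau / cos(z)**b), where z is the solar zenith angle and I0, tau, and the exponent b are fitted per component from radiative transfer runs. The sketch below uses placeholder values, not the model's fitted coefficients.

```python
import math

def beam_irradiance(i0, tau, zenith_deg, b=1.0):
    """Lambert-Beer style clear-sky beam irradiance (W/m^2).

    i0, tau and b stand in for the Solis model's fitted parameters,
    which in the real model depend on water vapor, aerosol optical
    depth, aerosol type and altitude.
    """
    mu = math.cos(math.radians(zenith_deg))
    return i0 * math.exp(-tau / mu ** b)
```

The extension described in the abstract amounts to refitting such parameterizations so that they remain accurate for aerosol optical depths up to 7.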

  18. Development and Validation of a 3-Dimensional CFB Furnace Model

    Science.gov (United States)

    Vepsäläinen, Arl; Myöhänen, Karl; Hyppäneni, Timo; Leino, Timo; Tourunen, Antti

At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory- and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of three-dimensional process modeling, providing a chain of knowledge that is fed back into phenomenon research. Knowledge gathered in model validation studies, together with up-to-date parameter databases, is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to modeling of combustion and the formation of char and volatiles for various fuel types under CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window into fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace, and are used together with lateral temperature profiles at the bed and in the upper parts of the furnace to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed by the lower-furnace design and the mixing characteristics of fuel and combustion air, which affect the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents

  19. Development and validation of multivariable predictive model for thromboembolic events in lymphoma patients.

    Science.gov (United States)

    Antic, Darko; Milic, Natasa; Nikolovski, Srdjan; Todorovic, Milena; Bila, Jelena; Djurdjevic, Predrag; Andjelic, Bosko; Djurasinovic, Vladislava; Sretenovic, Aleksandra; Vukovic, Vojin; Jelicic, Jelena; Hayman, Suzanne; Mihaljevic, Biljana

    2016-10-01

Lymphoma patients are at increased risk of thromboembolic events, but thromboprophylaxis in these patients is largely underused. We sought to develop and validate a simple model, based on individual clinical and laboratory patient characteristics, that would designate lymphoma patients at risk for a thromboembolic event. The study population included 1,820 lymphoma patients who were treated in the Lymphoma Departments at the Clinics of Hematology, Clinical Center of Serbia and Clinical Center Kragujevac. The model was developed using data from a derivation cohort (n = 1,236), and further assessed in the validation cohort (n = 584). Sixty-five patients (5.3%) in the derivation cohort and 34 (5.8%) patients in the validation cohort developed thromboembolic events. The variables independently associated with risk for thromboembolism were: previous venous and/or arterial events, mediastinal involvement, BMI > 30 kg/m(2), reduced mobility, extranodal localization, development of neutropenia, and reduced hemoglobin level. For patients classified at risk (intermediate- and high-risk scores), the model produced a negative predictive value of 98.5%, a positive predictive value of 25.1%, sensitivity of 75.4%, and specificity of 87.5%. A high-risk score had a positive predictive value of 65.2%. The diagnostic performance measures retained similar values in the validation cohort. The developed prognostic Thrombosis Lymphoma (ThroLy) score is more specific for lymphoma patients than any other available score targeting thrombosis in cancer patients. Am. J. Hematol. 91:1014-1019, 2016. © 2016 Wiley Periodicals, Inc.
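The reported performance measures are all derived from a 2x2 classification table at a chosen score cutoff. The helper below shows the arithmetic; the counts in the usage example are illustrative, not the study's data.

```python
def diagnostic_performance(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts
    (tp/fp/tn/fn: true/false positives and negatives at one cutoff)."""
    return {
        "sensitivity": tp / (tp + fn),   # events correctly flagged at risk
        "specificity": tn / (tn + fp),   # non-events correctly cleared
        "ppv": tp / (tp + fp),           # flagged patients who had events
        "npv": tn / (tn + fn),           # cleared patients who stayed event-free
    }
```

The pattern in the abstract, very high NPV with modest PPV, is typical of screening scores for rare outcomes: a low-risk score safely rules patients out, while a positive score mostly triggers closer attention rather than a firm prediction.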

  20. Developing a model for hospital inherent safety assessment: Conceptualization and validation.

    Science.gov (United States)

    Yari, Saeed; Akbari, Hesam; Gholami Fesharaki, Mohammad; Khosravizadeh, Omid; Ghasemi, Mohammad; Barsam, Yalda; Akbari, Hamed

    2018-01-01

Paying attention to the safety of hospitals, as the most crucial institutions for providing medical and health services and ones in which facilities, equipment, and human resources are concentrated, is of significant importance. The present research aims at developing a model for assessing hospitals' safety based on principles of inherent safety design. Face validity (30 experts), content validity (20 experts), construct validity (268 examples), convergent validity, and divergent validity were employed to validate the prepared questionnaire; item analysis, Cronbach's alpha, the ICC test (to measure test reliability), and the composite reliability coefficient were used to measure primary reliability. The relationships between variables and factors were confirmed at the 0.05 significance level by confirmatory factor analysis (CFA) and structural equation modeling (SEM) using Smart-PLS. R-squared values above 0.67 and factor loadings above 0.300 indicated a strong fit. Moderation (0.970), simplification (0.959), substitution (0.943), and minimization (0.5008) carried the greatest weights in determining a hospital's inherent safety, in that order. Moderation, simplification, and substitution thus have more weight on inherent safety, while minimization carries the least, which may be due to its definition as minimizing risk.
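Of the reliability statistics listed, Cronbach's alpha is the easiest to illustrate: it compares the sum of the item variances with the variance of the total score. The sketch below uses a toy score matrix, not the study's questionnaire data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)
```

Perfectly correlated items give alpha = 1; uncorrelated items drive it toward 0, which is why alpha is read as internal consistency of a questionnaire scale.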

  1. An integrated development environment for PMESII model authoring, integration, validation, and debugging

    Science.gov (United States)

    Pioch, Nicholas J.; Lofdahl, Corey; Sao Pedro, Michael; Krikeles, Basil; Morley, Liam

    2007-04-01

    To foster shared battlespace awareness in Air Operations Centers supporting the Joint Forces Commander and Joint Force Air Component Commander, BAE Systems is developing a Commander's Model Integration and Simulation Toolkit (CMIST), an Integrated Development Environment (IDE) for model authoring, integration, validation, and debugging. CMIST is built on the versatile Eclipse framework, a widely used open development platform comprised of extensible frameworks that enable development of tools for building, deploying, and managing software. CMIST provides two distinct layers: 1) a Commander's IDE for supporting staff to author models spanning the Political, Military, Economic, Social, Infrastructure, Information (PMESII) taxonomy; integrate multiple native (third-party) models; validate model interfaces and outputs; and debug the integrated models via intuitive controls and time series visualization, and 2) a PMESII IDE for modeling and simulation developers to rapidly incorporate new native simulation tools and models to make them available for use in the Commander's IDE. The PMESII IDE provides shared ontologies and repositories for world state, modeling concepts, and native tool characterization. CMIST includes extensible libraries for 1) reusable data transforms for semantic alignment of native data with the shared ontology, and 2) interaction patterns to synchronize multiple native simulations with disparate modeling paradigms, such as continuous-time system dynamics, agent-based discrete event simulation, and aggregate solution methods such as Monte Carlo sampling over dynamic Bayesian networks. This paper describes the CMIST system architecture, our technical approach to addressing these semantic alignment and synchronization problems, and initial results from integrating Political-Military-Economic models of post-war Iraq spanning multiple modeling paradigms.

  2. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of validation
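The first screen in such a hierarchical decision, whether a sufficient number of stochastic realizations attain satisfactory scores against the field data, can be sketched as below. The score form and both thresholds are illustrative, not the report's five metrics or decision tree.

```python
def sufficient_realizations(scores, pass_score=0.8, required_fraction=0.9):
    """Return True if the fraction of realizations whose conformity score
    meets pass_score is at least required_fraction; otherwise the
    validation process would continue with model refinement."""
    passing = sum(s >= pass_score for s in scores)
    return passing / len(scores) >= required_fraction
```

In the iterative view of validation advocated here, a False result is not a terminal rejection but a trigger for reducing input-parameter uncertainty and re-running the comparison.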

  3. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model

  4. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  5. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  6. Condensation of steam in horizontal pipes: model development and validation

    International Nuclear Information System (INIS)

    Szijarto, R.

    2015-01-01

    This thesis, submitted to the Swiss Federal Institute of Technology (ETH) in Zurich, presents the development and validation of a model for the condensation of steam in horizontal pipes. Condensation models were introduced and developed particularly for application in the emergency cooling system of a Gen-III+ boiling water reactor. Such an emergency cooling system consists of slightly inclined horizontal pipes immersed in a cold water tank. The pipes are connected to the reactor pressure vessel and are responsible for a fast depressurization of the reactor core in the case of an accident. Condensation in horizontal pipes was investigated with both a one-dimensional system code (RELAP5) and three-dimensional computational fluid dynamics software (ANSYS FLUENT). The performance of the RELAP5 code was not sufficient for transient condensation processes; therefore, a mechanistic model was developed and implemented. Four models were tested against the LAOKOON facility, at which direct contact condensation in a horizontal duct was analysed

  7. Developing a model for validation and prediction of bank customer ...

    African Journals Online (AJOL)

    Credit risk is the most important risk faced by banks. The bank's main approaches to reducing credit risk are correct validation using the final status and the validation of model parameters. The high volume of bank reserves and the lost or outstanding facilities of banks indicate the lack of appropriate validation models in the banking network.

  8. Developing and Validating the Socio-Technical Model in Ontology Engineering

    Science.gov (United States)

    Silalahi, Mesnan; Indra Sensuse, Dana; Giri Sucahyo, Yudho; Fadhilah Akmaliah, Izzah; Rahayu, Puji; Cahyaningsih, Elin

    2018-03-01

    This paper describes results from an attempt to develop a model for an ontology engineering methodology and a way to validate the model. The methodology is approached from the point of view of socio-technical systems theory. Qualitative research synthesis, using meta-ethnography, is used to build the model. To ensure the objectivity of the measurement, an inter-rater reliability check was applied using the multi-rater Fleiss kappa. The results accord with the diamond model in socio-technical systems theory, as evidenced by the interdependency of the four socio-technical variables: people, technology, structure and task.
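
    The inter-rater reliability check mentioned above is a standard computation. A minimal implementation of the multi-rater Fleiss kappa (the formula is standard; the example rating counts are invented):

    ```python
    import numpy as np

    def fleiss_kappa(counts):
        """Fleiss' kappa for a (subjects x categories) matrix of rating counts.
        counts[i, j] = number of raters who assigned category j to subject i;
        every row must sum to the same number of raters."""
        counts = np.asarray(counts, dtype=float)
        n_sub, _ = counts.shape
        n_rat = counts[0].sum()                      # raters per subject
        p_j = counts.sum(axis=0) / (n_sub * n_rat)   # category proportions
        P_i = (np.sum(counts**2, axis=1) - n_rat) / (n_rat * (n_rat - 1))
        P_bar, P_e = P_i.mean(), np.sum(p_j**2)
        return (P_bar - P_e) / (1 - P_e)             # chance-corrected agreement

    # Perfect agreement among 4 raters on 3 subjects, 2 categories
    print(fleiss_kappa([[4, 0], [0, 4], [4, 0]]))    # 1.0 (perfect agreement)
    ```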

  9. Validation of models with multivariate output

    International Nuclear Information System (INIS)

    Rebba, Ramesh; Mahadevan, Sankaran

    2006-01-01

    This paper develops metrics for validating computational models with experimental data, considering uncertainties in both. A computational model may generate multiple response quantities, and the validation experiment may yield corresponding measured values. Alternatively, a single response quantity may be predicted and observed at different spatial and temporal points. Model validation in such cases involves comparison of multiple correlated quantities, and multiple univariate comparisons may give conflicting inferences. Therefore, aggregate validation metrics are developed in this paper. Both classical and Bayesian hypothesis testing are investigated for this purpose, using multivariate analysis. Since commonly used statistical significance tests are based on normality assumptions, appropriate transformations are investigated for the case of non-normal data. The methodology is implemented to validate an empirical model for energy dissipation in lap joints under dynamic loading
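
    One common classical aggregate metric for comparing multiple correlated quantities is the Mahalanobis distance checked against a chi-square threshold. The sketch below is a simplified textbook version of such a test, not the paper's specific metrics; it treats the observed sample covariance as known and assumes normality:

    ```python
    import numpy as np
    from scipy import stats

    def multivariate_validation(pred_mean, obs, alpha=0.05):
        """Simplified classical multivariate validation check.
        Rejects the model if the squared Mahalanobis distance between the
        predicted mean vector and the observed sample mean exceeds the
        chi-square threshold for the given significance level."""
        obs = np.asarray(obs, dtype=float)
        n, k = obs.shape
        diff = obs.mean(axis=0) - pred_mean
        cov = np.cov(obs, rowvar=False) / n          # covariance of the mean
        d2 = diff @ np.linalg.solve(cov, diff)       # squared Mahalanobis distance
        return d2, d2 < stats.chi2.ppf(1 - alpha, df=k)

    pred = np.array([0.0, 0.0])
    obs = np.array([[1, 2], [3, 4], [-1, -2], [-3, -4]], dtype=float)
    d2, ok = multivariate_validation(pred, obs)
    print(d2, ok)   # 0.0 True (sample mean coincides with the prediction)
    ```

    A single aggregate decision like this avoids the conflicting inferences that separate univariate comparisons can produce.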

  10. Readmissions and death after ICU discharge: development and validation of two predictive models.

    Directory of Open Access Journals (Sweden)

    Omar Badawi

    Full Text Available INTRODUCTION: Early discharge from the ICU is desirable because it shortens time in the ICU and reduces care costs, but it can also increase the likelihood of ICU readmission and unanticipated post-discharge death if patients are discharged before they are stable. We postulated that, using eICU® Research Institute (eRI) data from >400 ICUs, we could develop robust models predictive of post-discharge death and readmission that may be incorporated into future clinical information systems (CIS) to assist ICU discharge planning. METHODS: Retrospective, multi-center, exploratory cohort study of ICU survivors within the eRI database between 1/1/2007 and 3/31/2011. EXCLUSION CRITERIA: DNR or care limitations at ICU discharge, and discharge to a location external to the hospital. Patients were randomized (2:1) to development and validation cohorts. Multivariable logistic regression was performed on a broad range of variables including: patient demographics, ICU admission diagnosis, admission severity of illness, laboratory values and physiologic variables present during the last 24 hours of the ICU stay. Multiple imputation was used to address missing data. The primary outcomes were the areas under the receiver operator characteristic curves (auROC) in the validation cohorts for the models predicting readmission and death within 48 hours of ICU discharge. RESULTS: 469,976 and 234,987 patients representing 219 hospitals were in the development and validation cohorts, respectively. Early ICU readmission and death were experienced by 2.54% and 0.92% of all patients, respectively. The relationship between predictors and outcomes differed for death versus readmission, justifying the need for separate models. The models for early readmission and death produced auROCs of 0.71 and 0.92, respectively. Both models calibrated well across risk groups. CONCLUSIONS: Our models for death and readmission after ICU discharge showed good to excellent discrimination and good calibration. Although
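
    A multivariable logistic regression scored by auROC on a held-out validation cohort, as in the study design, can be sketched with scikit-learn. The features, coefficients, and data below are synthetic stand-ins, not the eRI variables or the published models:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-ins for last-24-hour ICU features and a readmission flag.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(5000, 4))                  # e.g. age, HR, lactate, LOS
    logit = 1.5 * X[:, 0] - 1.0 * X[:, 2] - 2.0     # hypothetical true model
    y = (rng.random(5000) < 1 / (1 + np.exp(-logit))).astype(int)

    # 2:1 development/validation split, mirroring the study design.
    X_dev, X_val, y_dev, y_val = train_test_split(
        X, y, test_size=1 / 3, random_state=0)
    model = LogisticRegression().fit(X_dev, y_dev)
    auroc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
    print(round(auroc, 2))   # discrimination on the held-out validation cohort
    ```

    In practice, multiple imputation and calibration checks across risk groups would precede reporting the auROC, as the abstract describes.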

  11. Individualized prediction of perineural invasion in colorectal cancer: development and validation of a radiomics prediction model.

    Science.gov (United States)

    Huang, Yanqi; He, Lan; Dong, Di; Yang, Caiyun; Liang, Cuishan; Chen, Xin; Ma, Zelan; Huang, Xiaomei; Yao, Su; Liang, Changhong; Tian, Jie; Liu, Zaiyi

    2018-02-01

    To develop and validate a radiomics prediction model for individualized prediction of perineural invasion (PNI) in colorectal cancer (CRC). After computed tomography (CT) radiomics features extraction, a radiomics signature was constructed in derivation cohort (346 CRC patients). A prediction model was developed to integrate the radiomics signature and clinical candidate predictors [age, sex, tumor location, and carcinoembryonic antigen (CEA) level]. Apparent prediction performance was assessed. After internal validation, independent temporal validation (separate from the cohort used to build the model) was then conducted in 217 CRC patients. The final model was converted to an easy-to-use nomogram. The developed radiomics nomogram that integrated the radiomics signature and CEA level showed good calibration and discrimination performance [Harrell's concordance index (c-index): 0.817; 95% confidence interval (95% CI): 0.811-0.823]. Application of the nomogram in validation cohort gave a comparable calibration and discrimination (c-index: 0.803; 95% CI: 0.794-0.812). Integrating the radiomics signature and CEA level into a radiomics prediction model enables easy and effective risk assessment of PNI in CRC. This stratification of patients according to their PNI status may provide a basis for individualized auxiliary treatment.

  12. Empirical model development and validation with dynamic learning in the recurrent multilayer perception

    International Nuclear Information System (INIS)

    Parlos, A.G.; Chong, K.T.; Atiya, A.F.

    1994-01-01

    A nonlinear multivariable empirical model is developed for a U-tube steam generator using the recurrent multilayer perceptron network as the underlying model structure. The recurrent multilayer perceptron is a dynamic neural network that is very effective in the input-output modeling of complex process systems. A dynamic gradient descent learning algorithm is used to train the recurrent multilayer perceptron, resulting in an order-of-magnitude improvement in convergence speed over static learning algorithms. In developing the U-tube steam generator empirical model, the effects of actuator, process, and sensor noise on the training and testing sets are investigated. Learning and prediction both appear very effective, despite the presence of training and testing set noise, respectively. The recurrent multilayer perceptron appears to learn the deterministic part of a stochastic training set, and it predicts approximately a moving-average response. Extensive model validation studies indicate that the empirical model can substantially generalize (extrapolate), though online learning becomes necessary for tracking transients significantly different from the ones included in the training set and for slowly varying U-tube steam generator dynamics. In view of the satisfactory modeling accuracy and the associated short development time, neural-network-based empirical models in some cases appear to provide a serious alternative to first-principles models. Caution, however, must be exercised, because extensive on-line validation of these models is still warranted
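
    The training idea, gradient descent on one-step-ahead prediction error for a recurrent empirical model, can be illustrated on a deliberately simplified linear recurrence (the paper's recurrent multilayer perceptron is nonlinear and multivariable; all data and parameters here are invented):

    ```python
    import numpy as np

    # Toy plant: y[t] = a*y[t-1] + b*u[t] + noise, a stand-in for the
    # steam generator's input-output dynamics.
    rng = np.random.default_rng(1)
    u = rng.normal(size=500)                  # input sequence (e.g. feedwater demand)
    true_a, true_b = 0.8, 0.5
    y = np.zeros(500)
    for t in range(1, 500):
        y[t] = true_a * y[t - 1] + true_b * u[t] + rng.normal(0, 0.01)

    # Fit the recurrence by gradient descent on one-step prediction error.
    a, b, lr = 0.0, 0.0, 0.05
    for _ in range(200):
        pred = a * y[:-1] + b * u[1:]
        err = pred - y[1:]
        a -= lr * np.mean(err * y[:-1])       # gradient of 0.5*mean(err**2) w.r.t. a
        b -= lr * np.mean(err * u[1:])        # gradient w.r.t. b

    print(round(a, 2), round(b, 2))           # recovered parameters near 0.8 and 0.5
    ```

    The dynamic gradient algorithm in the paper additionally propagates gradients through the network's own recurrent state rather than the measured output.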

  13. Developing and Validating a Survival Prediction Model for NSCLC Patients Through Distributed Learning Across 3 Countries.

    Science.gov (United States)

    Jochems, Arthur; Deist, Timo M; El Naqa, Issam; Kessler, Marc; Mayo, Chuck; Reeves, Jackson; Jolly, Shruti; Matuszak, Martha; Ten Haken, Randall; van Soest, Johan; Oberije, Cary; Faivre-Finn, Corinne; Price, Gareth; de Ruysscher, Dirk; Lambin, Philippe; Dekker, Andre

    2017-10-01

    Tools for survival prediction of non-small cell lung cancer (NSCLC) patients treated with chemoradiation or radiation therapy are of limited quality. In this work, we developed a predictive model of survival at 2 years. The model is based on a large volume of historical patient data and serves as a proof of concept to demonstrate the distributed learning approach. Clinical data from 698 lung cancer patients, treated with curative intent with chemoradiation or radiation therapy alone, were collected and stored at 2 different cancer institutes (559 patients at the Maastro clinic [Netherlands] and 139 at the University of Michigan [United States]). The model was further validated on 196 patients originating from The Christie (United Kingdom). A Bayesian network model was adapted for distributed learning (the animation can be viewed at https://www.youtube.com/watch?v=ZDJFOxpwqEA). Two-year posttreatment survival was chosen as the endpoint. The Maastro clinic cohort data are publicly available at https://www.cancerdata.org/publication/developing-and-validating-survival-prediction-model-nsclc-patients-through-distributed, and the developed models can be found at www.predictcancer.org. Variables included in the final model were T and N category, age, performance status, and total tumor dose. The model has an area under the curve (AUC) of 0.66 on the external validation set and an AUC of 0.62 on 5-fold cross validation. A model based on the T and N category alone performed with an AUC of 0.47 on the validation set, significantly worse than our model. Learning the model in a centralized or distributed fashion yields a minor difference in the probabilities of the conditional probability tables (0.6%); the discriminative performance of the models on the validation set is similar (P=.26). Distributed learning from federated databases allows learning predictive models on data originating from multiple institutions while avoiding many of the data-sharing barriers. We believe that

  14. Developing R&D portfolio business validity simulation model and system.

    Science.gov (United States)

    Yeo, Hyun Jin; Im, Kwang Hyuk

    2015-01-01

    R&D has been recognized as a critical means of gaining competitiveness, by companies and nations alike, through the value it creates, such as patent value and new products. R&D is therefore a burden on decision makers, in that it is hard to decide how much money to invest, how much time to spend, and which technology to develop, and it consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors are not sufficiently considered, because almost all previous studies are technology-oriented evaluations based on a single R&D technology. We earlier proposed an R&D business-aspect evaluation model consisting of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view, and we clarify the default and control parameters in each evaluation module, integrated into one screen, to facilitate the evaluator's business validity work.

  15. Development and validation of a 10-year-old child ligamentous cervical spine finite element model.

    Science.gov (United States)

    Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H

    2013-12-01

    Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive related crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties based on review of literature in conjunction with scaling were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was validated in flexion and extension against the child experimental data in three segments, C0-C2, C4-C5 and C6-C7. Other model predictions were found to be consistent with the experimental responses scaled from adult data. The whole cervical spine model was also validated in tension, flexion and extension against the child experimental data. This study provided methods for developing a child ligamentous cervical spine FE model and to predict soft tissue failures in tension.

  16. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    Science.gov (United States)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2015-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and are also compared against each other. Results show that both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed, and simulations to be completed quickly, compared to more complex multi-dimensional models. These models allow better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and will in the future support NASA mission planning for Stirling-based power systems.

  17. Development and validation of a two-dimensional fast-response flood estimation model

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV OF UTAH

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory-controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimates of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high-water-mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
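
    The upwind differencing idea behind the model is easiest to see on the scalar advection equation rather than the full shallow water system; the sketch below is purely illustrative, with arbitrary grid and CFL choices, and is not the flood model itself:

    ```python
    import numpy as np

    # First-order upwind scheme for u_t + c*u_x = 0: for c > 0, the spatial
    # derivative is taken from the upstream (backward) side, which keeps the
    # explicit scheme stable for c*dt/dx <= 1.
    c, dx, dt = 1.0, 0.01, 0.005                     # CFL number c*dt/dx = 0.5
    x = np.arange(0.0, 1.0, dx)
    u = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)    # square "flood wave"

    for _ in range(80):                              # advance to t = 0.4
        u[1:] = u[1:] - c * dt / dx * (u[1:] - u[:-1])
        u[0] = 0.0                                   # inflow boundary

    print(x[np.argmax(u)])   # pulse centre has advected downstream to ~x = 0.6
    ```

    First-order upwinding is diffusive but cheap, which matches the fast-response goal; the actual model applies the same differencing to the coupled mass and momentum equations.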

  18. Developing R&D Portfolio Business Validity Simulation Model and System

    Directory of Open Access Journals (Sweden)

    Hyun Jin Yeo

    2015-01-01

    Full Text Available R&D has been recognized as a critical means of gaining competitiveness, by companies and nations alike, through the value it creates, such as patent value and new products. R&D is therefore a burden on decision makers, in that it is hard to decide how much money to invest, how much time to spend, and which technology to develop, and it consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors are not sufficiently considered, because almost all previous studies are technology-oriented evaluations based on a single R&D technology. We earlier proposed an R&D business-aspect evaluation model consisting of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view, and we clarify the default and control parameters in each evaluation module, integrated into one screen, to facilitate the evaluator's business validity work.

  19. Developing R&D Portfolio Business Validity Simulation Model and System

    Science.gov (United States)

    2015-01-01

    R&D has been recognized as a critical means of gaining competitiveness, by companies and nations alike, through the value it creates, such as patent value and new products. R&D is therefore a burden on decision makers, in that it is hard to decide how much money to invest, how much time to spend, and which technology to develop, and it consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors are not sufficiently considered, because almost all previous studies are technology-oriented evaluations based on a single R&D technology. We earlier proposed an R&D business-aspect evaluation model consisting of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view, and we clarify the default and control parameters in each evaluation module, integrated into one screen, to facilitate the evaluator's business validity work. PMID:25893209

  20. Development and validation of a multivariate prediction model for patients with acute pancreatitis in Intensive Care Medicine.

    Science.gov (United States)

    Zubia-Olaskoaga, Felix; Maraví-Poma, Enrique; Urreta-Barallobre, Iratxe; Ramírez-Puerta, María-Rosario; Mourelo-Fariña, Mónica; Marcos-Neira, María-Pilar; García-García, Miguel Ángel

    2018-03-01

    Development and validation of a multivariate prediction model for patients with acute pancreatitis (AP) admitted to Intensive Care Units (ICU). A prospective multicenter observational study over a 1-year period in 46 international ICUs (the EPAMI study). Inclusion criteria: adults admitted to an ICU with AP and at least one organ failure. A multivariate prediction model was developed in a development cohort, using the worst data of the ICU stay, multivariate analysis, and simple imputation. The model was then validated in a separate cohort. 374 patients were included (mortality of 28.9%). Variables with statistical significance in the multivariate analysis were age, non-alcoholic and non-biliary etiology, development of shock, development of respiratory failure, need for continuous renal replacement therapy, and intra-abdominal pressure. The model created with these variables presented an AUC of the ROC curve of 0.90 (95% CI 0.81-0.94) in the validation cohort. With this model, AP cases could be classified as low mortality risk (between 2 and 9.5 points, mortality of 1.35%), moderate mortality risk (between 10 and 12.5 points, mortality of 28.92%), and high mortality risk (13 points or more, mortality of 88.37%). Our model presented a better AUC of the ROC curve than APACHE II (0.91 vs 0.80) and SOFA in the first 24 h (0.91 vs 0.79). We developed and validated a multivariate prediction model that can be applied at any moment of the ICU stay, with better discriminatory power than APACHE II and SOFA in the first 24 h. Copyright © 2018 IAP and EPC. Published by Elsevier B.V. All rights reserved.
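
    The point-based risk stratification reported in the abstract translates directly into a small classifier. Only the cut-offs and mortality figures quoted in the abstract are taken from the source; the function itself is an illustrative sketch, not the published scoring sheet:

    ```python
    def mortality_risk_class(points):
        """Classify an AP patient by total model points, using the cut-offs
        reported in the abstract (2-9.5 low, 10-12.5 moderate, >=13 high)."""
        if points < 2:
            raise ValueError("score below the model's reported range")
        if points <= 9.5:
            return "low"       # ~1.35% mortality in the study
        if points <= 12.5:
            return "moderate"  # ~28.92% mortality
        return "high"          # ~88.37% mortality

    print(mortality_risk_class(8))    # low
    print(mortality_risk_class(13))   # high
    ```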

  1. Model-based wear measurements in total knee arthroplasty : development and validation of novel radiographic techniques

    NARCIS (Netherlands)

    IJsseldijk, van E.A.

    2016-01-01

    The primary aim of this work was to develop novel model-based mJSW measurement methods using a 3D reconstruction and compare the accuracy and precision of these methods to conventional mJSW measurement. This thesis contributed to the development, validation and clinical application of model-based

  2. Non-Linear Slosh Damping Model Development and Validation

    Science.gov (United States)

    Yang, H. Q.; West, Jeff

    2015-01-01

    Propellant tank slosh dynamics are typically represented by a mechanical spring-mass-damper model. This mechanical model is then included in the equation of motion of the entire vehicle for Guidance, Navigation and Control (GN&C) analysis. For a partially filled smooth-wall propellant tank, the critical damping based on classical empirical correlation is as low as 0.05%. Because of this low damping, propellant slosh is a potential source of disturbance critical to the stability of launch and space vehicles. It is postulated that the commonly quoted slosh damping is valid only in the linear regime, where the slosh amplitude is small. As the slosh amplitude increases, the critical damping value should also increase. If this nonlinearity can be verified and validated, the slosh stability margin can be significantly improved, and the level of conservatism maintained in the GN&C analysis can be lessened. The purpose of this study is to explore and quantify the dependence of slosh damping on slosh amplitude. Accurately predicting the extremely low damping value of a smooth-wall tank is very challenging for any Computational Fluid Dynamics (CFD) tool: one must resolve the thin boundary layers near the wall and keep numerical damping to a minimum. This computational study demonstrates that, with proper grid resolution, CFD can indeed accurately predict the low-damping physics of smooth walls in the linear regime. Comparisons of extracted damping values with experimental data for different tank sizes show very good agreement. Numerical simulations confirm that slosh damping is indeed a function of slosh amplitude. When the slosh amplitude is low, the damping ratio is essentially constant, which is consistent with the empirical correlation. Once the amplitude reaches a critical value, the damping ratio becomes a linearly increasing function of the slosh amplitude. A follow-on experiment validated the developed nonlinear damping relationship. This discovery can
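
    The reported amplitude dependence, constant damping below a critical amplitude and linear growth above it, can be written as a piecewise function. All parameter values below are illustrative placeholders, not the paper's fitted constants:

    ```python
    def slosh_damping_ratio(amplitude, zeta0=0.0005, a_crit=0.01, slope=0.05):
        """Piecewise nonlinear damping model of the form described in the
        study: constant at the classical low-amplitude value zeta0, then
        linearly increasing once the slosh amplitude exceeds a_crit.
        amplitude and a_crit share the same (arbitrary) length unit."""
        if amplitude <= a_crit:
            return zeta0                          # linear regime
        return zeta0 + slope * (amplitude - a_crit)

    print(slosh_damping_ratio(0.005))  # linear regime: 0.0005 (i.e. 0.05%)
    print(slosh_damping_ratio(0.03))   # nonlinear regime: ~0.0015
    ```

    In a GN&C analysis, a relationship of this shape would replace the single constant damping ratio fed to the spring-mass-damper slosh model.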

  3. Developing and validating a model to predict the success of an IHCS implementation: the Readiness for Implementation Model

    Science.gov (United States)

    Gustafson, David H; Hawkins, Robert P; Brennan, Patricia F; Dinauer, Susan; Johnson, Pauley R; Siegler, Tracy

    2010-01-01

    Objective To develop and validate the Readiness for Implementation Model (RIM). This model predicts a healthcare organization's potential for success in implementing an interactive health communication system (IHCS). The model consists of seven weighted factors, with each factor containing five to seven elements. Design Two decision-analytic approaches, self-explicated and conjoint analysis, were used to measure the weights of the RIM with a sample of 410 experts. The RIM model with weights was then validated in a prospective study of 25 IHCS implementation cases. Measurements Orthogonal main effects design was used to develop 700 conjoint-analysis profiles, which varied on seven factors. Each of the 410 experts rated the importance and desirability of the factors and their levels, as well as a set of 10 different profiles. For the prospective 25-case validation, three time-repeated measures of the RIM scores were collected for comparison with the implementation outcomes. Results Two of the seven factors, ‘organizational motivation’ and ‘meeting user needs,’ were found to be most important in predicting implementation readiness. No statistically significant difference was found in the predictive validity of the two approaches (self-explicated and conjoint analysis). The RIM was a better predictor for the 1-year implementation outcome than the half-year outcome. Limitations The expert sample, the order of the survey tasks, the additive model, and basing the RIM cut-off score on experience are possible limitations of the study. Conclusion The RIM needs to be empirically evaluated in institutions adopting IHCS and sustaining the system in the long term. PMID:20962135
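
    The RIM is described above as an additive model of seven weighted factors, so a readiness score reduces to a normalized weighted sum. The factor names and weights below are illustrative guesses; only the prominence of 'organizational motivation' and 'meeting user needs' comes from the abstract:

    ```python
    def rim_score(factor_scores, weights):
        """Weighted additive readiness score (a sketch of the RIM's additive
        structure; the validated RIM weights and elements are not public in
        this abstract, so these are placeholders)."""
        if set(factor_scores) != set(weights):
            raise ValueError("factors and weights must match")
        total_w = sum(weights.values())
        return sum(weights[f] * factor_scores[f] for f in weights) / total_w

    # Hypothetical weights: the two most predictive factors weighted highest.
    weights = {"organizational motivation": 0.25, "meeting user needs": 0.25,
               "resources": 0.10, "champion": 0.10, "staff skills": 0.10,
               "technical infrastructure": 0.10, "external environment": 0.10}
    scores = {f: 0.8 for f in weights}           # each factor scored on [0, 1]
    print(round(rim_score(scores, weights), 2))  # 0.8
    ```

    In the study, a cut-off on such a score (based on experience) separated implementations predicted to succeed from those predicted to struggle.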

  4. Development and validation of mechanical model for saturated/unsaturated bentonite buffer

    International Nuclear Information System (INIS)

    Yamamoto, S.; Komine, H.; Kato, S.

    2010-01-01

Document available in extended abstract form only. The development and validation of mechanical models for bentonite buffer and backfill materials is one of the important subjects in appropriately evaluating the long-term behaviour and condition of the EBS in radioactive waste disposal. The Barcelona Basic Model (BBM), an extension of the modified Cam-Clay model to unsaturated and expansive soils, has been developed and widely applied to several problems using the coupled THM code Code_Bright. The advantage of the model is that the mechanical characteristics of buffer and backfill materials under both saturated and unsaturated conditions are taken into account, as well as swelling characteristics due to wetting. In this study the BBM is compared with existing experimental data and with a previously developed model in terms of the swelling characteristics of the Japanese bentonite Kunigel-V1, and is validated in terms of consolidation characteristics based on newly performed controlled-suction oedometer tests on Kunigel-V1. Komine et al. (2003) proposed a model (a set of equations) for predicting swelling characteristics based on the diffuse double layer and van der Waals force concepts, and performed numerous swelling deformation tests on bentonite and sand-bentonite mixtures to confirm its applicability. The BBM agrees well with the model proposed by Komine et al. and with the experimental data in terms of swelling characteristics. Suction-dependent compression and swelling indexes are introduced in the BBM, and controlled-suction consolidation (oedometer) tests were performed to confirm the applicability of these indexes to unsaturated bentonite. Compacted bentonite with an initial dry density of 1.0 Mg/m³ was tested; constant suctions of 80 kPa, 280 kPa and 480 kPa were applied and maintained during the consolidation tests. The applicability of the BBM to the consolidation and swelling behaviour of saturated and unsaturated bentonite is discussed.

  5. Development and Validation of a Predictive Model for Functional Outcome After Stroke Rehabilitation: The Maugeri Model.

    Science.gov (United States)

    Scrutinio, Domenico; Lanzillo, Bernardo; Guida, Pietro; Mastropasqua, Filippo; Monitillo, Vincenzo; Pusineri, Monica; Formica, Roberto; Russo, Giovanna; Guarnaschelli, Caterina; Ferretti, Chiara; Calabrese, Gianluigi

    2017-12-01

Prediction of outcome after stroke rehabilitation may help clinicians in decision-making and in planning rehabilitation care. We developed and validated a predictive tool to estimate the probability of achieving improvement in physical functioning (model 1) and a level of independence requiring no more than supervision (model 2) after stroke rehabilitation. The models were derived from 717 patients admitted for stroke rehabilitation. We used multivariable logistic regression analysis to build each model; each model was then prospectively validated in 875 patients. Model 1 included age, time from stroke occurrence to rehabilitation admission, admission motor and cognitive Functional Independence Measure scores, and neglect. Model 2 included age, male gender, time since stroke onset, and admission motor and cognitive Functional Independence Measure scores. Both models demonstrated excellent discrimination. In the derivation cohort, the area under the curve was 0.883 (95% confidence interval, 0.858-0.910) for model 1 and 0.913 (95% confidence interval, 0.884-0.942) for model 2; the Hosmer-Lemeshow χ² was 4.12 (P=0.249) and 1.20 (P=0.754), respectively. In the validation cohort, the area under the curve was 0.866 (95% confidence interval, 0.840-0.892) for model 1 and 0.850 (95% confidence interval, 0.815-0.885) for model 2; the Hosmer-Lemeshow χ² was 8.86 (P=0.115) and 34.50 (P=0.001), respectively. Both improvement in physical functioning (hazard ratio, 0.43; 0.25-0.71; P=0.001) and a level of independence requiring no more than supervision (hazard ratio, 0.32; 0.14-0.68; P=0.004) were independently associated with improved 4-year survival. A calculator is freely available for download at https://goo.gl/fEAp81. This study provides researchers and clinicians with an easy-to-use, accurate, and validated predictive tool for potential application in rehabilitation research and stroke management. © 2017 American Heart Association, Inc.
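
    The discrimination figures quoted above are areas under the ROC curve (c-statistics), which can be computed from predicted probabilities by pairwise comparison of events and non-events. A minimal sketch with made-up data, not the Maugeri cohorts:

```python
def c_statistic(y_true, y_prob):
    """Concordance (AUC): fraction of event/non-event pairs in which the
    event case received the higher predicted probability; ties count 0.5."""
    events = [p for y, p in zip(y_true, y_prob) if y == 1]
    nonevents = [p for y, p in zip(y_true, y_prob) if y == 0]
    pairs = len(events) * len(nonevents)
    concordant = sum((e > n) + 0.5 * (e == n) for e in events for n in nonevents)
    return concordant / pairs

# Toy example: 2 events, 2 non-events; 3 of 4 pairs ranked correctly.
print(c_statistic([1, 1, 0, 0], [0.9, 0.6, 0.7, 0.2]))  # 0.75
```

    A value of 0.5 corresponds to chance ranking and 1.0 to perfect separation, so the reported 0.850-0.913 range indicates strong discrimination.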

  6. Development and validation of a nursing professionalism evaluation model in a career ladder system.

    Science.gov (United States)

    Kim, Yeon Hee; Jung, Young Sun; Min, Ja; Song, Eun Young; Ok, Jung Hui; Lim, Changwon; Kim, Kyunghee; Kim, Ji-Su

    2017-01-01

The clinical ladder system categorizes and rewards degrees of nursing professionalism and is an important human resource tool for nursing management. We developed a model to evaluate the nursing professionalism that determines clinical ladder system levels, and verified its validity. Data were collected using a clinical competence tool developed in this study, together with existing methods such as a nursing professionalism evaluation tool, peer reviews, and face-to-face interviews used to evaluate promotions, and the content was verified in a medical institution. The reliability and the convergent and discriminant validity of the clinical competence evaluation tool were verified using SmartPLS software, and the validity of the model for evaluating overall nursing professionalism was also analyzed. Clinical competence was determined by five dimensions of nursing practice: scientific, technical, ethical, aesthetic, and existential. The structural model explained 66% of the variance. Clinical competence scales, peer reviews, and face-to-face interviews directly determined nursing professionalism levels. The evaluation system can be used to evaluate nurses' professionalism in actual medical institutions from a nursing practice perspective. A conceptual framework for establishing a human resources management system for nurses, and a tool for evaluating nursing professionalism at medical institutions, are provided.

  7. Physical validation issue of the NEPTUNE two-phase modelling: validation plan to be adopted, experimental programs to be set up and associated instrumentation techniques developed

    International Nuclear Information System (INIS)

    Pierre Peturaud; Eric Hervieu

    2005-01-01

Full text of publication follows: A long-term joint development program for the next generation of nuclear reactor simulation tools was launched in 2001 by EDF (Electricite de France) and CEA (Commissariat a l'Energie Atomique). The NEPTUNE Project constitutes the thermal-hydraulics part of this comprehensive program. Alongside the ongoing development of this new two-phase flow software platform, the physical validation of the modelling involved is a crucial issue, whatever the modelling scale, and the present paper deals with this issue. After a brief recap of the NEPTUNE platform, the general validation strategy to be adopted is first clarified by means of three major features: (i) physical validation in close connection with the concerned industrial applications, (ii) a two-step process (as far as possible) that successively focuses on dominant separate models and then assesses the whole modelling capability, (iii) the use of data relevant to the validation aims. Based on this general validation process, a four-step generic work approach has been defined; it includes: (i) a thorough analysis of the concerned industrial applications to identify the key physical phenomena involved and the associated dominant basic models, (ii) an assessment of these models against the available validation information, to specify additional validation needs and define dedicated validation plans, (iii) an inventory and assessment of existing validation data (with respect to the requirements specified in the previous task) to identify the actual needs for new validation data, (iv) the specification of the new experimental programs to be set up to provide the needed new data. This work approach has been applied to the NEPTUNE software, focusing on 8 high-priority industrial applications, and it has resulted in the definition of (i) the validation plan and experimental programs to be set up for the open medium 3D modelling

  8. Assessment for Complex Learning Resources: Development and Validation of an Integrated Model

    Directory of Open Access Journals (Sweden)

    Gudrun Wesiak

    2013-01-01

Full Text Available Today’s e-learning systems face the challenge of providing interactive, personalized environments that support self-regulated learning as well as social collaboration and simulation. At the same time, assessment procedures have to be adapted to the new learning environments by moving from isolated summative assessments to integrated assessment forms. Learning experiences enriched with complex didactic resources, such as virtualized collaborations and serious games, have therefore emerged. In this extension of [1] an integrated model for e-assessment (IMA) is outlined, which incorporates complex learning resources and assessment forms as the main components for the development of an enriched learning experience. For validation, the IMA was presented to a group of experts from the fields of cognitive science, pedagogy, and e-learning. The findings from the validation led to several refinements of the model, mainly concerning the component 'forms of assessment' and the integration of social aspects. Both aspects are accounted for in the revised model, the former by providing a detailed sub-model for assessment forms.

  9. Water loss in table grapes: model development and validation under dynamic storage conditions

    Directory of Open Access Journals (Sweden)

    Ericsem PEREIRA

    2017-09-01

Full Text Available Abstract Water loss is a critical problem affecting the quality of table grapes, and temperature and relative humidity (RH) are the essential drivers of this process. Although mathematical modelling can be applied to assess the impact of constant temperature and RH, variations in storage conditions are commonly encountered in the cold chain. This study proposed a methodology to develop a weight loss model for table grapes and to validate its predictions under the non-constant conditions of a domestic refrigerator. Grapes were maintained under controlled conditions and the weight loss was measured to calibrate the model. The model described the water loss process adequately and the validation tests confirmed its predictive ability. Delayed cooling tests showed that estimated transpiration rates in a subsequent continuous-temperature treatment were not significantly influenced by prior exposure conditions, suggesting that this model may be useful for estimating the weight loss consequences of interruptions in the cold chain.
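
    A transpiration-driven weight loss model of the general kind described can be stepped through a time-varying temperature profile. This is a hedged sketch, not the paper's fitted model: the first-order rate constant and the Q10-style temperature dependence below are illustrative assumptions, and RH effects are omitted:

```python
import math

def simulate_weight(mass0_g, temps_c, k_ref=0.001, t_ref_c=5.0, q10=2.0):
    """Hourly mass trace under a varying storage-temperature profile.

    Assumes dm/dt = -k(T) * m, with k doubling per 10 degC rise (Q10 = 2).
    Each element of temps_c is the temperature (degC) held for one hour.
    """
    mass = mass0_g
    trace = [mass]
    for t in temps_c:
        k = k_ref * q10 ** ((t - t_ref_c) / 10.0)  # per-hour rate at this temp
        mass *= math.exp(-k)  # exact one-hour solution at constant T
        trace.append(mass)
    return trace

# Interrupted cold chain: 24 h at 5 degC, 6 h at 20 degC, back to 5 degC.
profile = [5.0] * 24 + [20.0] * 6 + [5.0] * 24
trace = simulate_weight(500.0, profile)
loss_pct = 100.0 * (1.0 - trace[-1] / trace[0])
```

    Because the rate in each hour depends only on the current conditions, the post-interruption rate is unaffected by prior exposure, mirroring the delayed-cooling result reported above.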

  10. Development and validation of the ENIGMA code for MOX fuel performance modelling

    International Nuclear Information System (INIS)

    Palmer, I.; Rossiter, G.; White, R.J.

    2000-01-01

The ENIGMA fuel performance code has been under development in the UK since the mid-1980s with contributions made by both the fuel vendor (BNFL) and the utility (British Energy). In recent years it has become the principal code for UO2 fuel licensing for both PWR and AGR reactor systems in the UK and has also been used by BNFL in support of overseas UO2 and MOX fuel business. A significant new programme of work has recently been initiated by BNFL to further develop the code specifically for MOX fuel application. Model development is proceeding hand in hand with a major programme of MOX fuel testing and PIE studies, with the objective of producing a fuel modelling code suitable for mechanistic analysis, as well as for licensing applications. This paper gives an overview of the model developments being undertaken and of the experimental data being used to underpin and to validate the code. The paper provides a summary of the code development programme together with specific examples of new models produced. (author)

  11. Development and Validation of a Prediction Model to Estimate Individual Risk of Pancreatic Cancer.

    Science.gov (United States)

    Yu, Ami; Woo, Sang Myung; Joo, Jungnam; Yang, Hye-Ryung; Lee, Woo Jin; Park, Sang-Jae; Nam, Byung-Ho

    2016-01-01

There is no reliable screening tool to identify people with high risk of developing pancreatic cancer, even though pancreatic cancer represents the fifth-leading cause of cancer-related death in Korea. The goal of this study was to develop an individualized risk prediction model that can be used to screen for asymptomatic pancreatic cancer in Korean men and women. Gender-specific risk prediction models for pancreatic cancer were developed using the Cox proportional hazards model based on an 8-year follow-up of a cohort study of 1,289,933 men and 557,701 women in Korea who had biennial examinations in 1996-1997. The performance of the models was evaluated with respect to their discrimination and calibration ability based on the C-statistic and Hosmer-Lemeshow type χ² statistic. A total of 1,634 (0.13%) men and 561 (0.10%) women were newly diagnosed with pancreatic cancer. Age, height, BMI, fasting glucose, urine glucose, smoking, and age at smoking initiation were included in the risk prediction model for men. Height, BMI, fasting glucose, urine glucose, smoking, and drinking habit were included in the risk prediction model for women. Smoking was the most significant risk factor for developing pancreatic cancer in both men and women. The risk prediction model exhibited good discrimination and calibration ability, and in external validation it had excellent prediction ability. Gender-specific risk prediction models for pancreatic cancer were developed and validated for the first time. The prediction models will be a useful tool for detecting high-risk individuals who may benefit from increased surveillance for pancreatic cancer.
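
    A Cox model of the kind described scores each person through its linear predictor (a weighted sum of covariates); the ratio of two people's hazards is the exponential of the difference in their linear predictors. A minimal sketch with hypothetical coefficients; the published gender-specific coefficients are not reproduced here:

```python
import math

def linear_predictor(x, beta):
    """Cox model linear predictor: sum of coefficient * covariate value."""
    return sum(beta[name] * x[name] for name in beta)

def hazard_ratio(x_a, x_b, beta):
    """Relative hazard of individual a versus individual b."""
    return math.exp(linear_predictor(x_a, beta) - linear_predictor(x_b, beta))

# Hypothetical coefficients for two of the model's covariates.
beta = {"current_smoker": 0.7, "high_fasting_glucose": 0.3}
smoker = {"current_smoker": 1, "high_fasting_glucose": 0}
nonsmoker = {"current_smoker": 0, "high_fasting_glucose": 0}
print(round(hazard_ratio(smoker, nonsmoker, beta), 2))  # exp(0.7) ~ 2.01
```

    The same linear predictor, ranked across the cohort, is what the C-statistic evaluates for discrimination.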

  12. Modelling of the activity system - development of an evaluation method for integrated system validation

    International Nuclear Information System (INIS)

    Norros, Leena; Savioja, Paula

    2004-01-01

In this paper we present our recent research, which focuses on creating an evaluation method for human-system interfaces of complex systems. The method is intended for use in the validation of modernised nuclear power plant (NPP) control rooms and of other complex systems with high reliability requirements. The task in validation is to determine whether the joint human-system functions safely and effectively. This question can be operationalized as the selection of relevant operational features and their appropriate acceptance criteria. Thus, there is a need to ensure that the results of the evaluation can be generalized so that they serve the purpose of integrated system validation. The definition of appropriate acceptance criteria provides the basis for judging the appropriateness of the performance of the system. We propose that the operational situations and the acceptance criteria should be defined based on modelling of NPP operation, comprehended as an activity system. We developed a new core-task modelling framework: a formative modelling approach that combines causal, functional and understanding-based explanations of system performance. In this paper we reason how modelling can be used as a medium to determine the validity of the emerging control room system. (Author)

  13. The development and validation of a five-factor model of Sources of Self-Efficacy in clinical nursing education

    NARCIS (Netherlands)

    Gloudemans, H.; Reynaert, W.; Schalk, R.; Braeken, J.

    2013-01-01

Background: The aim of this study is to validate a newly developed nurses' self-efficacy sources inventory. We test the validity of a five-dimensional model of sources of self-efficacy, which we contrast with the traditional four-dimensional model based on Bandura’s theoretical concepts.

  14. The development and validation of a thermal model for the cabin of a vehicle

    International Nuclear Information System (INIS)

    Marcos, David; Pino, Francisco J.; Bordons, Carlos; Guerra, José J.

    2014-01-01

Energy management in modern vehicles is a crucial issue, especially in electric vehicles (EVs) or hybrid vehicles (HVs), in which different energy sources and loads must be considered in the operation of the vehicle. Air conditioning is an important load that must be thoroughly analysed because it can constitute a considerable percentage of the energy demand. In this paper, a simplified dynamic thermal model for the cabin of a vehicle is proposed and validated. The developed model can be used for the design and testing of the heating, ventilation, and air conditioning (HVAC) system of a vehicle and for the study of its effects on the performance and fuel consumption of vehicles such as EVs or HVs. The model is based on theoretical heat transfer, thermal inertia, and radiation treatment equations. The model results obtained from simulations are compared with the cabin air temperature of a vehicle under different conditions; this comparison demonstrates the agreement between the simulated and actual results. - Highlights: •A thermal model of a vehicle cabin with two thermal inertias is developed. •The model is validated with experimental data. •The simulation results and the experimental data fit
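
    The "two thermal inertias" highlighted above suggest a lumped two-node model: one node for the cabin air and one for the interior mass, exchanging heat with each other and with the ambient. A minimal explicit-Euler sketch; every parameter value below is an illustrative assumption, not taken from the paper:

```python
def simulate_cabin(t_out, t_air0, t_mass0, q_hvac_w=0.0, hours=1.0, dt_s=1.0,
                   c_air=50e3, c_mass=400e3, ua_out=40.0, ua_int=100.0):
    """Explicit Euler integration of a two-node cabin thermal model.

    c_air  * dTa/dt = ua_out*(Tout - Ta) + ua_int*(Tm - Ta) + Q_hvac
    c_mass * dTm/dt = ua_int*(Ta - Tm)
    Capacities in J/K, conductances in W/K (illustrative values only).
    """
    t_air, t_mass = t_air0, t_mass0
    for _ in range(int(hours * 3600 / dt_s)):
        dta = (ua_out * (t_out - t_air) + ua_int * (t_mass - t_air) + q_hvac_w) / c_air
        dtm = ua_int * (t_air - t_mass) / c_mass
        t_air += dta * dt_s
        t_mass += dtm * dt_s
    return t_air, t_mass

# Parked vehicle soaking from 20 degC toward a 35 degC ambient, HVAC off.
ta, tm = simulate_cabin(t_out=35.0, t_air0=20.0, t_mass0=20.0, hours=6.0)
```

    The air node responds within minutes while the interior mass lags by hours, which is what motivates keeping two inertias rather than one.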

  15. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

    The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines including social, political natural, physical sciences, and engineering. There is however, a wide range of definitions which give rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework to assess the adequacy of any simulation study. (author)

  16. Acute Kidney Injury in Trauma Patients Admitted to Critical Care: Development and Validation of a Diagnostic Prediction Model.

    Science.gov (United States)

    Haines, Ryan W; Lin, Shih-Pin; Hewson, Russell; Kirwan, Christopher J; Torrance, Hew D; O'Dwyer, Michael J; West, Anita; Brohi, Karim; Pearse, Rupert M; Zolfaghari, Parjam; Prowle, John R

    2018-02-26

Acute kidney injury (AKI) complicating major trauma is associated with increased mortality and morbidity. Traumatic AKI has specific risk factors and a predictable time-course, facilitating diagnostic modelling. In a single-centre, retrospective observational study we developed risk prediction models for AKI after trauma based on data available around intensive care admission. Models predicting AKI were developed using data from 830 patients, using data reduction followed by logistic regression, and were independently validated in a further 564 patients. AKI occurred in 163/830 (19.6%), with 42 (5.1%) receiving renal replacement therapy (RRT). First serum creatinine and phosphate, units of blood transfused in the first 24 h, age, and Charlson score discriminated the need for RRT and AKI early after trauma. For RRT, c-statistics were good to excellent: development 0.92 (0.88-0.96), validation 0.91 (0.86-0.97). For AKI stage 2-3, c-statistics were also good: development 0.81 (0.75-0.88), validation 0.83 (0.74-0.92). The model predicting AKI stage 1-3 performed moderately: development c-statistic 0.77 (0.72-0.81), validation 0.70 (0.64-0.77). Despite good discrimination of the need for RRT, the positive predictive value (PPV) at the optimal cut-off was only 23.0% (13.7-42.7) in development. However, the PPV for the alternative endpoint of RRT and/or death improved to 41.2% (34.8-48.1), highlighting death as a clinically relevant endpoint alongside RRT.
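
    The gap reported between good discrimination and a modest positive predictive value is a direct consequence of low event prevalence, and PPV at a cutoff is easy to check. A sketch with toy data, not the study's cohort:

```python
def ppv(y_true, y_prob, cutoff):
    """Positive predictive value: fraction of flagged cases that are events."""
    flagged = [y for y, p in zip(y_true, y_prob) if p >= cutoff]
    if not flagged:
        raise ValueError("no cases at or above the cutoff")
    return sum(flagged) / len(flagged)

# Toy data: 2 events in 10 patients; ranking is decent, prevalence is low.
y_true = [1, 0, 0, 1, 0, 0, 0, 0, 0, 0]
y_prob = [0.9, 0.8, 0.7, 0.3, 0.2, 0.2, 0.1, 0.1, 0.1, 0.1]
print(round(ppv(y_true, y_prob, cutoff=0.5), 3))  # 3 flagged, 1 event -> 0.333
```

    Broadening the endpoint (here, RRT and/or death) raises the number of true events among the flagged cases, which is why the reported PPV improved from 23.0% to 41.2%.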

  17. The development and validation of a five factor model of sources of self-efficacy in clinical nursing education

    NARCIS (Netherlands)

    Prof. Dr. Rene Schalk; dr. Wouter Reynaert; Dr. Johan Braeken; Drs. Henk Gloudemans

    2012-01-01

Background: The aim of this study is to validate a newly developed nurses' self-efficacy sources inventory. We test the validity of a five-dimensional model of sources of self-efficacy, which we contrast with the traditional four-dimensional model based on Bandura's theoretical concepts.

  18. The development and validation of a five-factor model of sources of self-efficacy in clinical nursing education

    NARCIS (Netherlands)

    Gloudemans, H.; Schalk, R.; Reynaert, W.M.; Braeken, J.

    2013-01-01

Background: The aim of this study is to validate a newly developed nurses' self-efficacy sources inventory. We test the validity of a five-dimensional model of sources of self-efficacy, which we contrast with the traditional four-dimensional model based on Bandura’s theoretical concepts.

  19. Development-based Trust: Proposing and Validating a New Trust Measurement Model for Buyer-Seller Relationships

    Directory of Open Access Journals (Sweden)

    José Mauro da Costa Hernandez

    2010-04-01

Full Text Available This study proposes and validates a trust measurement model for buyer-seller relationships. Termed development-based trust, the model encompasses three dimensions of trust: calculus-based, knowledge-based and identification-based. In addition to recognizing that trust is a multidimensional construct, the model also assumes that trust can evolve to take on a different character depending on the stage of the relationship. In order to test the proposed model and compare it to the characteristic-based trust measurement model, the measure most frequently used in the buyer-seller relationship literature, data were collected from 238 clients of an IT product wholesaler. The results show that the scales are valid and reliable, and that the proposed development-based trust measurement model is superior to the characteristic-based model in its ability to explain certain variables of interest in buyer-seller relationships (long-term relationship orientation, information sharing, behavioral loyalty and future intentions. Implications for practice, limitations and suggestions for future studies are discussed.

  20. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  1. Assessment of the Value, Impact, and Validity of the Jobs and Economic Development Impacts (JEDI) Suite of Models

    Energy Technology Data Exchange (ETDEWEB)

    Billman, L.; Keyser, D.

    2013-08-01

    The Jobs and Economic Development Impacts (JEDI) models, developed by the National Renewable Energy Laboratory (NREL) for the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), use input-output methodology to estimate gross (not net) jobs and economic impacts of building and operating selected types of renewable electricity generation and fuel plants. This analysis provides the DOE with an assessment of the value, impact, and validity of the JEDI suite of models. While the models produce estimates of jobs, earnings, and economic output, this analysis focuses only on jobs estimates. This validation report includes an introduction to JEDI models, an analysis of the value and impact of the JEDI models, and an analysis of the validity of job estimates generated by JEDI model through comparison to other modeled estimates and comparison to empirical, observed jobs data as reported or estimated for a commercial project, a state, or a region.

  2. Development and prospective validation of a model estimating risk of readmission in cancer patients.

    Science.gov (United States)

    Schmidt, Carl R; Hefner, Jennifer; McAlearney, Ann S; Graham, Lisa; Johnson, Kristen; Moffatt-Bruce, Susan; Huerta, Timothy; Pawlik, Timothy M; White, Susan

    2018-02-26

    Hospital readmissions among cancer patients are common. While several models estimating readmission risk exist, models specific for cancer patients are lacking. A logistic regression model estimating risk of unplanned 30-day readmission was developed using inpatient admission data from a 2-year period (n = 18 782) at a tertiary cancer hospital. Readmission risk estimates derived from the model were then calculated prospectively over a 10-month period (n = 8616 admissions) and compared with actual incidence of readmission. There were 2478 (13.2%) unplanned readmissions. Model factors associated with readmission included: emergency department visit within 30 days, >1 admission within 60 days, non-surgical admission, solid malignancy, gastrointestinal cancer, emergency admission, length of stay >5 days, abnormal sodium, hemoglobin, or white blood cell count. The c-statistic for the model was 0.70. During the 10-month prospective evaluation, estimates of readmission from the model were associated with higher actual readmission incidence from 20.7% for the highest risk category to 9.6% for the lowest. An unplanned readmission risk model developed specifically for cancer patients performs well when validated prospectively. The specificity of the model for cancer patients, EMR incorporation, and prospective validation justify use of the model in future studies designed to reduce and prevent readmissions. © 2018 Wiley Periodicals, Inc.

  3. A proposed best practice model validation framework for banks

    Directory of Open Access Journals (Sweden)

Pieter J. (Riaan) de Jongh

    2017-06-01

Full Text Available Background: With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. The credit crisis of 2008–2009 provoked added concern about the use of models in finance. Measuring and managing model risk has subsequently come under scrutiny from regulators, supervisors, banks and other financial institutions. Regulatory guidance indicates that meticulous monitoring of all phases of model development and implementation is required to mitigate this risk. Considerable resources must be mobilised for this purpose. The exercise must embrace model development, assembly, implementation, validation and effective governance. Setting: Model validation practices are generally patchy, disparate and sometimes contradictory, and although the Basel Accord and some regulatory authorities have attempted to establish guiding principles, no definite set of global standards exists. Aim: To assess the available literature for the best validation practices. Methods: This comprehensive literature study provided a background to the complexities of effective model management and focussed on model validation as a component of model risk management. Results: We propose a coherent ‘best practice’ framework for model validation. Scorecard tools are also presented to evaluate whether the proposed best practice model validation framework has been adequately assembled and implemented. Conclusion: The proposed best practice model validation framework is designed to assist firms in the construction of an effective, robust and fully compliant model validation programme, and comprises three principal elements: model validation governance, policy and process.

  4. From control to causation: Validating a 'complex systems model' of running-related injury development and prevention.

    Science.gov (United States)

    Hulme, A; Salmon, P M; Nielsen, R O; Read, G J M; Finch, C F

    2017-11-01

There is a need for an ecological and complex-systems approach to better understand the development and prevention of running-related injury (RRI). In a previous article, we proposed a prototype model of the Australian recreational distance running system based on the Systems Theoretic Accident Model and Processes (STAMP) method. That model included the influence of political, organisational, managerial, and sociocultural determinants alongside individual-level factors in relation to RRI development. The purpose of this study was to validate that prototype model by drawing on the expertise of both systems thinking and distance running experts. This study used a modified Delphi technique involving a series of online surveys (December 2016 to March 2017). The initial survey was divided into four sections containing a total of seven questions pertaining to different features of the prototype model. Consensus about the validity of the prototype model was reached when the number of experts who agreed or disagreed with a survey statement was ≥75% of the total number of respondents. Two Delphi rounds were needed to validate the prototype model. Of the 51 experts initially contacted, 50.9% (n = 26) completed the first round of the Delphi, and 92.3% (n = 24) of those participated in the second. Most of the 24 full participants considered themselves to be running experts (66.7%), and approximately a third indicated expertise as systems thinkers (33.3%). After the second round, 91.7% of the experts agreed that the prototype model was a valid description of the Australian distance running system. This is the first study to formally examine the development and prevention of RRI from an ecological and complex-systems perspective. The validated model of the Australian distance running system facilitates theoretical advancement in terms of identifying practical system

  5. The Johns Hopkins model of psychological first aid (RAPID-PFA): curriculum development and content validation.

    Science.gov (United States)

    Everly, George S; Barnett, Daniel J; Links, Jonathan M

    2012-01-01

There appears to be virtually universal endorsement of the need for and value of acute "psychological first aid" (PFA) in the wake of trauma and disasters. In this paper, we describe the development of the curriculum for the Johns Hopkins RAPID-PFA model of psychological first aid. We employed an adaptation of the basic framework for the development of a clinical science as recommended by Millon, which entailed historical review, theoretical development, and content validation. The process of content validation of the RAPID-PFA curriculum entailed the assessment of attitudes (confidence in the application of PFA interventions, preparedness in the application of PFA); knowledge related to the application of immediate mental health interventions; and behavior (the ability to recognize clinical markers in the field, as assessed via a videotape recognition exercise). Results of the content validation phase suggest the six-hour RAPID-PFA curriculum, initially based upon structural modeling analysis, can improve confidence in the application of PFA interventions, preparedness in the application of PFA, knowledge related to the application of immediate mental health interventions, and the ability to recognize clinical markers in the field as assessed via a videotape recognition exercise.

  6. Development and validation of a chronic copper biotic ligand model for Ceriodaphnia dubia

    International Nuclear Information System (INIS)

    Schwartz, Melissa L.; Vigneault, Bernard

    2007-01-01

    A biotic ligand model (BLM) to predict chronic Cu toxicity to Ceriodaphnia dubia was developed and tested. The effects of cationic competition, pH and natural organic matter complexation of Cu were examined to develop the model. There was no effect of cationic competition with increasing Ca and Na concentrations in our exposures. We did, however, see a significant regression of decreasing toxicity (measured as the IC25, the concentration at which there was a 25% inhibition of reproduction) as Mg concentration increased. However, taking into account the actual variability of the IC25, and since the relative increase in IC25 due to additional Mg was small (1.5-fold), Mg competition was not included in the model. Changes in pH had a significant effect on the Cu IC25, which is consistent with proton competition, as often suggested for acute BLMs. Finally, natural organic matter (NOM) added to exposures resulted in significant decreases in toxicity. Therefore, our predictive model for chronic Cu toxicity to C. dubia includes the effects of pH and NOM complexation. The model was validated with Cu IC25 data generated in six natural surface waters collected from across Canada. Using WHAM VI, we calculated Cu speciation in each natural water and, using our model, generated 'predicted' IC25 data. We successfully predicted all Cu IC25 values within a factor of 3 for the six waters used for validation.

  7. Development and Validation of a Mathematical Model for Olive Oil Oxidation

    Science.gov (United States)

    Rahmouni, K.; Bouhafa, H.; Hamdi, S.

    2009-03-01

    A mathematical model describing the stability, or susceptibility to oxidation, of extra virgin olive oil has been developed. The model was solved by an iterative finite-difference method and validated against experimental data on extra virgin olive oil (EVOO) oxidation. EVOO stability was tested using a Rancimat at four temperatures (60, 70, 80 and 90 °C) until peroxide accumulation reached 20 meq/kg. Peroxide formation was relatively slow and followed zero-order kinetics, with linear regression coefficients ranging from 0.98 to 0.99. The mathematical model was used to predict the shelf life of bulk conditioned olive oil. The model describes peroxide accumulation inside a container in excess of oxygen as a function of time at various positions from the air/oil interface. Good correlations were obtained between theoretical and experimental values.
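The zero-order kinetics reported above imply a simple shelf-life estimate: if the peroxide value grows linearly, PV(t) = PV0 + k·t, the time to the 20 meq/kg endpoint is (20 − PV0)/k. A sketch under that assumption (the rate constant and initial value are illustrative, not the paper's fitted values; the full model is a finite-difference solution that also resolves position in the container):

```python
def peroxide_value(t_hours, pv0, k):
    """Zero-order peroxide accumulation: PV(t) = PV0 + k * t."""
    return pv0 + k * t_hours

def shelf_life_hours(pv0, k, pv_limit=20.0):
    """Time for the peroxide value to reach the 20 meq/kg endpoint."""
    return (pv_limit - pv0) / k

# Illustrative: initial PV 2 meq/kg, rate 0.5 meq/kg per hour.
print(shelf_life_hours(2.0, 0.5))
```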

  8. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  9. Development and validation of a prediction model for loss of physical function in elderly hemodialysis patients.

    Science.gov (United States)

    Fukuma, Shingo; Shimizu, Sayaka; Shintani, Ayumi; Kamitani, Tsukasa; Akizawa, Tadao; Fukuhara, Shunichi

    2017-09-05

    Among aging hemodialysis patients, loss of physical function has become a major issue. We developed and validated a model for predicting loss of physical function among elderly hemodialysis patients. We conducted a cohort study involving maintenance hemodialysis patients ≥65 years of age from the Dialysis Outcomes and Practice Patterns Study in Japan. The derivation cohort included 593 early-phase (1996-2004) patients and the temporal validation cohort included 447 late-phase (2005-2012) patients. The main outcome was the incidence of loss of physical function, defined as the 12-item Short Form Health Survey physical function score decreasing to 0 within a year. Using backward stepwise logistic regression by Akaike's Information Criterion, six predictors (age, gender, dementia, mental health, moderate activity and ascending stairs) were selected for the final model. Points were assigned based on the regression coefficients and the total score was calculated by summing the points for each predictor. In total, 65 (11.0%) and 53 (11.9%) hemodialysis patients lost their physical function within 1 year in the derivation and validation cohorts, respectively. The model had good predictive performance as quantified by both discrimination and calibration. The proportion of patients losing physical function increased sequentially through low-, middle-, and high-score categories based on the model (2.5%, 11.7% and 22.3% in the validation cohort, respectively). The loss of physical function was strongly associated with 1-year mortality [adjusted odds ratio 2.48 (95% confidence interval 1.26-4.91)]. We developed and validated a risk prediction model with good predictive performance for loss of physical function in elderly hemodialysis patients. Our simple prediction model may help physicians and patients make more informed decisions for healthy longevity. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA.
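A points-based score of the kind described (points assigned from regression coefficients, summed, then binned into low/middle/high categories) can be sketched as follows. All point values and cut-offs here are hypothetical, not the published model's:

```python
# Hypothetical point assignments for the six predictors; the published
# model's points, derived from its regression coefficients, differ.
POINTS = {"age_75_plus": 2, "male": 1, "dementia": 3,
          "poor_mental_health": 2, "no_moderate_activity": 2,
          "difficulty_stairs": 3}

def risk_category(patient, low_cut=3, high_cut=7):
    """Sum points for the predictors present, then bin the total score."""
    score = sum(pts for flag, pts in POINTS.items() if patient.get(flag))
    if score < low_cut:
        return score, "low"
    return score, ("middle" if score < high_cut else "high")

print(risk_category({"dementia": True, "male": True}))
```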

  10. A diagnostic model for the detection of sensitization to wheat allergens was developed and validated in bakery workers

    NARCIS (Netherlands)

    Suarthana, Eva; Vergouwe, Yvonne; Moons, Karel G.; de Monchy, Jan; Grobbee, Diederick; Heederik, Dick; Meijer, Evert

    Objectives: To develop and validate a prediction model to detect sensitization to wheat allergens in bakery workers. Study Design and Setting: The prediction model was developed in 867 Dutch bakery workers (development set, prevalence of sensitization 13%) and included questionnaire items (candidate

  11. Development and validation of risk models and molecular diagnostics to permit personalized management of cancer.

    Science.gov (United States)

    Pu, Xia; Ye, Yuanqing; Wu, Xifeng

    2014-01-01

    Despite the advances made in cancer management over the past few decades, improvements in cancer diagnosis and prognosis are still poor, highlighting the need for individualized strategies. Toward this goal, risk prediction models and molecular diagnostic tools have been developed, tailoring each step of risk assessment from diagnosis to treatment and clinical outcomes based on the individual's clinical, epidemiological, and molecular profiles. These approaches hold increasing promise for delivering a new paradigm to maximize the efficiency of cancer surveillance and efficacy of treatment. However, they require stringent study design, methodology development, comprehensive assessment of biomarkers and risk factors, and extensive validation to ensure their overall usefulness for clinical translation. In the current study, the authors conducted a systematic review using breast cancer as an example and provide general guidelines for risk prediction models and molecular diagnostic tools, including development, assessment, and validation. © 2013 American Cancer Society.

  12. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    Science.gov (United States)

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid
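The MDPE and MDAPE metrics used to compare the models are straightforward to compute. A sketch (the sign convention for prediction error varies across the pharmacokinetics literature; (predicted − observed)/observed is assumed here):

```python
import statistics

def mdpe_mdape(predicted, observed):
    """Median Prediction Error (%) and Median Absolute Prediction Error (%),
    with prediction error defined as (predicted - observed) / observed."""
    pe = [100.0 * (p - o) / o for p, o in zip(predicted, observed)]
    return statistics.median(pe), statistics.median(abs(e) for e in pe)

print(mdpe_mdape([2.0, 4.0, 6.0], [2.0, 2.0, 4.0]))
```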

  13. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    Directory of Open Access Journals (Sweden)

    Anna Gomes

    Full Text Available Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to

  14. Development and Validation of a Constitutive Model for Dental Composites during the Curing Process

    Science.gov (United States)

    Wickham Kolstad, Lauren

    Debonding is a critical failure mode of dental composites used for dental restorations. Debonding can be assessed by comparing the shrinkage stress of the composite to the debonding strength of the adhesive that bonds it to the tooth surface. It is difficult to measure shrinkage stress experimentally. In this study, finite element analysis is used to predict the stress in the composite during cure. A new constitutive law is presented that will allow composite developers to evaluate composite shrinkage stress at early stages of material development. Shrinkage stress and shrinkage strain data were gathered for three dental resins: Z250, Z350, and P90. These experimental data were used to develop a constitutive model for the Young's modulus of the dental composite as a function of time during cure. A Maxwell model, a spring and dashpot in series, was used to simulate the composite. The compliance of the shrinkage stress device was also taken into account by including a spring in series with the Maxwell model. A coefficient of thermal expansion was also determined for internal loading of the composite by dividing shrinkage strain by time. Three FEA models are presented. A spring-disk model validates that the constitutive law is self-consistent. A quarter cuspal deflection model uses separate experimental data to verify that the constitutive law is valid. Finally, an axisymmetric tooth model is used to predict interfacial stresses in the composite. These stresses are compared to the debonding strength to check whether the composite debonds. The new constitutive model accurately predicted cuspal deflection data. Predictions for interfacial bond stress in the tooth model compare favorably with debonding characteristics observed in practice for dental resins.
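A Maxwell element's stress response under a prescribed strain rate, the core of the constitutive law described, can be sketched with a forward-Euler update (the modulus function, strain rate and relaxation time below are illustrative, not the fitted cure-dependent properties):

```python
def maxwell_stress(times, modulus, strain_rate, tau):
    """Incremental stress update for a Maxwell element (spring and dashpot
    in series): dsigma/dt = E(t) * deps/dt - sigma / tau, forward Euler."""
    sigma, history = 0.0, []
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        e = modulus(times[i])  # time-dependent Young's modulus during cure
        sigma += (e * strain_rate - sigma / tau) * dt
        history.append(sigma)
    return history

# Illustrative: constant 1000 MPa modulus, 1%/h shrinkage strain rate.
stresses = maxwell_stress([i * 0.1 for i in range(11)],
                          lambda t: 1000.0, 0.01, 5.0)
```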

  15. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    Directory of Open Access Journals (Sweden)

    Daan Nieboer

    Full Text Available External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from the development to the external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in a validation set. We concentrated on two scenarios: (1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and (2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.

  16. Model development and experimental validation of capnophilic lactic fermentation and hydrogen synthesis by Thermotoga neapolitana.

    Science.gov (United States)

    Pradhan, Nirakar; Dipasquale, Laura; d'Ippolito, Giuliana; Fontana, Angelo; Panico, Antonio; Pirozzi, Francesco; Lens, Piet N L; Esposito, Giovanni

    2016-08-01

    The aim of the present study was to develop a kinetic model for a recently proposed unique and novel metabolic process called capnophilic (CO2-requiring) lactic fermentation (CLF) pathway in Thermotoga neapolitana. The model was based on Monod kinetics and the mathematical expressions were developed to enable the simulation of biomass growth, substrate consumption and product formation. The calibrated kinetic parameters such as maximum specific uptake rate (k), semi-saturation constant (kS), biomass yield coefficient (Y) and endogenous decay rate (kd) were 1.30 h(-1), 1.42 g/L, 0.1195 and 0.0205 h(-1), respectively. A high correlation (>0.98) was obtained between the experimental data and model predictions for both model validation and cross validation processes. An increase of the lactate production in the range of 40-80% was obtained through CLF pathway compared to the classic dark fermentation model. The proposed kinetic model is the first mechanistically based model for the CLF pathway. This model provides useful information to improve the knowledge about how acetate and CO2 are recycled back by Thermotoga neapolitana to produce lactate without compromising the overall hydrogen yield. Copyright © 2016 Elsevier Ltd. All rights reserved.
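The Monod expressions above, with the calibrated parameters reported, can be integrated numerically. A forward-Euler sketch (initial conditions and time step are illustrative; the published model also tracks product formation, omitted here):

```python
def simulate_monod(s0, x0, k=1.30, ks=1.42, y=0.1195, kd=0.0205,
                   t_end=24.0, dt=0.01):
    """Forward-Euler integration of Monod growth kinetics:
       dS/dt = -k * X * S / (kS + S)
       dX/dt =  Y * k * X * S / (kS + S) - kd * X
    using the calibrated parameters reported in the abstract."""
    s, x = s0, x0
    for _ in range(int(t_end / dt)):
        uptake = k * x * s / (ks + s)   # substrate uptake rate, g/L/h
        s = max(s - uptake * dt, 0.0)   # substrate cannot go negative
        x += (y * uptake - kd * x) * dt
    return s, x

# Illustrative start: 10 g/L substrate, 0.1 g/L biomass.
print(simulate_monod(s0=10.0, x0=0.1))
```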

  17. Predicting the 6-month risk of severe hypoglycemia among adults with diabetes: Development and external validation of a prediction model.

    Science.gov (United States)

    Schroeder, Emily B; Xu, Stan; Goodrich, Glenn K; Nichols, Gregory A; O'Connor, Patrick J; Steiner, John F

    2017-07-01

    To develop and externally validate a prediction model for the 6-month risk of a severe hypoglycemic event among individuals with pharmacologically treated diabetes. The development cohort consisted of 31,674 Kaiser Permanente Colorado members with pharmacologically treated diabetes (2007-2015). The validation cohorts consisted of 38,764 Kaiser Permanente Northwest members and 12,035 HealthPartners members. Variables were chosen that would be available in electronic health records. We developed 16-variable and 6-variable models, using a Cox counting model process that allows for the inclusion of multiple 6-month observation periods per person. Across the three cohorts, there were 850,992 6-month observation periods, and 10,448 periods with at least one severe hypoglycemic event. The six-variable model contained age, diabetes type, HgbA1c, eGFR, history of a hypoglycemic event in the prior year, and insulin use. Both prediction models performed well, with good calibration and c-statistics of 0.84 and 0.81 for the 16-variable and 6-variable models, respectively. In the external validation cohorts, the c-statistics were 0.80-0.84. We developed and validated two prediction models for predicting the 6-month risk of hypoglycemia. The 16-variable model had slightly better performance than the 6-variable model, but in some practice settings, use of the simpler model may be preferred. Copyright © 2017 Elsevier Inc. All rights reserved.
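The c-statistic quoted above is the probability that a randomly chosen case with the outcome receives a higher predicted risk than a randomly chosen case without it. A brute-force sketch (O(n·m) over all pairs, fine for illustration):

```python
def c_statistic(scores_events, scores_nonevents):
    """Concordance: fraction of event/non-event pairs in which the event
    case received the higher predicted risk (ties count one half)."""
    concordant = ties = 0
    for e in scores_events:
        for n in scores_nonevents:
            if e > n:
                concordant += 1
            elif e == n:
                ties += 1
    total = len(scores_events) * len(scores_nonevents)
    return (concordant + 0.5 * ties) / total

print(c_statistic([0.9, 0.8], [0.1, 0.2]))
```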

  18. Developing and Validating a Predictive Model for Stroke Progression

    Directory of Open Access Journals (Sweden)

    L.E. Craig

    2011-12-01

    Full Text Available Background: Progression is believed to be a common and important complication in acute stroke, and has been associated with increased mortality and morbidity. Reliable identification of predictors of early neurological deterioration could potentially benefit routine clinical care. The aim of this study was to identify predictors of early stroke progression using two independent patient cohorts. Methods: Two patient cohorts were used for this study – the first cohort formed the training data set, which included consecutive patients admitted to an urban teaching hospital between 2000 and 2002, and the second cohort formed the test data set, which included patients admitted to the same hospital between 2003 and 2004. A standard definition of stroke progression was used. The first cohort (n = 863) was used to develop the model. Variables that were statistically significant on univariate analysis were entered into the model and then removed (p > 0.1) in turn. The second cohort (n = 216) was used to test the performance of the model. The performance of the predictive model was assessed in terms of both calibration and discrimination. Multiple imputation methods were used for dealing with the missing values. Results: Variables shown to be significant predictors of stroke progression were conscious level, history of coronary heart disease, presence of hyperosmolarity, CT lesion, living alone on admission, Oxfordshire Community Stroke Project classification, presence of pyrexia and smoking status. The model appears to have reasonable discriminative properties [the median receiver-operating characteristic curve value was 0.72 (range 0.72–0.73)] and to fit well with the observed data, which is indicated by the high goodness-of-fit p value [the median p value from the Hosmer-Lemeshow test was 0.90 (range 0.50–0.92)]. Conclusion: The predictive model developed in this study contains variables that can be easily collected in practice, therefore increasing its usability in clinical practice. Using this analysis approach, the

  19. Developing and validating a predictive model for stroke progression.

    Science.gov (United States)

    Craig, L E; Wu, O; Gilmour, H; Barber, M; Langhorne, P

    2011-01-01

    Progression is believed to be a common and important complication in acute stroke, and has been associated with increased mortality and morbidity. Reliable identification of predictors of early neurological deterioration could potentially benefit routine clinical care. The aim of this study was to identify predictors of early stroke progression using two independent patient cohorts. Two patient cohorts were used for this study - the first cohort formed the training data set, which included consecutive patients admitted to an urban teaching hospital between 2000 and 2002, and the second cohort formed the test data set, which included patients admitted to the same hospital between 2003 and 2004. A standard definition of stroke progression was used. The first cohort (n = 863) was used to develop the model. Variables that were statistically significant on univariate analysis were entered into the model and then removed (p > 0.1) in turn. The second cohort (n = 216) was used to test the performance of the model. The performance of the predictive model was assessed in terms of both calibration and discrimination. Multiple imputation methods were used for dealing with the missing values. Variables shown to be significant predictors of stroke progression were conscious level, history of coronary heart disease, presence of hyperosmolarity, CT lesion, living alone on admission, Oxfordshire Community Stroke Project classification, presence of pyrexia and smoking status. The model appears to have reasonable discriminative properties [the median receiver-operating characteristic curve value was 0.72 (range 0.72-0.73)] and to fit well with the observed data, which is indicated by the high goodness-of-fit p value [the median p value from the Hosmer-Lemeshow test was 0.90 (range 0.50-0.92)]. The predictive model developed in this study contains variables that can be easily collected in practice, therefore increasing its usability in clinical practice.
Using this analysis approach, the discrimination and calibration of the predictive model appear
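The Hosmer-Lemeshow goodness-of-fit statistic cited in this record bins subjects by predicted risk and compares observed with expected events per bin. A sketch (the p value would then come from a chi-square survival function with groups − 2 degrees of freedom; the data below are synthetic):

```python
def hosmer_lemeshow(pred, obs, groups=10):
    """Hosmer-Lemeshow statistic: rank subjects by predicted risk, split
    them into groups, and compare observed event counts with the sum of
    predicted risks (expected events) in each group. The p value comes
    from a chi-square distribution with groups - 2 degrees of freedom."""
    pairs = sorted(zip(pred, obs))
    n = len(pairs)
    stat = 0.0
    for g in range(groups):
        chunk = pairs[g * n // groups:(g + 1) * n // groups]
        m = len(chunk)
        expected = sum(p for p, _ in chunk)   # sum of predicted risks
        observed = sum(o for _, o in chunk)   # count of actual events
        if 0 < expected < m:
            stat += ((observed - expected) ** 2 / expected
                     + ((m - observed) - (m - expected)) ** 2 / (m - expected))
    return stat, groups - 2
```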

  20. Developing and Validating a Predictive Model for Stroke Progression

    Science.gov (United States)

    Craig, L.E.; Wu, O.; Gilmour, H.; Barber, M.; Langhorne, P.

    2011-01-01

    Background Progression is believed to be a common and important complication in acute stroke, and has been associated with increased mortality and morbidity. Reliable identification of predictors of early neurological deterioration could potentially benefit routine clinical care. The aim of this study was to identify predictors of early stroke progression using two independent patient cohorts. Methods Two patient cohorts were used for this study – the first cohort formed the training data set, which included consecutive patients admitted to an urban teaching hospital between 2000 and 2002, and the second cohort formed the test data set, which included patients admitted to the same hospital between 2003 and 2004. A standard definition of stroke progression was used. The first cohort (n = 863) was used to develop the model. Variables that were statistically significant on univariate analysis were entered into the model and then removed (p > 0.1) in turn. The second cohort (n = 216) was used to test the performance of the model. The performance of the predictive model was assessed in terms of both calibration and discrimination. Multiple imputation methods were used for dealing with the missing values. Results Variables shown to be significant predictors of stroke progression were conscious level, history of coronary heart disease, presence of hyperosmolarity, CT lesion, living alone on admission, Oxfordshire Community Stroke Project classification, presence of pyrexia and smoking status. The model appears to have reasonable discriminative properties [the median receiver-operating characteristic curve value was 0.72 (range 0.72–0.73)] and to fit well with the observed data, which is indicated by the high goodness-of-fit p value [the median p value from the Hosmer-Lemeshow test was 0.90 (range 0.50–0.92)]. Conclusion The predictive model developed in this study contains variables that can be easily collected in practice, therefore increasing its usability in clinical practice. Using this analysis approach, the discrimination and

  1. Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads

    Directory of Open Access Journals (Sweden)

    Jae Sang Moon

    2017-12-01

    Full Text Available Wind turbines within an array do not experience free-stream undisturbed flow fields. Rather, the flow fields on internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines considers only a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the simulated wind fields based on the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study’s overall objective is to offer efficient and validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.
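MMLR, fitting several outputs on shared predictors in a single least-squares pass, can be sketched with NumPy (the data and shapes are illustrative, not the wake parameters of the study):

```python
import numpy as np

def fit_mmlr(X, Y):
    """Multivariate multiple linear regression: one least-squares fit
    mapping the predictors in X to every output column of Y at once."""
    Xa = np.column_stack([np.ones(len(X)), np.asarray(X)])
    B, *_ = np.linalg.lstsq(Xa, np.asarray(Y), rcond=None)
    return B  # rows: intercept + predictors; columns: outputs

def predict_mmlr(B, X):
    Xa = np.column_stack([np.ones(len(X)), np.asarray(X)])
    return Xa @ B

# Two outputs driven by one predictor: y1 = 1 + 2x, y2 = 2 + 3x.
B = fit_mmlr([[0.0], [1.0], [2.0], [3.0]],
             [[1.0, 2.0], [3.0, 5.0], [5.0, 8.0], [7.0, 11.0]])
```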

  2. Development and validation of P-MODTRAN7 and P-MCScene, 1D and 3D polarimetric radiative transfer models

    Science.gov (United States)

    Hawes, Frederick T.; Berk, Alexander; Richtsmeier, Steven C.

    2016-05-01

    A validated, polarimetric, 3-dimensional simulation capability, P-MCScene, is being developed by generalizing Spectral Sciences' Monte Carlo-based synthetic scene simulation model, MCScene, to include calculation of all four Stokes components. P-MCScene polarimetric optical databases will be generated by a new version (MODTRAN7) of the government-standard MODTRAN radiative transfer algorithm. The conversion of MODTRAN6 to a polarimetric model is being accomplished by (1) introducing polarimetric data, (2) vectorizing the MODTRAN radiation calculations, and (3) integrating the newly revised and validated vector discrete ordinate model VDISORT3. Early results, presented here, demonstrate a clear pathway to the long-term goal of fully validated polarimetric models.

  3. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case, and it is residual in nature: conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model, and it can also be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: (1) Sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) or MCSA (Monte Carlo sensitivity analysis). (2) Finding the optimal domains of the input parameters; a procedure based on Monte Carlo methods and cluster techniques has been developed for this purpose. (3) Residual analysis, carried out in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model of buildings is presented, studying the behavior of building components in a test cell of LECE at CIEMAT, Spain. (Author) 17 refs

  4. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  5. Development and validation of a predictive model for excessive postpartum blood loss: A retrospective, cohort study.

    Science.gov (United States)

    Rubio-Álvarez, Ana; Molina-Alarcón, Milagros; Arias-Arias, Ángel; Hernández-Martínez, Antonio

    2018-03-01

    postpartum haemorrhage is one of the leading causes of maternal morbidity and mortality worldwide. Despite the use of uterotonic agents as a preventive measure, it remains a challenge to identify those women who are at increased risk of postpartum bleeding. to develop and validate a predictive model to assess the risk of excessive bleeding in women with vaginal birth. retrospective cohort study. "Mancha-Centro Hospital" (Spain). the predictive model was based on a derivation cohort of 2336 women between 2009 and 2011. For validation purposes, a prospective cohort of 953 women between 2013 and 2014 was employed. Women with antenatal fetal demise, multiple pregnancies and gestations under 35 weeks were excluded. we used multivariate analysis with binary logistic regression, Ridge regression and areas under the receiver operating characteristic curves to determine the predictive ability of the proposed model. there were 197 (8.43%) women with excessive bleeding in the derivation cohort and 63 (6.61%) in the validation cohort. Predictive factors in the final model were: maternal age, primiparity, duration of the first and second stages of labour, neonatal birth weight and antepartum haemoglobin levels. Accordingly, the predictive ability of this model in the derivation cohort was 0.90 (95% CI: 0.85-0.93), while it remained 0.83 (95% CI: 0.74-0.92) in the validation cohort. the predictive model showed excellent predictive ability in the derivation cohort, and its validation in a later population equally showed good predictive ability. This model can be employed to identify women with a higher risk of postpartum haemorrhage. Copyright © 2017 Elsevier Ltd. All rights reserved.
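Ridge regression, used here alongside logistic regression, stabilises coefficient estimates when predictors (e.g. maternal age, parity, labour durations) correlate. A closed-form sketch with NumPy (data illustrative):

```python
import numpy as np

def ridge_coefficients(X, y, alpha=1.0):
    """Closed-form ridge estimate (X'X + alpha*I)^-1 X'y; the penalty
    shrinks coefficients toward zero, stabilising the fit when the
    design matrix is ill-conditioned."""
    X = np.asarray(X, dtype=float)
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(p),
                           X.T @ np.asarray(y, dtype=float))

# With alpha = 0 this reduces to ordinary least squares.
print(ridge_coefficients([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]],
                         [2.0, 3.0, 5.0], alpha=0.0))
```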

  6. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment.
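
    The decision rule described above — compare a likelihood ratio against a threshold derived from decision costs and hypothesis priors — can be sketched as follows. The Gaussian error distributions, cost values and priors below are illustrative assumptions, not the paper's exact formulation.

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(observations, prediction, sigma_valid, sigma_invalid):
    """L = P(data | model valid) / P(data | model invalid). Here the
    'valid' hypothesis centres the measurement error on the model
    prediction with noise sigma_valid; the 'invalid' hypothesis widens
    it to sigma_invalid. These distributional choices are assumptions."""
    num = den = 1.0
    for y in observations:
        num *= normal_pdf(y, prediction, sigma_valid)
        den *= normal_pdf(y, prediction, sigma_invalid)
    return num / den

def decision_threshold(cost_false_accept, cost_false_reject, prior_valid):
    # Bayes-risk-minimising rule: accept the model when
    # L > (C_false_accept * P(invalid)) / (C_false_reject * P(valid)).
    return (cost_false_accept * (1 - prior_valid)) / (cost_false_reject * prior_valid)

obs = [9.8, 10.3, 10.1]                      # hypothetical test measurements
L = likelihood_ratio(obs, prediction=10.0, sigma_valid=0.5, sigma_invalid=2.0)
t = decision_threshold(cost_false_accept=10.0, cost_false_reject=1.0, prior_valid=0.5)
print("accept model" if L > t else "reject model")
```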

  7. Development and validation of risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team.

    Science.gov (United States)

    Harrison, David A; Patel, Krishna; Nixon, Edel; Soar, Jasmeet; Smith, Gary B; Gwinnutt, Carl; Nolan, Jerry P; Rowan, Kathryn M

    2014-08-01

    The National Cardiac Arrest Audit (NCAA) is the UK national clinical audit for in-hospital cardiac arrest. To make fair comparisons among health care providers, clinical indicators require case mix adjustment using a validated risk model. The aim of this study was to develop and validate risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team in UK hospitals. Risk models for two outcomes-return of spontaneous circulation (ROSC) for greater than 20 min and survival to hospital discharge-were developed and validated using data for in-hospital cardiac arrests between April 2011 and March 2013. For each outcome, a full model was fitted and then simplified by testing for non-linearity, combining categories and stepwise reduction. Finally, interactions between predictors were considered. Models were assessed for discrimination, calibration and accuracy. 22,479 in-hospital cardiac arrests in 143 hospitals were included (14,688 development, 7791 validation). The final risk model for ROSC > 20 min included: age (non-linear), sex, prior length of stay in hospital, reason for attendance, location of arrest, presenting rhythm, and interactions between presenting rhythm and location of arrest. The model for hospital survival included the same predictors, excluding sex. Both models had acceptable performance across the range of measures, although discrimination for hospital mortality exceeded that for ROSC > 20 min (c index 0.81 versus 0.72). Validated risk models for ROSC > 20 min and hospital survival following in-hospital cardiac arrest have been developed. These models will strengthen comparative reporting in NCAA and support local quality improvement. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
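
    Alongside discrimination (the c index), the study assessed calibration. A minimal sketch of a grouped calibration check of that kind — compare mean predicted risk with the observed event rate within risk groups — using invented predictions and outcomes rather than NCAA data:

```python
def calibration_table(preds, outcomes, n_groups=3):
    """Sort patients by predicted risk, split into n_groups equal bins,
    and return (mean predicted risk, observed event rate) per bin."""
    pairs = sorted(zip(preds, outcomes))
    size = len(pairs) // n_groups
    table = []
    for g in range(n_groups):
        chunk = pairs[g * size:(g + 1) * size] if g < n_groups - 1 else pairs[g * size:]
        mean_pred = sum(p for p, _ in chunk) / len(chunk)
        obs_rate = sum(y for _, y in chunk) / len(chunk)
        table.append((round(mean_pred, 3), round(obs_rate, 3)))
    return table

# Invented predicted survival probabilities and observed outcomes.
preds = [0.05, 0.08, 0.10, 0.30, 0.35, 0.40, 0.70, 0.80, 0.90]
outcomes = [0, 0, 0, 0, 1, 0, 1, 1, 1]
for mean_pred, obs_rate in calibration_table(preds, outcomes):
    print(mean_pred, obs_rate)
```

    A well-calibrated model produces bins in which the two columns track each other closely.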

  8. Development and validation of risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team☆

    Science.gov (United States)

    Harrison, David A.; Patel, Krishna; Nixon, Edel; Soar, Jasmeet; Smith, Gary B.; Gwinnutt, Carl; Nolan, Jerry P.; Rowan, Kathryn M.

    2014-01-01

    Aim The National Cardiac Arrest Audit (NCAA) is the UK national clinical audit for in-hospital cardiac arrest. To make fair comparisons among health care providers, clinical indicators require case mix adjustment using a validated risk model. The aim of this study was to develop and validate risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team in UK hospitals. Methods Risk models for two outcomes—return of spontaneous circulation (ROSC) for greater than 20 min and survival to hospital discharge—were developed and validated using data for in-hospital cardiac arrests between April 2011 and March 2013. For each outcome, a full model was fitted and then simplified by testing for non-linearity, combining categories and stepwise reduction. Finally, interactions between predictors were considered. Models were assessed for discrimination, calibration and accuracy. Results 22,479 in-hospital cardiac arrests in 143 hospitals were included (14,688 development, 7791 validation). The final risk model for ROSC > 20 min included: age (non-linear), sex, prior length of stay in hospital, reason for attendance, location of arrest, presenting rhythm, and interactions between presenting rhythm and location of arrest. The model for hospital survival included the same predictors, excluding sex. Both models had acceptable performance across the range of measures, although discrimination for hospital mortality exceeded that for ROSC > 20 min (c index 0.81 versus 0.72). Conclusions Validated risk models for ROSC > 20 min and hospital survival following in-hospital cardiac arrest have been developed. These models will strengthen comparative reporting in NCAA and support local quality improvement. PMID:24830872

  9. Development and validation of a prediction model for insulin-associated hypoglycemia in non-critically ill hospitalized adults.

    Science.gov (United States)

    Mathioudakis, Nestoras Nicolas; Everett, Estelle; Routh, Shuvodra; Pronovost, Peter J; Yeh, Hsin-Chieh; Golden, Sherita Hill; Saria, Suchi

    2018-01-01

    To develop and validate a multivariable prediction model for insulin-associated hypoglycemia in non-critically ill hospitalized adults. We collected pharmacologic, demographic, laboratory, and diagnostic data from 128 657 inpatient days in which at least 1 unit of subcutaneous insulin was administered in the absence of intravenous insulin, total parenteral nutrition, or insulin pump use (index days). These data were used to develop multivariable prediction models for biochemical and clinically significant hypoglycemia (blood glucose (BG) of ≤70 mg/dL and model development and validation, respectively. Using predictors of age, weight, admitting service, insulin doses, mean BG, nadir BG, BG coefficient of variation (CVBG), diet status, type 1 diabetes, type 2 diabetes, acute kidney injury, chronic kidney disease (CKD), liver disease, and digestive disease, our model achieved a c-statistic of 0.77 (95% CI 0.75 to 0.78), positive likelihood ratio (+LR) of 3.5 (95% CI 3.4 to 3.6) and negative likelihood ratio (-LR) of 0.32 (95% CI 0.30 to 0.35) for prediction of biochemical hypoglycemia. Using predictors of sex, weight, insulin doses, mean BG, nadir BG, CVBG, diet status, type 1 diabetes, type 2 diabetes, CKD stage, and steroid use, our model achieved a c-statistic of 0.80 (95% CI 0.78 to 0.82), +LR of 3.8 (95% CI 3.7 to 4.0) and -LR of 0.2 (95% CI 0.2 to 0.3) for prediction of clinically significant hypoglycemia. Hospitalized patients at risk of insulin-associated hypoglycemia can be identified using validated prediction models, which may support the development of real-time preventive interventions.
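
    The positive and negative likelihood ratios reported above (+LR = 3.5, -LR = 0.32, etc.) follow directly from sensitivity and specificity at a chosen risk threshold. A small sketch with invented confusion-matrix counts, not the study's data:

```python
def likelihood_ratios(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    pos_lr = sensitivity / (1 - specificity)   # how much a positive prediction raises the odds
    neg_lr = (1 - sensitivity) / specificity   # how much a negative prediction lowers the odds
    return pos_lr, neg_lr

# Illustrative counts: 100 hypoglycemia days, 950 non-hypoglycemia days.
pos_lr, neg_lr = likelihood_ratios(tp=80, fp=190, tn=760, fn=20)
print(round(pos_lr, 2), round(neg_lr, 2))  # → 4.0 0.25
```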

  10. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal models. A survey of current practices and techniques was undertaken and evaluated using these criteria, with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made.

  11. Development and validation of the Bullying and Cyberbullying Scale for Adolescents: A multi-dimensional measurement model.

    Science.gov (United States)

    Thomas, Hannah J; Scott, James G; Coates, Jason M; Connor, Jason P

    2018-05-03

    Intervention on adolescent bullying is reliant on valid and reliable measurement of victimization and perpetration experiences across different behavioural expressions. This study developed and validated a survey tool that integrates measurement of both traditional and cyber bullying to test a theoretically driven multi-dimensional model. Adolescents from 10 mainstream secondary schools completed a baseline and follow-up survey (N = 1,217; mean age = 14 years; 66.2% male). The Bullying and Cyberbullying Scale for Adolescents (BCS-A) developed for this study comprised parallel victimization and perpetration subscales, each with 20 items. Additional measures of bullying (Olweus Global Bullying and the Forms of Bullying Scale [FBS]), as well as measures of internalizing and externalizing problems, school connectedness, social support, and personality, were used to further assess validity. Factor structure was determined, and then the suitability of items was assessed according to the following criteria: (1) factor interpretability, (2) item correlations, (3) model parsimony, and (4) measurement equivalence across victimization and perpetration experiences. The final models comprised four factors: physical, verbal, relational, and cyber. The final scale was revised to two 13-item subscales. The BCS-A demonstrated acceptable concurrent and convergent validity (internalizing and externalizing problems, school connectedness, social support, and personality), as well as predictive validity over 6 months. The BCS-A has sound psychometric properties. This tool establishes measurement equivalence across types of involvement and behavioural forms common among adolescents. An improved measurement method could add greater rigour to the evaluation of intervention programmes and also enable interventions to be tailored to subscale profiles. © 2018 The British Psychological Society.

  12. Infant bone age estimation based on fibular shaft length: model development and clinical validation

    International Nuclear Information System (INIS)

    Tsai, Andy; Stamoulis, Catherine; Bixby, Sarah D.; Breen, Micheal A.; Connolly, Susan A.; Kleinman, Paul K.

    2016-01-01

    Bone age in infants (<1 year old) is generally estimated using hand/wrist or knee radiographs, or by counting ossification centers. The accuracy and reproducibility of these techniques are largely unknown. To develop and validate an infant bone age estimation technique using fibular shaft length and compare it to conventional methods. We retrospectively reviewed negative skeletal surveys of 247 term-born low-risk-of-abuse infants (no persistent child protection team concerns) from July 2005 to February 2013, and randomized them into two datasets: (1) model development (n = 123) and (2) model testing (n = 124). Three pediatric radiologists measured all fibular shaft lengths. An ordinary linear regression model was fitted to dataset 1, and the model was evaluated using dataset 2. Readers also estimated infant bone ages in dataset 2 using (1) the hemiskeleton method of Sontag, (2) the hemiskeleton method of Elgenmark, (3) the hand/wrist atlas of Greulich and Pyle, and (4) the knee atlas of Pyle and Hoerr. For validation, we selected lower-extremity radiographs of 114 normal infants with no suspicion of abuse. Readers measured the fibulas and also estimated bone ages using the knee atlas. Bone age estimates from the proposed method were compared to the other methods. The proposed method outperformed all other methods in accuracy and reproducibility. Its accuracy was similar for the testing and validating datasets, with root-mean-square error of 36 days and 37 days; mean absolute error of 28 days and 31 days; and error variability of 22 days and 20 days, respectively. This study provides strong support for an infant bone age estimation technique based on fibular shaft length as a more accurate alternative to conventional methods. (orig.)
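
    The modelling step described above — an ordinary linear regression of bone age on fibular shaft length, evaluated on a held-out set with the error summaries the study reports (root-mean-square error and mean absolute error) — can be sketched as follows. All measurements below are synthetic, not the study's data.

```python
import math

def fit_ols(x, y):
    """Closed-form one-predictor least squares: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

def rmse_mae(model, x, y):
    slope, intercept = model
    errors = [slope * xi + intercept - yi for xi, yi in zip(x, y)]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    mae = sum(abs(e) for e in errors) / len(errors)
    return rmse, mae

# Synthetic (fibular shaft length in mm, bone age in days) pairs.
train_x, train_y = [52, 60, 68, 75, 83], [30, 90, 150, 210, 270]
test_x, test_y = [56, 71, 80], [60, 180, 240]
model = fit_ols(train_x, train_y)
print(rmse_mae(model, test_x, test_y))
```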

  13. Infant bone age estimation based on fibular shaft length: model development and clinical validation

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Andy; Stamoulis, Catherine; Bixby, Sarah D.; Breen, Micheal A.; Connolly, Susan A.; Kleinman, Paul K. [Boston Children' s Hospital, Harvard Medical School, Department of Radiology, Boston, MA (United States)

    2016-03-15

    Bone age in infants (<1 year old) is generally estimated using hand/wrist or knee radiographs, or by counting ossification centers. The accuracy and reproducibility of these techniques are largely unknown. To develop and validate an infant bone age estimation technique using fibular shaft length and compare it to conventional methods. We retrospectively reviewed negative skeletal surveys of 247 term-born low-risk-of-abuse infants (no persistent child protection team concerns) from July 2005 to February 2013, and randomized them into two datasets: (1) model development (n = 123) and (2) model testing (n = 124). Three pediatric radiologists measured all fibular shaft lengths. An ordinary linear regression model was fitted to dataset 1, and the model was evaluated using dataset 2. Readers also estimated infant bone ages in dataset 2 using (1) the hemiskeleton method of Sontag, (2) the hemiskeleton method of Elgenmark, (3) the hand/wrist atlas of Greulich and Pyle, and (4) the knee atlas of Pyle and Hoerr. For validation, we selected lower-extremity radiographs of 114 normal infants with no suspicion of abuse. Readers measured the fibulas and also estimated bone ages using the knee atlas. Bone age estimates from the proposed method were compared to the other methods. The proposed method outperformed all other methods in accuracy and reproducibility. Its accuracy was similar for the testing and validating datasets, with root-mean-square error of 36 days and 37 days; mean absolute error of 28 days and 31 days; and error variability of 22 days and 20 days, respectively. This study provides strong support for an infant bone age estimation technique based on fibular shaft length as a more accurate alternative to conventional methods. (orig.)

  14. Development and validation of extensive growth and growth boundary models for psychrotolerant pseudomonads in seafood, meat and vegetable products

    DEFF Research Database (Denmark)

    Martinez Rios, Veronica; Dalgaard, Paw

    Extensive growth and growth boundary models were developed and validated for psychrotolerant pseudomonads growing in seafood, meat and vegetable products. The new models were developed by expanding an existing cardinal parameter-type model for growth of pseudomonads in milk (Martinez-Rios et al. ...), when observed and predicted μmax-values were compared. Thus, on average, μmax-values for seafood and meat products were overestimated by 14%. Additionally, the reference growth rate parameter μref25˚C was calibrated by fitting the model to 21 μmax-values in vegetable products. This resulted in a μref25˚C-value of 0.54 1/h. The calibrated vegetable model was successfully validated using 51 μmax-values for psychrotolerant pseudomonads in vegetables. Average bias and accuracy factor values of 1.24 and 1.38 were obtained, respectively. Lag time models were developed by using relative lag times from...
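
    The bias and accuracy factors quoted above (1.24 and 1.38) are the standard validation indices of predictive microbiology (Ross's Bf and Af), computed from paired predicted and observed maximum specific growth rates. A minimal sketch with invented μmax values:

```python
import math

def bias_accuracy_factors(predicted, observed):
    logs = [math.log10(p / o) for p, o in zip(predicted, observed)]
    bias = 10 ** (sum(logs) / len(logs))                       # Bf > 1: growth over-predicted on average
    accuracy = 10 ** (sum(abs(l) for l in logs) / len(logs))   # Af: average fold-discrepancy
    return bias, accuracy

pred_mu = [0.20, 0.35, 0.50, 0.12]   # predicted maximum specific growth rates (1/h), invented
obs_mu = [0.18, 0.30, 0.55, 0.10]    # observed values, invented
bf, af = bias_accuracy_factors(pred_mu, obs_mu)
print(round(bf, 2), round(af, 2))  # → 1.09 1.14
```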

  15. Multivariable prediction model for suspected giant cell arteritis: development and validation

    Directory of Open Access Journals (Sweden)

    Ing EB

    2017-11-01

    Full Text Available Edsel B Ing,1 Gabriela Lahaie Luna,2 Andrew Toren,3 Royce Ing,4 John J Chen,5 Nitika Arora,6 Nurhan Torun,7 Otana A Jakpor,8 J Alexander Fraser,9 Felix J Tyndel,10 Arun NE Sundaram,10 Xinyang Liu,11 Cindy TY Lam,1 Vivek Patel,12 Ezekiel Weis,13 David Jordan,14 Steven Gilberg,14 Christian Pagnoux,15 Martin ten Hove2 1Department of Ophthalmology and Vision Sciences, University of Toronto Medical School, Toronto, 2Department of Ophthalmology, Queen’s University, Kingston, ON, 3Department of Ophthalmology, University of Laval, Quebec, QC, 4Toronto Eyelid, Strabismus and Orbit Surgery Clinic, Toronto, ON, Canada; 5Mayo Clinic, Department of Ophthalmology and Neurology, 6Mayo Clinic, Department of Ophthalmology, Rochester, MN, 7Department of Surgery, Division of Ophthalmology, Harvard Medical School, Boston, MA, 8Harvard Medical School, Boston, MA, USA; 9Department of Clinical Neurological Sciences and Ophthalmology, Western University, London, 10Department of Medicine, University of Toronto Medical School, Toronto, ON, Canada; 11Department of Medicine, Fudan University Shanghai Medical College, Shanghai, People’s Republic of China; 12Roski Eye Institute, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; 13Departments of Ophthalmology, Universities of Alberta and Calgary, Edmonton and Calgary, AB, 14Department of Ophthalmology, University of Ottawa, Ottawa, ON, 15Vasculitis Clinic, Mount Sinai Hospital, Toronto, ON, Canada. Purpose: To develop and validate a diagnostic prediction model for patients with suspected giant cell arteritis (GCA). Methods: A retrospective review of records of consecutive adult patients undergoing temporal artery biopsy (TABx) for suspected GCA was conducted at seven university centers. The pathologic diagnosis was considered the final diagnosis. The predictor variables were age, gender, new onset headache, clinical temporal artery abnormality, jaw claudication, ischemic vision loss (VL), diplopia

  16. Locating the Seventh Cervical Spinous Process: Development and Validation of a Multivariate Model Using Palpation and Personal Information.

    Science.gov (United States)

    Ferreira, Ana Paula A; Póvoa, Luciana C; Zanier, José F C; Ferreira, Arthur S

    2017-02-01

    The aim of this study was to develop and validate a multivariate prediction model, guided by palpation and personal information, for locating the seventh cervical spinous process (C7SP). A single-blinded, cross-sectional study at a primary to tertiary health care center was conducted for model development and temporal validation. One hundred sixty participants were prospectively included for the model development (n = 80) and time-split validation (n = 80) stages. The C7SP was located using the thorax-rib static method (TRSM). Participants underwent chest radiography for assessment of the inner body structure located with TRSM and using radio-opaque markers placed over the skin. Age, sex, height, body mass, body mass index, and vertex-marker distance (DV-M) were used to predict the distance from the C7SP to the vertex (DV-C7). Multivariate linear regression modeling, limits of agreement plot, histogram of residues, receiver operating characteristic curves, and confusion tables were analyzed. The multivariate linear prediction model for DV-C7 (in centimeters) was DV-C7 = 0.986DV-M + 0.018(mass) + 0.014(age) - 1.008. Receiver operating characteristic curves had better discrimination of DV-C7 (area under the curve = 0.661; 95% confidence interval = 0.541-0.782; P = .015) than DV-M (area under the curve = 0.480; 95% confidence interval = 0.345-0.614; P = .761), with respective cutoff points at 23.40 cm (sensitivity = 41%, specificity = 63%) and 24.75 cm (sensitivity = 69%, specificity = 52%). The C7SP was correctly located more often when using predicted DV-C7 in the validation sample than when using the TRSM in the development sample: n = 53 (66%) vs n = 32 (40%), P information. Copyright © 2016. Published by Elsevier Inc.
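
    The prediction equation reported in the abstract transcribes directly into a small function. The coefficients come from the source; the example patient values are hypothetical.

```python
def predict_dv_c7(dv_m_cm, mass_kg, age_years):
    """Predicted vertex-to-C7SP distance (cm) from the palpated
    vertex-marker distance (cm), body mass (kg) and age (years),
    per the published model DV-C7 = 0.986 DV-M + 0.018 mass + 0.014 age - 1.008."""
    return 0.986 * dv_m_cm + 0.018 * mass_kg + 0.014 * age_years - 1.008

# Hypothetical patient: vertex-marker distance 24 cm, 70 kg, 40 years old.
print(round(predict_dv_c7(24.0, 70.0, 40.0), 2))  # → 24.48
```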

  17. Development and validation of corium oxidation model for the VAPEX code

    International Nuclear Information System (INIS)

    Blinkov, V.N.; Melikhov, V.I.; Davydov, M.V.; Melikhov, O.I.; Borovkova, E.M.

    2011-01-01

    In light water reactor core melt accidents, the molten fuel (corium) can be brought into contact with coolant water in the course of the melt relocation in-vessel and ex-vessel, as well as in an accident mitigation action of water addition. Mechanical energy release from such an interaction is of interest in evaluating the structural integrity of the reactor vessel as well as of the containment. Usually, the source of the energy release is considered to be the rapid transfer of heat from the molten fuel to the water ('vapor explosion'). When the fuel contains a chemically reactive metal component, there can be an additional source of energy release: the heat release and hydrogen production due to the metal-water chemical reaction. At the Electrogorsk Research and Engineering Center, the computer code VAPEX (VAPor EXplosion) has been developed for analysis of the molten fuel coolant interaction. A multifield approach is used to model the dynamics of the following phases: water, steam, melt jet, melt droplets, and debris. The VAPEX code was successfully validated on FARO experimental data. Hydrogen generation was observed in the FARO tests even though the corium did not contain a metal component. Because the reason for this hydrogen generation was not clear, a simplified empirical model of hydrogen generation was implemented in the VAPEX code to account for the contribution of hydrogen to the pressure increase. This paper describes a new, more detailed model of hydrogen generation due to the metal-water chemical reaction and the results of its validation on the ZREX experiments. (orig.)
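
    For zirconium, the metal-water reaction underlying such hydrogen-generation models is Zr + 2H2O → ZrO2 + 2H2. A stoichiometric sketch of hydrogen yield and heat release per kilogram of reacted metal follows; the molar masses are standard values, and the reaction enthalpy (~584 kJ per mol Zr, exothermic) is an assumed literature value, not a figure taken from the paper.

```python
M_ZR = 91.224e-3   # kg/mol, zirconium
M_H2 = 2.016e-3    # kg/mol, hydrogen gas
DH_RX = 584e3      # J released per mol Zr reacted (assumed literature value)

def zr_water_reaction(mass_zr_reacted_kg):
    """Hydrogen mass (kg) and heat release (J) for a given mass of
    zirconium fully oxidised by steam: Zr + 2H2O -> ZrO2 + 2H2."""
    mol_zr = mass_zr_reacted_kg / M_ZR
    h2_mass_kg = 2 * mol_zr * M_H2   # 2 mol H2 per mol Zr
    heat_j = mol_zr * DH_RX
    return h2_mass_kg, heat_j

h2, q = zr_water_reaction(1.0)
print(round(h2, 4), round(q / 1e6, 1))  # kg H2 and MJ per kg Zr reacted
```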

  18. Development and validation of a predictive risk model for all-cause mortality in type 2 diabetes.

    Science.gov (United States)

    Robinson, Tom E; Elley, C Raina; Kenealy, Tim; Drury, Paul L

    2015-06-01

    Type 2 diabetes is common and is associated with an approximately 80% increase in the rate of mortality. Management decisions may be assisted by an estimate of the patient's absolute risk of adverse outcomes, including death. This study aimed to derive a predictive risk model for all-cause mortality in type 2 diabetes. We used primary care data from a large national multi-ethnic cohort of patients with type 2 diabetes in New Zealand and linked mortality records to develop a predictive risk model for the 5-year risk of mortality. We then validated this model using information from a separate cohort of patients with type 2 diabetes. 26,864 people were included in the development cohort, with a median follow-up time of 9.1 years. We developed three models, initially using demographic information and then progressively more clinical detail. The final model, which also included markers of renal disease, gave the best prediction of all-cause mortality, with a C-statistic of 0.80 in the development cohort and 0.79 in the validation cohort (7610 people), and was well calibrated. Ethnicity was a major factor, with hazard ratios of 1.37 for indigenous Maori, 0.41 for East Asian and 0.55 for Indo-Asian compared with European (P<0.001). We have developed a model using information usually available in primary care that provides a good assessment of a patient's risk of death. Results are similar to models previously published from smaller cohorts in other countries and apply to a wider range of patient ethnic groups. Copyright © 2015. Published by Elsevier Ireland Ltd.

  19. A clinical reasoning model focused on clients' behaviour change with reference to physiotherapists: its multiphase development and validation.

    Science.gov (United States)

    Elvén, Maria; Hochwälder, Jacek; Dean, Elizabeth; Söderlund, Anne

    2015-05-01

    A biopsychosocial approach and behaviour change strategies have long been proposed to serve as a basis for addressing current multifaceted health problems. This emphasis has implications for clinical reasoning of health professionals. This study's aim was to develop and validate a conceptual model to guide physiotherapists' clinical reasoning focused on clients' behaviour change. Phase 1 consisted of the exploration of existing research and the research team's experiences and knowledge. Phases 2a and 2b consisted of validation and refinement of the model based on input from physiotherapy students in two focus groups (n = 5 per group) and from experts in behavioural medicine (n = 9). Phase 1 generated theoretical and evidence bases for the first version of a model. Phases 2a and 2b established the validity and value of the model. The final model described clinical reasoning focused on clients' behaviour change as a cognitive, reflective, collaborative and iterative process with multiple interrelated levels that included input from the client and physiotherapist, a functional behavioural analysis of the activity-related target behaviour and the selection of strategies for behaviour change. This unique model, theory- and evidence-informed, has been developed to help physiotherapists to apply clinical reasoning systematically in the process of behaviour change with their clients.

  20. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater-level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not previously been reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  1. Development and validation of a septoplasty training model using 3-dimensional printing technology.

    Science.gov (United States)

    AlReefi, Mahmoud A; Nguyen, Lily H P; Mongeau, Luc G; Haq, Bassam Ul; Boyanapalli, Siddharth; Hafeez, Nauman; Cegarra-Escolano, Francois; Tewfik, Marc A

    2017-04-01

    Providing alternative training modalities may improve trainees' ability to perform septoplasty. Three-dimensional printing has been shown to be a powerful tool in surgical training. The objectives of this study were to explain the development of our 3-dimensional (3D) printed septoplasty training model, to assess its face and content validity, and to present evidence supporting its ability to distinguish between levels of surgical proficiency. Imaging data of a patient with a nasal septal deviation was selected for printing. Printing materials reproducing the mechanical properties of human tissues were selected based on literature review and prototype testing. Eight expert rhinologists, 6 senior residents, and 6 junior residents performed endoscopic septoplasties on the model and completed a postsimulation survey. Performance metrics in quality (final product analysis), efficiency (time), and safety (eg, perforation length, nares damage) were recorded and analyzed in a study-blind manner. The model was judged to be anatomically correct and the steps performed realistic, with scores of 4.05 ± 0.82 and 4.2 ± 1, respectively, on a 5-point Likert scale. Ninety-two percent of residents desired the simulator to be integrated into their teaching curriculum. There was a significant difference (p simulator training models for septoplasty. Our model incorporates 2 different materials mixed into the 3 relevant consistencies necessary to simulate septoplasty. Our findings provide evidence supporting the validity of the model. © 2016 ARS-AAOA, LLC.

  2. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this is a question that, undoubtedly, many people, businesses, and institutions ponder with regard to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view among both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do with how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  3. Two-phase 1D+1D model of a DMFC: development and validation on extensive operating conditions range

    Energy Technology Data Exchange (ETDEWEB)

    Casalegno, A.; Marchesi, R.; Parenti, D. [Dipartimento di Energetica, Politecnico di Milano (Italy)

    2008-02-15

    A two-phase 1D+1D model of a direct methanol fuel cell (DMFC) is developed, considering overall mass balance, methanol transport in gas phase through anode diffusion layer, methanol and water crossover. The model is quantitatively validated on an extensive range of operating conditions, 24 polarisation curves. The model accurately reproduces DMFC performance in the validation range and, outside this, it is able to predict values under feasible operating conditions. Finally, the estimations of methanol crossover flux are qualitatively and quantitatively similar to experimental measures and the main local quantities' trends are coherent with results obtained with more complex models. (Abstract Copyright [2008], Wiley Periodicals, Inc.)

  4. Development, Validation and Parametric study of a 3-Year-Old Child Head Finite Element Model

    Science.gov (United States)

    Cui, Shihai; Chen, Yue; Li, Haiyan; Ruan, ShiJie

    2015-12-01

    Traumatic brain injury caused by falls and traffic accidents is a leading cause of death and disability in children. Recently, computational finite element (FE) head models have been developed to investigate brain injury mechanisms and biomechanical responses. Based on CT data of a healthy 3-year-old child head, an FE head model with detailed anatomical structure was developed. Deep brain structures such as the white matter, gray matter, cerebral ventricles, and hippocampus were created in this FE model for the first time. The FE model was validated by reconstructing child and adult cadaver experiments and comparing the simulation results with the cadaver test data. In addition, the effects of skull stiffness on the child head's dynamic responses were further investigated. All the simulation results confirmed the good biofidelity of the FE model.

  5. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models.
The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  7. Development and validation of multivariable models to predict mortality and hospitalization in patients with heart failure

    NARCIS (Netherlands)

    Voors, Adriaan A.; Ouwerkerk, Wouter; Zannad, Faiez; van Veldhuisen, Dirk J.; Samani, Nilesh J.; Ponikowski, Piotr; Ng, Leong L.; Metra, Marco; ter Maaten, Jozine M.; Lang, Chim C.; Hillege, Hans L.; van der Harst, Pim; Filippatos, Gerasimos; Dickstein, Kenneth; Cleland, John G.; Anker, Stefan D.; Zwinderman, Aeilko H.

    2017-01-01

    Introduction From a prospective multicentre multicountry clinical trial, we developed and validated risk models to predict prospective all-cause mortality and hospitalizations because of heart failure (HF) in patients with HF. Methods and results BIOSTAT-CHF is a research programme designed to

  8. Development and validation of a mortality risk model for pediatric sepsis

    Science.gov (United States)

    Chen, Mengshi; Lu, Xiulan; Hu, Li; Liu, Pingping; Zhao, Wenjiao; Yan, Haipeng; Tang, Liang; Zhu, Yimin; Xiao, Zhenghui; Chen, Lizhang; Tan, Hongzhuan

    2017-01-01

    Abstract Pediatric sepsis is a burdensome public health problem. Assessing the mortality risk of pediatric sepsis patients, offering effective treatment guidance, and improving prognosis to reduce mortality rates are crucial. We extracted data derived from electronic medical records of pediatric sepsis patients that were collected during the first 24 hours after admission to the pediatric intensive care unit (PICU) of the Hunan Children's Hospital from January 2012 to June 2014. A total of 788 children were randomly divided into a training group (592, 75%) and a validation group (196, 25%). The risk factors for mortality among these patients were identified by conducting multivariate logistic regression in the training group. Based on the established logistic regression equation, the logit probabilities for all patients (in both groups) were calculated to verify the model's internal and external validity. According to the training group, 6 variables (brain natriuretic peptide, albumin, total bilirubin, D-dimer, lactate levels, and mechanical ventilation within 24 hours) were included in the final logistic regression model. The areas under the curves of the model were 0.854 (0.826, 0.881) and 0.844 (0.816, 0.873) in the training and validation groups, respectively. The Mortality Risk Model for Pediatric Sepsis we established in this study showed acceptable accuracy in predicting the mortality risk of pediatric sepsis patients. PMID:28514310
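    The record above summarizes model discrimination with areas under the ROC curve of about 0.85. As a reminder of what that number means, here is a minimal sketch (not the study's code; the risk values are invented) computing the AUC as a concordance probability: the chance that a randomly chosen patient who died received a higher predicted risk than a randomly chosen survivor, with ties counting one half.

```python
def auc(pos_scores, neg_scores):
    """AUC as the concordance probability over all positive/negative
    pairs; a tied pair contributes 0.5."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos_scores for n in neg_scores
    )
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical predicted risks from a fitted logistic model
died = [0.91, 0.75, 0.62, 0.58]          # non-survivors
survived = [0.40, 0.35, 0.62, 0.20, 0.15]
print(auc(died, survived))  # → 0.925
```

    An AUC of 0.5 corresponds to random ranking, 1.0 to perfect separation of deaths from survivors.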

  9. Anatomical Cystocele Recurrence: Development and Internal Validation of a Prediction Model.

    Science.gov (United States)

    Vergeldt, Tineke F M; van Kuijk, Sander M J; Notten, Kim J B; Kluivers, Kirsten B; Weemhoff, Mirjam

    2016-02-01

    To develop a prediction model that estimates the risk of anatomical cystocele recurrence after surgery. The databases of two multicenter prospective cohort studies were combined, and we performed a retrospective secondary analysis of these data. Women undergoing an anterior colporrhaphy without mesh materials and without previous pelvic organ prolapse (POP) surgery completed a questionnaire, underwent translabial three-dimensional ultrasonography, and underwent staging of POP preoperatively and postoperatively. We developed a prediction model using multivariable logistic regression and internally validated it using standard bootstrapping techniques. The performance of the prediction model was assessed by computing indices of overall performance, discriminative ability, and calibration, and its clinical utility by computing test characteristics. Of 287 included women, 149 (51.9%) had anatomical cystocele recurrence. Factors included in the prediction model were assisted delivery, preoperative cystocele stage, number of compartments involved, major levator ani muscle defects, and levator hiatal area during Valsalva. Potential predictors that were excluded after backward elimination because of high P values were age, body mass index, number of vaginal deliveries, and family history of POP. The shrinkage factor resulting from the bootstrap procedure was 0.91. After correction for optimism, Nagelkerke's R² and the Brier score were 0.15 and 0.22, respectively. This indicates satisfactory model fit. The area under the receiver operating characteristic curve of the prediction model was 71.6% (95% confidence interval 65.7-77.5). After correction for optimism, the area under the receiver operating characteristic curve was 69.7%. This prediction model, including history of assisted delivery, preoperative stage, number of compartments, levator defects, and levator hiatus, estimates the risk of anatomical cystocele recurrence.
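    The internal validation described above uses "standard bootstrapping techniques" to correct apparent performance for optimism. A toy sketch of that Harrell-style procedure follows, with a deliberately simple stand-in "model" (a single cut-point on one marker, judged by accuracy) and invented data rather than the study's logistic regression; the same logic is what produces the corrected AUC and R² quoted above.

```python
import random

def fit_threshold(xs, ys):
    """Stand-in 'model': the cut-point on one marker that maximizes
    training accuracy (predict y=1 when x >= t)."""
    best_t, best_acc = xs[0], -1.0
    for t in sorted(set(xs)):
        acc = sum((x >= t) == bool(y) for x, y in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(t, xs, ys):
    return sum((x >= t) == bool(y) for x, y in zip(xs, ys)) / len(xs)

def optimism_corrected(xs, ys, n_boot=200, seed=1):
    """Optimism bootstrap: refit on each resample, measure how much the
    refitted model flatters itself relative to the original data, and
    subtract the average optimism from the apparent performance."""
    rng = random.Random(seed)
    apparent = accuracy(fit_threshold(xs, ys), xs, ys)
    n, optimism = len(xs), 0.0
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        bx, by = [xs[i] for i in idx], [ys[i] for i in idx]
        t = fit_threshold(bx, by)
        optimism += accuracy(t, bx, by) - accuracy(t, xs, ys)
    return apparent - optimism / n_boot

# Invented marker values and binary outcomes
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
print(optimism_corrected(xs, ys))
```

    The corrected estimate is typically lower than the apparent one, which is exactly the pattern in the record above (AUC 71.6% apparent, 69.7% corrected).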

  10. Development and external validation of a risk-prediction model to predict 5-year overall survival in advanced larynx cancer.

    Science.gov (United States)

    Petersen, Japke F; Stuiver, Martijn M; Timmermans, Adriana J; Chen, Amy; Zhang, Hongzhen; O'Neill, James P; Deady, Sandra; Vander Poorten, Vincent; Meulemans, Jeroen; Wennerberg, Johan; Skroder, Carl; Day, Andrew T; Koch, Wayne; van den Brekel, Michiel W M

    2018-05-01

    TNM classification inadequately estimates patient-specific overall survival (OS). We aimed to improve this by developing a risk-prediction model for patients with advanced larynx cancer. Cohort study. We developed a risk prediction model to estimate the 5-year OS rate based on a cohort of 3,442 patients with T3T4N0N+M0 larynx cancer. The model was internally validated using bootstrapping samples and externally validated on patient data from five external centers (n = 770). The main outcome was performance of the model as tested by discrimination, calibration, and the ability to distinguish risk groups based on tertiles from the derivation dataset. The model performance was compared to a model based on T and N classification only. We included age, gender, T and N classification, and subsite as prognostic variables in the standard model. After external validation, the standard model had a significantly better fit than a model based on T and N classification alone (C statistic, 0.59 vs. 0.55, P statistic to 0.68. A risk prediction model for patients with advanced larynx cancer, consisting of readily available clinical variables, gives more accurate estimates of the 5-year survival rate than a model based on T and N classification alone. Level of Evidence: 2c. Laryngoscope, 128:1140-1145, 2018. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  11. Development and field validation of a regional, management-scale habitat model: A koala Phascolarctos cinereus case study.

    Science.gov (United States)

    Law, Bradley; Caccamo, Gabriele; Roe, Paul; Truskinger, Anthony; Brassil, Traecey; Gonsalves, Leroy; McConville, Anna; Stanton, Matthew

    2017-09-01

    Species distribution models have great potential to efficiently guide management for threatened species, especially for those that are rare or cryptic. We used MaxEnt to develop a regional-scale model for the koala Phascolarctos cinereus at a resolution (250 m) that could be used to guide management. To ensure the model was fit for purpose, we placed emphasis on validating the model using independently-collected field data. We reduced substantial spatial clustering of records in coastal urban areas using a 2-km spatial filter and by modeling separately two subregions separated by the 500-m elevational contour. A bias file was prepared that accounted for variable survey effort. Frequency of wildfire, soil type, floristics and elevation had the highest relative contribution to the model, while a number of other variables made minor contributions. The model was effective in discriminating different habitat suitability classes when compared with koala records not used in modeling. We validated the MaxEnt model at 65 ground-truth sites using independent data on koala occupancy (acoustic sampling) and habitat quality (browse tree availability). Koala bellows (n = 276) were analyzed in an occupancy modeling framework, while site habitat quality was indexed based on browse trees. Field validation demonstrated a linear increase in koala occupancy with higher modeled habitat suitability at ground-truth sites. Similarly, a site habitat quality index at ground-truth sites was correlated positively with modeled habitat suitability. The MaxEnt model provided a better fit to estimated koala occupancy than the site-based habitat quality index, probably because many variables were considered simultaneously by the model rather than just browse species. The positive relationship of the model with both site occupancy and habitat quality indicates that the model is fit for application at relevant management scales. Field-validated models of similar resolution would assist in

  12. Development of the Galaxy Chronic Obstructive Pulmonary Disease (COPD) Model Using Data from ECLIPSE: Internal Validation of a Linked-Equations Cohort Model.

    Science.gov (United States)

    Briggs, Andrew H; Baker, Timothy; Risebrough, Nancy A; Chambers, Mike; Gonzalez-McQuire, Sebastian; Ismaila, Afisi S; Exuzides, Alex; Colby, Chris; Tabberer, Maggie; Muellerova, Hana; Locantore, Nicholas; Rutten van Mölken, Maureen P M H; Lomas, David A

    2017-05-01

    The recent joint International Society for Pharmacoeconomics and Outcomes Research / Society for Medical Decision Making Modeling Good Research Practices Task Force emphasized the importance of conceptualizing and validating models. We report a new model of chronic obstructive pulmonary disease (COPD) (part of the Galaxy project) founded on a conceptual model, implemented using a novel linked-equation approach, and internally validated. An expert panel developed a conceptual model including causal relationships between disease attributes, progression, and final outcomes. Risk equations describing these relationships were estimated using data from the Evaluation of COPD Longitudinally to Identify Predictive Surrogate Endpoints (ECLIPSE) study, with costs estimated from the TOwards a Revolution in COPD Health (TORCH) study. Implementation as a linked-equation model enabled direct estimation of health service costs and quality-adjusted life years (QALYs) for COPD patients over their lifetimes. Internal validation compared 3 years of predicted cohort experience with ECLIPSE results. At 3 years, the Galaxy COPD model predictions of annual exacerbation rate and annual decline in forced expiratory volume in 1 second fell within the ECLIPSE data confidence limits, although 3-year overall survival was outside the observed confidence limits. Projections of the risk equations over time permitted extrapolation to patient lifetimes. Averaging the predicted cost/QALY outcomes for the different patients within the ECLIPSE cohort gives an estimated lifetime cost of £25,214 (undiscounted)/£20,318 (discounted) and lifetime QALYs of 6.45 (undiscounted)/5.24 (discounted) per ECLIPSE patient. A new form of model for COPD was conceptualized, implemented, and internally validated, based on a series of linked equations using epidemiological data (ECLIPSE) and cost data (TORCH). This Galaxy model predicts COPD outcomes from treatment effects on disease attributes such as lung function

  13. Polytomous diagnosis of ovarian tumors as benign, borderline, primary invasive or metastatic: development and validation of standard and kernel-based risk prediction models

    Directory of Open Access Journals (Sweden)

    Testa Antonia C

    2010-10-01

    Abstract Background Hitherto, risk prediction models for preoperative ultrasound-based diagnosis of ovarian tumors were dichotomous (benign versus malignant). We develop and validate polytomous models (models that predict more than two events) to diagnose ovarian tumors as benign, borderline, primary invasive or metastatic invasive. The main focus is on how different types of models perform and compare. Methods A multi-center dataset containing 1066 women was used for model development and internal validation, whilst another multi-center dataset of 1938 women was used for temporal and external validation. Models were based on standard logistic regression and on penalized kernel-based algorithms (least squares support vector machines and kernel logistic regression). We used true polytomous models as well as combinations of dichotomous models based on the 'pairwise coupling' technique to produce polytomous risk estimates. Careful variable selection was performed, based largely on cross-validated c-index estimates. Model performance was assessed with the dichotomous c-index (i.e. the area under the ROC curve) and a polytomous extension, and with calibration graphs. Results For all models, between 9 and 11 predictors were selected. Internal validation was successful with polytomous c-indexes between 0.64 and 0.69. For the best model, dichotomous c-indexes were between 0.73 (primary invasive vs metastatic) and 0.96 (borderline vs metastatic). On temporal and external validation, overall discrimination performance was good with polytomous c-indexes between 0.57 and 0.64. However, discrimination between primary and metastatic invasive tumors decreased to near random levels. Standard logistic regression performed well in comparison with advanced algorithms, and combining dichotomous models performed well in comparison with true polytomous models. The best model was a combination of dichotomous logistic regression models. This model is available online
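    The winning model above combines dichotomous logistic models into polytomous risk estimates via "pairwise coupling". One classic coupling rule (due to Price et al.; a simplification, not necessarily the exact variant the paper used) inverts the identity P(i | i or j) = p_i / (p_i + p_j). All the probabilities below are invented for illustration.

```python
def pairwise_coupling(r, K):
    """Combine pairwise probabilities r[(i, j)] = P(class i | i or j)
    into K class probabilities. If the r values are mutually consistent,
    p_i = 1 / (sum_{j != i} 1/r[(i, j)] - (K - 2)) recovers them exactly;
    the final normalization guards against inconsistent estimates."""
    p = [1.0 / (sum(1.0 / r[(i, j)] for j in range(K) if j != i) - (K - 2))
         for i in range(K)]
    total = sum(p)
    return [x / total for x in p]

# Pairwise estimates built to be consistent with true risks (0.5, 0.3, 0.2),
# e.g. benign / borderline / invasive in the spirit of the record above
true_p = [0.5, 0.3, 0.2]
r = {(i, j): true_p[i] / (true_p[i] + true_p[j])
     for i in range(3) for j in range(3) if i != j}
print(pairwise_coupling(r, 3))  # recovers [0.5, 0.3, 0.2] up to rounding
```

    In practice each r[(i, j)] comes from a separate dichotomous model fitted only on patients of classes i and j, which is what makes the combination attractive.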

  14. A Multivariate Model for Prediction of Obstructive Coronary Disease in Patients with Acute Chest Pain: Development and Validation

    Directory of Open Access Journals (Sweden)

    Luis Cláudio Lemos Correia

    Abstract Background: Currently, there is no validated multivariate model to predict the probability of obstructive coronary disease in patients with acute chest pain. Objective: To develop and validate a multivariate model to predict coronary artery disease (CAD) based on variables assessed at admission to the coronary care unit (CCU) due to acute chest pain. Methods: A total of 470 patients were studied, 370 utilized as the derivation sample and the subsequent 100 patients as the validation sample. As the reference standard, angiography was required to rule in CAD (stenosis ≥ 70%), while either angiography or a negative noninvasive test could be used to rule it out. As predictors, 13 baseline variables related to medical history, 14 characteristics of chest discomfort, and eight variables from physical examination or laboratory tests were tested. Results: The prevalence of CAD was 48%. By logistic regression, six variables remained independent predictors of CAD: age, male gender, relief with nitrate, signs of heart failure, positive electrocardiogram, and troponin. The area under the curve (AUC) of this final model was 0.80 (95% confidence interval [95%CI] = 0.75 - 0.84) in the derivation sample and 0.86 (95%CI = 0.79 - 0.93) in the validation sample. Hosmer-Lemeshow's test indicated good calibration in both samples (p = 0.98 and p = 0.23, respectively). Compared with a basic model containing electrocardiogram and troponin, the full model provided an AUC increment of 0.07 in both the derivation (p = 0.0002) and validation (p = 0.039) samples. Integrated discrimination improvement was 0.09 in both the derivation (p < 0.001) and validation (p < 0.0015) samples. Conclusion: A multivariate model was derived and validated as an accurate tool for estimating the pretest probability of CAD in patients with acute chest pain.
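    The calibration claim above rests on the Hosmer-Lemeshow test. A minimal sketch of the statistic itself (simplified equal-size grouping, invented data; a real test would compare the statistic to a chi-square distribution with g − 2 degrees of freedom to obtain the p-values quoted):

```python
def hosmer_lemeshow_stat(probs, outcomes, g=5):
    """Sort patients by predicted risk, split into g equal-size groups,
    and sum (observed - expected)^2 / variance across the groups."""
    pairs = sorted(zip(probs, outcomes))
    n, stat = len(pairs), 0.0
    for k in range(g):
        grp = pairs[k * n // g:(k + 1) * n // g]
        if not grp:
            continue
        m = len(grp)
        observed = sum(y for _, y in grp)
        expected = sum(p for p, _ in grp)
        p_bar = expected / m
        variance = m * p_bar * (1 - p_bar)
        if variance > 0:
            stat += (observed - expected) ** 2 / variance
    return stat

# Invented, roughly calibrated predictions: a small statistic is expected
probs = [0.2, 0.4, 0.6, 0.8]
outcomes = [0, 0, 1, 1]
print(hosmer_lemeshow_stat(probs, outcomes, g=2))
```

    A small statistic (hence a large p-value, as in the 0.98 and 0.23 above) means observed event counts track the predicted risks across risk strata.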

  15. Development and validity of a new model for assessing pressure redistribution properties of support surfaces.

    Science.gov (United States)

    Matsuo, Junko; Sugama, Junko; Sanada, Hiromi; Okuwa, Mayumi; Nakatani, Toshio; Konya, Chizuko; Sakamoto, Jirou

    2011-05-01

    Pressure ulcers are a common problem, especially in older patients. In Japan, most institutionalized older people are malnourished and show extreme bony prominence (EBP). EBP is a significant factor in the development of pressure ulcers due to increased interface pressure concentrated at the skin surface over the EBP. The use of support surfaces is recommended for the prophylaxis of pressure ulcers. However, the present equivocal criteria for evaluating the pressure redistribution of support surfaces are inadequate. Since pressure redistribution is influenced by physique and posture, evaluations using human subjects are limited. For this reason, models that can substitute for humans are necessary. We developed a new EBP model based on the anthropometric measurements, including pelvic inclination, of 100 bedridden elderly people. A comparison between the pressure distribution charts of our model and bedridden elderly subjects demonstrated that maximum contact pressure values, buttock contact pressure values, and bone prominence rates corresponded closely. This indicates that the model provides a good approximation of the features of elderly people with EBP. We subsequently examined the validity of the model through quantitative assessment of pressure redistribution functions consisting of immersion, envelopment, and contact area change. The model was able to detect differences in the hardness of urethane foam, differences in the internal pressure of an air mattress, and sequential changes during the pressure switching mode. These results demonstrate the validity of our new buttock model in evaluating pressure redistribution for a variety of surfaces. Copyright © 2010 Tissue Viability Society. Published by Elsevier Ltd. All rights reserved.

  16. Development and Validation of the Faceted Inventory of the Five-Factor Model (FI-FFM).

    Science.gov (United States)

    Watson, David; Nus, Ericka; Wu, Kevin D

    2017-06-01

    The Faceted Inventory of the Five-Factor Model (FI-FFM) is a comprehensive hierarchical measure of personality. The FI-FFM was created across five phases of scale development. It includes five facets apiece for neuroticism, extraversion, and conscientiousness; four facets within agreeableness; and three facets for openness. We present reliability and validity data obtained from three samples. The FI-FFM scales are internally consistent and highly stable over 2 weeks (retest rs ranged from .64 to .82, median r = .77). They show strong convergent and discriminant validity vis-à-vis the NEO, the Big Five Inventory, and the Personality Inventory for DSM-5. Moreover, self-ratings on the scales show moderate to strong agreement with corresponding ratings made by informants (rs ranged from .26 to .66, median r = .42). Finally, in joint analyses with the NEO Personality Inventory-3, the FI-FFM neuroticism facet scales display significant incremental validity in predicting indicators of internalizing psychopathology.

  17. Development and validation of a heuristic model for evaluation of the team performance of operators in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Byun, Seong Nam; Lee, Dhong Hoon

    2011-01-01

    Highlights: → We develop an estimation model for evaluation of the team performance of MCR operators. → To build the model, we extract team performance factors through reviewing the literature and identifying behavior markers. → We validate that the model is adaptable to the advanced MCR of nuclear power plants. → As a result, we find that the model is a systematic and objective means of measuring team performance. - Abstract: Global concerns about safety in the digital technology of the main control room (MCR) are growing as domestic and foreign nuclear power plants are developed with computerized control facilities and human-system interfaces. In a narrow space, the digital technology contributes to a control room environment that can facilitate the acquisition of all the information needed for operation. Thus, although individual performance in the advanced MCR can be further improved, there is a limit to the improvement that can be expected in team performance. Team performance depends on the organic coherence of the team as a whole rather than on the knowledge and skill of an individual operator. Moreover, good team performance improves communication between and within teams in an efficient manner, and it can thereby be conducive to addressing unsafe conditions. In this respect, it is important and necessary to develop a methodology for the evaluation of operators' teamwork or collaboration, thus enhancing operational performance in the MCR of nuclear power plants. The objectives of this research are twofold: to develop a systematic methodology for evaluation of the team performance of MCR operators in consideration of advanced MCR characteristics, and to validate that the methodology is adaptable to the advanced MCR of nuclear power plants. In order to achieve these two objectives, first, team performance factors were extracted through literature reviews and methodological study concerning team performance theories. Second, the team performance factors were identified and

  18. Performance of a pavement solar energy collector: Model development and validation

    International Nuclear Information System (INIS)

    Guldentops, Gert; Nejad, Alireza Mahdavi; Vuye, Cedric; Van den bergh, Wim; Rahbar, Nima

    2016-01-01

    Highlights: • A novel numerical model is developed that predicts the thermal behavior of a pavement solar collector. • A parametric study is conducted on the sensitivity of the system to changes in design parameters. • A new methodology is developed to perform a long-term performance analysis of the system. - Abstract: Current aims regarding environmental protection, like reduction of fossil fuel consumption and greenhouse gas emissions, require the development of new technologies. These new technologies enable the production of renewable energy, which is both cleaner and more abundant in comparison to using fossil fuels for energy production. This necessity encourages researchers to develop new ways to capture solar energy, and if possible, store it for later use. In this paper, the Pavement Solar Collector (PSC), and its use to extract low temperature thermal energy, is studied. Such a system, which harvests energy by flowing water through a heat exchanger embedded in the pavement structure, could have a significant energy output since pavement materials tend to absorb large amounts of solar radiation. The main objective of this paper is to develop a modeling framework for the PSC system and validate it with a self-instructed experiment. Such a model will allow for a detailed parametric study of the system to optimize the design, as well as an investigation of the effect of aging (e.g. decreasing solar absorptivity) on the performance of the system. A long-term energy output of the system, which has so far been lacking, is calculated based on the results of the study of weather parameters. This newly acquired data could be the start of a comprehensive data set on the performance of a PSC, which leads to a comprehensive feasibility study of the system.

  19. Development and validation of a Markov microsimulation model for the economic evaluation of treatments in osteoporosis.

    Science.gov (United States)

    Hiligsmann, Mickaël; Ethgen, Olivier; Bruyère, Olivier; Richy, Florent; Gathon, Henry-Jean; Reginster, Jean-Yves

    2009-01-01

    Markov models are increasingly used in economic evaluations of treatments for osteoporosis. Most of the existing evaluations are cohort-based Markov models missing comprehensive memory management and versatility. In this article, we describe and validate an original Markov microsimulation model to accurately assess the cost-effectiveness of prevention and treatment of osteoporosis. We developed a Markov microsimulation model with a lifetime horizon and a direct health-care cost perspective. The patient history was recorded and was used in calculations of transition probabilities, utilities, and costs. To test the internal consistency of the model, we carried out an example calculation for alendronate therapy. Then, external consistency was investigated by comparing absolute lifetime risk of fracture estimates with epidemiologic data. For women at age 70 years, with a twofold increase in the fracture risk of the average population, the costs per quality-adjusted life-year gained for alendronate therapy versus no treatment were estimated at €9105 and €15,325, respectively, under full and realistic adherence assumptions. All the sensitivity analyses in terms of model parameters and modeling assumptions were coherent with expected conclusions and absolute lifetime risk of fracture estimates were within the range of previous estimates, which confirmed both internal and external consistency of the model. Microsimulation models present some major advantages over cohort-based models, increasing the reliability of the results and being largely compatible with the existing state of the art, evidence-based literature. The developed model appears to be a valid model for use in economic evaluations in osteoporosis.
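    The key feature described above, recording each simulated patient's history and letting it feed back into transition probabilities, is what distinguishes a microsimulation from a cohort-based Markov model. A toy sketch of that mechanism follows (annual cycles; every probability, cost, and utility is invented for illustration, not taken from the study's inputs):

```python
import random

# Illustrative annual inputs (hypothetical values, not the study's)
P_FX, P_DEATH = 0.02, 0.03        # baseline fracture and death risk
RR_PRIOR_FX = 1.8                 # prior fracture raises subsequent risk
COST_FX = 5000.0                  # cost per fracture event
U_WELL, U_POSTFX = 0.85, 0.70     # annual utilities
DISCOUNT = 0.03

def simulate_patient(rng, horizon=30):
    """One patient's lifetime path; the recorded history (had_fx)
    modifies later transition probabilities and utilities."""
    had_fx, cost, qaly = False, 0.0, 0.0
    for year in range(horizon):
        if rng.random() < P_DEATH:
            break
        p_fx = P_FX * (RR_PRIOR_FX if had_fx else 1.0)
        if rng.random() < p_fx:
            had_fx = True
            cost += COST_FX / (1 + DISCOUNT) ** year
        qaly += (U_POSTFX if had_fx else U_WELL) / (1 + DISCOUNT) ** year
    return cost, qaly

rng = random.Random(42)
results = [simulate_patient(rng) for _ in range(2000)]
mean_cost = sum(c for c, _ in results) / len(results)
mean_qaly = sum(q for _, q in results) / len(results)
print(f"mean discounted cost {mean_cost:.0f}, mean QALYs {mean_qaly:.2f}")
```

    A cohort model would instead track only the fraction of the cohort in each state, losing the "prior fracture" memory unless extra states are added for every relevant history.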

  20. Developing and Validating Path-Dependent Uncertainty Estimates for use with the Regional Seismic Travel Time (RSTT) Model

    Science.gov (United States)

    Begnaud, M. L.; Anderson, D. N.; Phillips, W. S.; Myers, S. C.; Ballard, S.

    2016-12-01

    The Regional Seismic Travel Time (RSTT) tomography model has been developed to improve travel time predictions for regional phases (Pn, Sn, Pg, Lg) in order to increase seismic location accuracy, especially for explosion monitoring. The RSTT model is specifically designed to exploit regional phases for location, especially when combined with teleseismic arrivals. The latest RSTT model (version 201404um) has been released (http://www.sandia.gov/rstt). Travel time uncertainty estimates for RSTT are determined using one-dimensional (1D), distance-dependent error models, which have the benefit of being very fast to use in standard location algorithms but do not account for path-dependent variations in error or for structural inadequacy of the RSTT model (i.e., model error). Although global in extent, the RSTT tomography model is only defined in areas where data exist. A simple 1D error model does not accurately model areas where RSTT has not been calibrated. We are developing and validating a new error model for RSTT phase arrivals by mathematically deriving this multivariate model directly from a unified model of RSTT embedded into a statistical random effects model that captures distance, path, and model error effects. An initial method developed is a two-dimensional path-distributed method using residuals. The goals for any RSTT uncertainty method are for it to be both readily useful for the standard RSTT user and to improve travel time uncertainty estimates for location. We have successfully tested the new error model for Pn phases and will demonstrate the method and validation of the error model for Sn, Pg, and Lg phases.

  1. The German cervical cancer screening model: development and validation of a decision-analytic model for cervical cancer screening in Germany.

    Science.gov (United States)

    Siebert, Uwe; Sroczynski, Gaby; Hillemanns, Peter; Engel, Jutta; Stabenow, Roland; Stegmaier, Christa; Voigt, Kerstin; Gibis, Bernhard; Hölzel, Dieter; Goldie, Sue J

    2006-04-01

    We sought to develop and validate a decision-analytic model for the natural history of cervical cancer for the German health care context and to apply it to cervical cancer screening. We developed a Markov model for the natural history of cervical cancer and cervical cancer screening in the German health care context. The model reflects current German practice standards for screening, diagnostic follow-up and treatment regarding cervical cancer and its precursors. Data for disease progression and cervical cancer survival were obtained from the literature and German cancer registries. Accuracy of Papanicolaou (Pap) testing was based on meta-analyses. We performed internal and external model validation using observed epidemiological data for unscreened women from different German cancer registries. The model predicts life expectancy, incidence of detected cervical cancer cases, lifetime cervical cancer risks and mortality. The model predicted a lifetime cervical cancer risk of 3.0% and a lifetime cervical cancer mortality of 1.0%, with a peak cancer incidence of 84/100,000 at age 51 years. These results were similar to observed data from German cancer registries, German literature data and results from other international models. Based on our model, annual Pap screening could prevent 98.7% of diagnosed cancer cases and 99.6% of deaths due to cervical cancer in women completely adherent to screening and compliant to treatment. Extending the screening interval from 1 year to 2, 3 or 5 years resulted in reduced screening effectiveness. This model provides a tool for evaluating the long-term effectiveness of different cervical cancer screening tests and strategies.
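    A natural-history Markov model of the kind described can be sketched as a transition matrix applied to a cohort year by year. The four-state matrix below is deliberately simplistic and its numbers are invented, not the German calibration; it only illustrates the mechanics of propagating a cohort and accumulating incident cancer cases.

```python
# States: 0 healthy, 1 precancerous lesion, 2 cancer, 3 dead (absorbing)
# Hypothetical annual transition probabilities; each row sums to 1
T = [
    [0.97, 0.02, 0.00, 0.01],
    [0.10, 0.86, 0.03, 0.01],
    [0.00, 0.00, 0.85, 0.15],
    [0.00, 0.00, 0.00, 1.00],
]

def step(dist, T):
    """One annual cycle: redistribute the cohort across states."""
    return [sum(dist[i] * T[i][j] for i in range(4)) for j in range(4)]

dist = [1.0, 0.0, 0.0, 0.0]    # cohort starts healthy
cancer_incidence = 0.0         # cumulative fraction developing cancer
for year in range(50):
    cancer_incidence += dist[1] * T[1][2]   # precancer -> cancer this cycle
    dist = step(dist, T)
print(f"50-year cumulative cancer risk (toy inputs): {cancer_incidence:.3f}")
```

    Screening strategies are compared in such models by adding detection and treatment transitions that divert the precancer state away from cancer, then re-running the same propagation.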

  2. Development and Validation of Predictive Models of Cardiac Mortality and Transplantation in Resynchronization Therapy

    Directory of Open Access Journals (Sweden)

    Eduardo Arrais Rocha

    2015-01-01

    Full Text Available Abstract Background: 30-40% of cardiac resynchronization therapy cases do not achieve favorable outcomes. Objective: This study aimed to develop predictive models for the combined endpoint of cardiac death and transplantation (Tx at different stages of cardiac resynchronization therapy (CRT. Methods: Prospective observational study of 116 patients aged 64.8 ± 11.1 years, 68.1% of whom had functional class (FC III and 31.9% had ambulatory class IV. Clinical, electrocardiographic and echocardiographic variables were assessed by using Cox regression and Kaplan-Meier curves. Results: The cardiac mortality/Tx rate was 16.3% during the follow-up period of 34.0 ± 17.9 months. Prior to implantation, right ventricular dysfunction (RVD, ejection fraction < 25% and use of high doses of diuretics (HDD increased the risk of cardiac death and Tx by 3.9-, 4.8-, and 5.9-fold, respectively. In the first year after CRT, RVD, HDD and hospitalization due to congestive heart failure increased the risk of death at hazard ratios of 3.5, 5.3, and 12.5, respectively. In the second year after CRT, RVD and FC III/IV were significant risk factors of mortality in the multivariate Cox model. The accuracy rates of the models were 84.6% at preimplantation, 93% in the first year after CRT, and 90.5% in the second year after CRT. The models were validated by bootstrapping. Conclusion: We developed predictive models of cardiac death and Tx at different stages of CRT based on the analysis of simple and easily obtainable clinical and echocardiographic variables. The models showed good accuracy and adjustment, were validated internally, and are useful in the selection, monitoring and counseling of patients indicated for CRT.
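The Kaplan-Meier curves used in this study can be sketched with a plain product-limit estimator. The follow-up data below are invented for illustration; the study's actual data and Cox modelling are not reproduced here.

```python
# Hedged sketch: a plain Kaplan-Meier product-limit estimator of the kind
# used to describe time to cardiac death/transplantation. Data are invented.

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = event occurred, 0 = censored.
    Returns [(event time, survival probability)] at each distinct event time."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, surv, curve = n, 1.0, []
    i = 0
    while i < n:
        t = times[order[i]]
        deaths = same_time = 0
        while i < n and times[order[i]] == t:
            deaths += events[order[i]]
            same_time += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk  # product-limit step
            curve.append((t, surv))
        at_risk -= same_time
    return curve

# Illustrative follow-up data (months, event indicator)
curve = kaplan_meier([6, 12, 12, 20, 34, 34], [1, 0, 1, 0, 1, 0])
```

Each step multiplies the running survival by (1 - events/at-risk) at that event time, which is exactly the quantity plotted as a Kaplan-Meier curve.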

  3. Development and validation of a model for CANDU-6 SDS2 poison injection analysis

    International Nuclear Information System (INIS)

    Lee, B. W.; Jung, C. J.; Min, B. J.; Yoon, H. J.; Choi, J. H.; Jang, D. S.

    2002-01-01

In a CANDU-6 reactor there are two independent reactor shutdown systems. Shutdown system No. 2 (SDS2) injects liquid poison at high pressure into the moderator tank through small holes in the 6 nozzle pipes and stops the nuclear chain reaction. To ensure the safe shutdown of a reactor loaded with either DUPIC or SEU fuels, the poison curtains generated by the jets must provide quick and sufficient negative reactivity to the reactor during the early stage of the accident. In order to produce the neutron cross sections necessary for this work, the poison concentration distribution during the transient is required. The motivation for this work arose from the fact that the computer code package for performing this task has not yet been transferred to Korea. In this study, a set of models for analyzing the transient poison concentration induced by the high-pressure poison injection jets activated upon reactor trip in a CANDU-6 moderator tank has been developed and used to generate the poison concentration distribution of the poison curtains injected into the vacant region between the pressure tube banks. The poison injection rate through the jet holes drilled in the nozzle pipes is obtained with ALITRIG, a 1-D transient hydrodynamic code, and this injection rate provides the inlet boundary condition to a 3-D CFD model of the moderator tank based on CFX4.3, a commercial CFD code developed by AEA Technology, to simulate the formation of the poison jet curtain inside the moderator tank. For validation, a simulation of a generic CANDU-6 SDS2 design poison jet growth experiment was performed to evaluate the model's capability against experiment. As no concentration field was measured and only the growth of the poison jet height was captured by high-speed camera, the validation was limited accordingly. The results showed that, if one assumes the jet front corresponds to 200 ppm of poison, the model succeeds to

  4. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

In the philosophy of science, interest in computational models and simulations has increased greatly during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety...

  5. Development and validation of the 3-D CFD model for CANDU-6 moderator temperature predictions

    International Nuclear Information System (INIS)

    Yoon, Churl; Rhee, Bo Wook; Min, Byung Joo

    2003-03-01

A computational fluid dynamics model for predicting the moderator circulation inside the CANada Deuterium Uranium (CANDU) reactor vessel has been developed to estimate the local subcooling of the moderator in the vicinity of the Calandria tubes. The buoyancy effect induced by internal heating is accounted for by the Boussinesq approximation. The standard κ-ε turbulence model with logarithmic wall treatment is applied to predict the turbulent jet flows from the inlet nozzles. The matrix of Calandria tubes in the core region is simplified to a porous medium, in which anisotropic hydraulic impedance is modelled using an empirical correlation for the frictional pressure loss. The governing equations are solved by CFX-4.4, a commercial CFD code developed by AEA Technology. The CFD model has been successfully verified and validated against experimental data obtained at Stern Laboratories Inc. (SLI) in Hamilton, Ontario

6. Validation of the HDM models for crack initiation and development, rutting and roughness of the pavement

    Directory of Open Access Journals (Sweden)

    Ognjenović Slobodan

    2017-01-01

Full Text Available Worldwide practice recommends validating the HDM models against other software that can be used to compare forecasting results. The program package MATLAB is used in this case, as it enables modelling of all the HDM models. A statistical validation of the HDM forecasts of pavement condition against on-field measurement results was also performed. This paper presents the results of the validation of the calibration coefficients of the deterioration models in HDM-4 on the Macedonian highways.

  7. Validation of ASTEC core degradation and containment models

    International Nuclear Information System (INIS)

    Kruse, Philipp; Brähler, Thimo; Koch, Marco K.

    2014-01-01

Ruhr-Universitaet Bochum performed validation of in-vessel and containment models of the integral code ASTEC V2, jointly developed by IRSN (France) and GRS (Germany), within a German-funded project. In this paper, selected results of this validation are presented. In the in-vessel part, the main point of interest was validating the code's capability concerning cladding oxidation and hydrogen generation. The ASTEC calculations of the QUENCH experiments QUENCH-03 and QUENCH-11 show satisfactory results, despite some necessary adjustments in the input deck. Furthermore, the oxidation models based on the Cathcart–Pawel and Urbanic–Heidrick correlations are not suitable for higher temperatures, while the ASTEC model BEST-FIT, based on the Prater–Courtright approach at high temperature, gives sufficiently reliable results. One part of the containment model validation was the assessment of three hydrogen combustion models of ASTEC against the experiment BMC Ix9. The simulation results of these models differ from each other, so the quality of each simulation depends on the characteristics of the respective model. Accordingly, the CPA FRONT model, which requires the simplest input parameters, provides the best agreement with the experimental data

  8. Development of regional scale soil erosion and sediment transport model; its calibration and validations

    International Nuclear Information System (INIS)

    Rehman, M.H.; Akhtar, M.N.

    2005-01-01

Although many soil erosion models have been developed over the past five decades, including empirically based models such as USLE and RUSLE and process-based soil erosion and sediment transport models such as WEPP, EUROSEM and SHETRAN, the application of these models at regional scales has remained questionable. To address this problem, a process-based soil erosion and sediment transport model has been developed to estimate soil erosion, deposition, transport and sediment yield at the regional scale. The soil erosion processes are modelled as detachment of soil by raindrop impact over the entire grid and detachment of soil by overland flow only within the equivalent channels, whereas sediment is routed to the downstream grid cell considering the transport capacity of the flow. The loss of heterogeneity in the spatial information of the topography due to the slope-averaging effect is reproduced by adopting a fractal analysis approach. The model has been calibrated for the Nan river basin (N.13A) and validated on the Yom river basin (Y.6) and the Nam Mae Klang river basin (P.24A) of Thailand; simulated results show good agreement with the observed sediment discharge data. The developed model, with a few new components, can also be applied to predict the sediment discharges of the river Indus. (author)
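The routing idea in this record — local detachment plus upstream supply, limited by the flow's transport capacity, with the excess deposited — can be sketched for a 1-D cascade of cells. The capacity rule and all numbers below are illustrative stand-ins, not the authors' formulation.

```python
# Hedged sketch of transport-capacity-limited sediment routing along a 1-D
# cascade of grid cells. The min(supply, capacity) rule and the numbers are
# illustrative, not the model's actual equations.

def route_sediment(detachment, capacity):
    """detachment[i]: soil detached in cell i; capacity[i]: transport
    capacity of the flow leaving cell i (same units, e.g. t/ha).
    Returns (sediment outflux per cell, deposition per cell)."""
    inflow, outflux, deposition = 0.0, [], []
    for det, cap in zip(detachment, capacity):
        supply = inflow + det          # upstream load plus local detachment
        out = min(supply, cap)         # flow carries at most its capacity
        deposition.append(supply - out)  # excess settles in this cell
        outflux.append(out)
        inflow = out                   # routed to the next downstream cell
    return outflux, deposition

out, dep = route_sediment([2.0, 1.0, 0.5], [5.0, 2.0, 3.0])
```

The sediment yield at the basin outlet is then simply the outflux of the last cell, which is the quantity compared against observed sediment discharge.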

  9. Development and Initial Validation of the Five-Factor Model Adolescent Personality Questionnaire (FFM-APQ).

    Science.gov (United States)

    Rogers, Mary E; Glendon, A Ian

    2018-01-01

    This research reports on the 4-phase development of the 25-item Five-Factor Model Adolescent Personality Questionnaire (FFM-APQ). The purpose was to develop and determine initial evidence for validity of a brief adolescent personality inventory using a vocabulary that could be understood by adolescents up to 18 years old. Phase 1 (N = 48) consisted of item generation and expert (N = 5) review of items; Phase 2 (N = 179) involved item analyses; in Phase 3 (N = 496) exploratory factor analysis assessed the underlying structure; in Phase 4 (N = 405) confirmatory factor analyses resulted in a 25-item inventory with 5 subscales.

  10. Utilizing Chamber Data for Developing and Validating Climate Change Models

    Science.gov (United States)

    Monje, Oscar

    2012-01-01

Controlled environment chambers (e.g. growth chambers, SPAR chambers, or open-top chambers) are useful for measuring plant ecosystem responses to climatic variables and CO2 that affect plant water relations. However, data from chambers were found to overestimate the responses of C fluxes to CO2 enrichment. Chamber data may be confounded by numerous artifacts (e.g. sidelighting, edge effects, increased temperature and VPD, etc.) and this limits what can be measured accurately. Chambers can be used to measure canopy-level energy balance under controlled conditions, and plant transpiration responses to CO2 concentration can be elucidated. However, these measurements cannot be used directly in model development or validation. The response of stomatal conductance to CO2 will be the same as in the field, but the measured response must be recalculated in such a manner as to account for differences in aerodynamic conductance, temperature and VPD between the chamber and the field.

  11. Validation of nuclear models used in space radiation shielding applications

    International Nuclear Information System (INIS)

    Norman, Ryan B.; Blattnig, Steve R.

    2013-01-01

    A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.
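The interval treatment of experimental uncertainty described above can be sketched as follows. The exact metric definitions used by the program may differ; here a prediction inside the measurement interval incurs zero error, and outside it the error is the distance to the nearest bound, summarized cumulatively and by the median.

```python
# Hedged sketch of an interval-style validation metric: experimental
# uncertainty is treated as an interval [meas - unc, meas + unc]; a
# prediction inside the interval incurs zero error, otherwise the error is
# the distance to the nearest bound. The study's exact metrics may differ.

import statistics

def interval_error(pred, meas, unc):
    lo, hi = meas - unc, meas + unc
    if lo <= pred <= hi:
        return 0.0
    return (lo - pred) if pred < lo else (pred - hi)

def summarize(preds, measurements, uncertainties):
    errs = [interval_error(p, m, u)
            for p, m, u in zip(preds, measurements, uncertainties)]
    # cumulative metric (overall accuracy) and median metric (typical case)
    return sum(errs), statistics.median(errs)

cum, med = summarize([10.0, 8.0, 15.0], [9.0, 9.0, 12.0], [1.5, 0.5, 1.0])
```

Because zero error is achieved anywhere inside the interval, this style of metric is well suited to data dominated by epistemic uncertainty, as the abstract notes.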

  12. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  13. Construct validity of the Moral Development Scale for Professionals (MDSP).

    Science.gov (United States)

    Söderhamn, Olle; Bjørnestad, John Olav; Skisland, Anne; Cliffordson, Christina

    2011-01-01

    The aim of this study was to investigate the construct validity of the Moral Development Scale for Professionals (MDSP) using structural equation modeling. The instrument is a 12-item self-report instrument, developed in the Scandinavian cultural context and based on Kohlberg's theory. A hypothesized simplex structure model underlying the MDSP was tested through structural equation modeling. Validity was also tested as the proportion of respondents older than 20 years that reached the highest moral level, which according to the theory should be small. A convenience sample of 339 nursing students with a mean age of 25.3 years participated. Results confirmed the simplex model structure, indicating that MDSP reflects a moral construct empirically organized from low to high. A minority of respondents >20 years of age (13.5%) scored more than 80% on the highest moral level. The findings support the construct validity of the MDSP and the stages and levels in Kohlberg's theory.

  14. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  15. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  16. Development and validation of rear impact computer simulation model of an adult manual transit wheelchair with a seated occupant.

    Science.gov (United States)

    Salipur, Zdravko; Bertocci, Gina

    2010-01-01

It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus occupant protection for the wheelchair occupant. Thus far only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economic and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions.

  17. Development of a new model to predict indoor daylighting: Integration in CODYRUN software and validation

    Energy Technology Data Exchange (ETDEWEB)

    Fakra, A.H., E-mail: fakra@univ-reunion.f [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France); Miranville, F.; Boyer, H.; Guichard, S. [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France)

    2011-07-15

Research highlights: → This study presents a new model capable of simulating indoor daylighting. → The model was introduced into the research software CODYRUN. → The code was validated against many test cases. -- Abstract: Many models exist in the scientific literature for determining indoor daylighting values. They are classified into three categories: numerical, simplified and empirical models. Nevertheless, none of these categories is convenient for every application. Indeed, numerical models require long calculation times; the conditions of use of simplified models are limited; and experimental models require not only substantial financial resources but also perfect control of the experimental devices (e.g. scale models) and of the climatic characteristics of the location (e.g. in situ experiments). In this article, a new model based on a combination of multiple simplified models is established. The objective is to improve this category of model. The originality of our paper relies on the coupling of several simplified models of indoor daylighting calculation. The accuracy of the simulation code, introduced into the CODYRUN software to correctly simulate indoor illuminance, is then verified. The software is a numerical building simulation code developed in the Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT) at the University of Reunion. Initially dedicated to thermal, airflow and hydrous phenomena in buildings, the software has been extended with the calculation of indoor daylighting. New models and algorithms - which rely on a semi-detailed approach - are presented in this paper. To validate the accuracy of the integrated models, many test cases have been considered: analytical tests, inter-software comparisons and experimental comparisons. In order to prove the accuracy of the new model - which can properly simulate the illuminance - a

  18. Continuous validation of ASTEC containment models and regression testing

    International Nuclear Information System (INIS)

    Nowack, Holger; Reinke, Nils; Sonnenkalb, Martin

    2014-01-01

The focus of the ASTEC (Accident Source Term Evaluation Code) development at GRS is primarily on the containment module CPA (Containment Part of ASTEC), whose modelling is to a large extent based on the GRS containment code COCOSYS (COntainment COde SYStem). Validation is usually understood as the approval of the modelling capabilities through calculations of appropriate experiments by external users other than the code developers. During the development process of ASTEC CPA, bugs and unintended side effects may occur, which lead to changes in the results of the initially conducted validation. Due to the involvement of a considerable number of developers in the coding of ASTEC modules, validation of the code alone, even if executed repeatedly, is not sufficient. Therefore, a regression testing procedure has been implemented to ensure that the initially obtained validation results remain valid in succeeding code versions. Within the regression testing procedure, calculations of experiments and plant sequences are performed with the same input deck but with two different code versions. For every test case, the up-to-date code version is compared to the preceding one on the basis of physical parameters deemed characteristic of the test case under consideration. In the case of post-calculations of experiments, a comparison with experimental data is also carried out. Three validation cases from the regression testing procedure are presented in this paper. The very good post-calculation of the HDR E11.1 experiment shows the high-quality modelling of thermal-hydraulics in ASTEC CPA. Aerosol behaviour is validated against the BMC VANAM M3 experiment, and the results also show very good agreement with experimental data. Finally, iodine behaviour is checked in the validation test case of the THAI IOD-11 experiment.
Within this test-case, the comparison of the ASTEC versions V2.0r1 and V2.0r2 shows how an error was detected by the regression testing
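The regression-testing idea in this record — run the same input deck with two code versions and flag characteristic parameters that drift — can be sketched generically. The function and parameter names below are illustrative, not ASTEC's actual interface, and the tolerance is an assumed value.

```python
# Hedged sketch of a regression-testing comparison: results from two code
# versions, run on identical input decks, are compared parameter by
# parameter. Names and the 2% tolerance are illustrative assumptions.

def compare_versions(results_old, results_new, rel_tol=0.02):
    """results_*: {characteristic parameter: value} from identical input decks.
    Returns the parameters whose relative deviation exceeds rel_tol."""
    flagged = {}
    for name, old in results_old.items():
        new = results_new[name]
        deviation = abs(new - old) / abs(old) if old else abs(new)
        if deviation > rel_tol:
            flagged[name] = deviation  # candidate bug or unintended side effect
    return flagged

flagged = compare_versions(
    {"peak_pressure_bar": 2.50, "h2_mass_kg": 120.0},
    {"peak_pressure_bar": 2.52, "h2_mass_kg": 131.0},
)
```

A non-empty result is exactly the kind of signal described in the abstract: the deviation itself does not say which version is wrong, only that the initially obtained validation must be re-examined.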

  19. Development and validation of a computational model of the knee joint for the evaluation of surgical treatments for osteoarthritis.

    Science.gov (United States)

    Mootanah, R; Imhauser, C W; Reisse, F; Carpanen, D; Walker, R W; Koff, M F; Lenhoff, M W; Rozbruch, S R; Fragomen, A T; Dewan, Z; Kirane, Y M; Cheah, K; Dowell, J K; Hillstrom, H J

    2014-01-01

    A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full-extension to 65°-flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated by measured intra-articular force and pressure measurements. Percent full scale error between FE-predicted and in vitro-measured values in the medial and lateral compartments were 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressure and forces for different loading conditions and could be further developed for subject-specific surgical planning.
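The percent full-scale error used above to compare FE predictions with in vitro measurements is a simple normalization; the sketch below assumes the error is taken relative to a full-scale (maximum measurable) sensor value, which is the usual convention but an assumption here.

```python
# Hedged sketch of percent full-scale error: the absolute prediction error
# normalized by an assumed full-scale (maximum measurable) sensor value.

def percent_full_scale_error(predicted, measured, full_scale):
    return abs(predicted - measured) / full_scale * 100.0

# Illustrative numbers: predicted vs measured peak contact force,
# normalized by a hypothetical 1000 N full-scale sensor range.
err = percent_full_scale_error(predicted=940.0, measured=900.0, full_scale=1000.0)
```

Normalizing by the full-scale value rather than the measured value keeps the metric stable when the measured quantity itself is small.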

  20. Development and validation of a combined phased acoustical radiosity and image source model for predicting sound fields in rooms

    DEFF Research Database (Denmark)

    Marbjerg, Gerd Høy; Brunskog, Jonas; Jeong, Cheol-Ho

    2015-01-01

    A model, combining acoustical radiosity and the image source method, including phase shifts on reflection, has been developed. The model is denoted Phased Acoustical Radiosity and Image Source Method (PARISM), and it has been developed in order to be able to model both specular and diffuse...... radiosity by regarding the model as being stochastic. Three methods of implementation are proposed and investigated, and finally, recommendations are made for their use. Validation of the image source method is done by comparison with finite element simulations of a rectangular room with a porous absorber...

  1. Development and validation of clinical prediction models for mortality, functional outcome and cognitive impairment after stroke: a study protocol.

    Science.gov (United States)

    Fahey, Marion; Rudd, Anthony; Béjot, Yannick; Wolfe, Charles; Douiri, Abdel

    2017-08-18

Stroke is a leading cause of adult disability and death worldwide. The neurological impairments associated with stroke prevent patients from performing basic daily activities and have enormous impact on families and caregivers. Practical and accurate tools to assist in predicting outcome after stroke at patient level can provide significant aid for patient management. Furthermore, prediction models of this kind can be useful for clinical research, health economics, policymaking and clinical decision support. 2869 patients with first-ever stroke from South London Stroke Register (SLSR) (1995-2004) will be included in the development cohort. We will use information captured after baseline to construct multilevel models and a Cox proportional hazard model to predict cognitive impairment, functional outcome and mortality up to 5 years after stroke. Repeated random subsampling validation (Monte Carlo cross-validation) will be evaluated in model development. Data from participants recruited to the stroke register (2005-2014) will be used for temporal validation of the models. Data from participants recruited to the Dijon Stroke Register (1985-2015) will be used for external validation. Discrimination, calibration and clinical utility of the models will be presented. Patients, or for patients who cannot consent their relatives, gave written informed consent to participate in stroke-related studies within the SLSR. The SLSR design was approved by the ethics committees of Guy's and St Thomas' NHS Foundation Trust, Kings College Hospital, Queens Square and Westminster Hospitals (London). The Dijon Stroke Registry was approved by the Comité National des Registres and the InVS and has authorisation of the Commission Nationale de l'Informatique et des Libertés.
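The repeated random subsampling validation (Monte Carlo cross-validation) named in this protocol can be sketched generically: repeatedly split the cohort at random into development and validation subsets, refit, and average the validation performance. The toy "model" and scoring function below are stand-ins, not the protocol's multilevel or Cox models.

```python
# Hedged sketch of repeated random subsampling (Monte Carlo cross-validation).
# The fit/score functions here are toy stand-ins for illustration only.

import random

def monte_carlo_cv(data, fit, score, n_splits=100, test_frac=0.3, seed=0):
    """Average validation score over n_splits random train/test splits."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_splits):
        shuffled = data[:]
        rng.shuffle(shuffled)
        n_test = int(len(shuffled) * test_frac)
        test, train = shuffled[:n_test], shuffled[n_test:]
        model = fit(train)              # refit on the development subset
        scores.append(score(model, test))  # evaluate on the held-out subset
    return sum(scores) / len(scores)

# Toy example: the "model" is the training-set mean; the score is negative MSE.
fit = lambda train: sum(train) / len(train)
score = lambda m, test: -sum((x - m) ** 2 for x in test) / len(test)
avg = monte_carlo_cv([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], fit, score)
```

Unlike k-fold cross-validation, the splits here are independent draws, so the number of repetitions can be chosen freely and observations may appear in several validation sets.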

  2. Development and validation of a thermodynamic model for the performance analysis of a gamma Stirling engine prototype

    International Nuclear Information System (INIS)

    Araoz, Joseph A.; Cardozo, Evelyn; Salomon, Marianne; Alejo, Lucio; Fransson, Torsten H.

    2015-01-01

This work presents the development and validation of a numerical model that represents the performance of a gamma Stirling engine prototype. The model follows a modular approach considering ideal adiabatic working spaces; limited internal and external heat transfer through the heat exchangers; and mechanical and thermal losses during the cycle. In addition, it includes the calculation of the mechanical efficiency, taking into account the crank mechanism effectiveness and the forced work during the cycle. Consequently, the model aims to predict the work that can be effectively taken from the shaft. The model was compared with experimental data obtained in an experimental rig built for the engine prototype. The results showed an acceptable degree of accuracy when compared with the experimental data, with errors ranging from ±1% to ±8% for the temperature on the heater side, less than ±1% for the cooler temperatures, and ±1% to ±8% for the brake power calculations. Therefore, the model proved adequate for studying the prototype's performance. In addition, the results of the simulation reflected the limited performance obtained during the prototype experiments, and a first analysis attributed this to the forced work during the cycle. The implemented model is the basis for a subsequent parametric analysis that will complement the results presented. - Highlights: • A numerical model for a Stirling engine was developed. • A mechanical efficiency analysis was included in the model. • The model was validated with experimental data from a novel prototype. • The model results permit a deeper insight into the engine operation

  3. Can preventable adverse events be predicted among hospitalized older patients? The development and validation of a predictive model.

    NARCIS (Netherlands)

    Steeg, L. van de; Langelaan, M.; Wagner, C.

    2014-01-01

    Objective: To develop and validate a predictive model for preventable adverse events (AEs) in hospitalized older patients, using clinically important risk factors that are readily available on admission. Design: Data from two retrospective patient record review studies on AEs were used. Risk factors

  4. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  5. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹

  6. Development and validation of models for bubble coalescence and breakup. Final report

    International Nuclear Information System (INIS)

    Liao, Y.; Lucas, D.

    2013-02-01

    A new generalized model for bubble coalescence and breakup has been developed. It is based on physical considerations and takes into account various mechanisms that can lead to bubble coalescence and breakup. First, in a detailed literature review, the available models were compiled and analyzed. It turned out that many of them show contradictory behaviour. None of these models allows the prediction of the evolution of bubble size distributions along a pipe flow for a wide range of combinations of gas and liquid flow rates. The new model has been extensively studied in a simplified Test-Solver. Although this does not cover all details of a developing flow along the pipe, it allows, in contrast to a CFD code, a large number of variational calculations to be conducted to investigate the influence of individual sizes and models. Coalescence and breakup cannot be considered separately from other phenomena and the models that reflect those phenomena. There are close interactions with the turbulence of the liquid phase and the momentum exchange between phases. Since the dissipation rate of turbulent kinetic energy is a direct input parameter for the new model, the turbulence modelling has been studied very carefully. To validate the model, a special experimental series for air-water flows was used, conducted at the TOPFLOW facility in an 8-meter long DN200 pipe. The data are characterized by high quality and were produced within the TOPFLOW-II project. The test series aims to provide a basis for the work presented here. Prediction of the evolution of the bubble size distribution along the pipe was improved significantly in comparison to the previous standard models for bubble coalescence and breakup implemented in CFX, although some quantitative discrepancies remain. The full model equations as well as an implementation as "User-FORTRAN" in CFX are available and can be used for further work on the simulation of poly-disperse bubbly flows.

  7. Modeling the Relationship between Safety Climate and Safety Performance in a Developing Construction Industry: A Cross-Cultural Validation Study.

    Science.gov (United States)

    Zahoor, Hafiz; Chan, Albert P C; Utama, Wahyudi P; Gao, Ran; Zafar, Irfan

    2017-03-28

    This study attempts to validate a safety performance (SP) measurement model in the cross-cultural setting of a developing country. In addition, it highlights the variations in investigating the relationship between safety climate (SC) factors and SP indicators. The data were collected from forty under-construction multi-storey building projects in Pakistan. Based on the results of exploratory factor analysis, an SP measurement model was hypothesized. It was tested and validated by conducting confirmatory factor analysis on calibration and validation sub-samples respectively. The study confirmed the significant positive impact of SC on safety compliance and safety participation, and a negative impact on the number of self-reported accidents/injuries. However, the number of near-misses could not be retained in the final SP model because it attained a lower standardized path coefficient value. Moreover, instead of safety participation, safety compliance established a stronger impact on SP. The study uncovered safety enforcement and promotion as a novel SC factor, whereas safety rules and work practices was identified as the most neglected factor. The study contributed to the body of knowledge by unveiling the deviations in existing dimensions of SC and SP. The refined model is expected to concisely measure SP in the Pakistani construction industry; however, caution must be exercised while generalizing the study results to other developing countries.

  8. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  9. Development and validation of a prediction model for long-term sickness absence based on occupational health survey variables

    NARCIS (Netherlands)

    Roelen, Corne; Thorsen, Sannie; Heymans, Martijn; Twisk, Jos; Bultmann, Ute; Bjorner, Jakob

    2018-01-01

    Purpose: The purpose of this study is to develop and validate a prediction model for identifying employees at increased risk of long-term sickness absence (LTSA), by using variables commonly measured in occupational health surveys. Materials and methods: Based on the literature, 15 predictor

  10. Comorbidity predicts poor prognosis in nasopharyngeal carcinoma: Development and validation of a predictive score model

    International Nuclear Information System (INIS)

    Guo, Rui; Chen, Xiao-Zhong; Chen, Lei; Jiang, Feng; Tang, Ling-Long; Mao, Yan-Ping; Zhou, Guan-Qun; Li, Wen-Fei; Liu, Li-Zhi; Tian, Li; Lin, Ai-Hua; Ma, Jun

    2015-01-01

    Background and purpose: The impact of comorbidity on prognosis in nasopharyngeal carcinoma (NPC) is poorly characterized. Material and methods: Using the Adult Comorbidity Evaluation-27 (ACE-27) system, we assessed the prognostic value of comorbidity and developed, validated and confirmed a predictive score model in a training set (n = 658), internal validation set (n = 658) and independent set (n = 652) using area under the receiver operating characteristic curve analysis. Results: Comorbidity was present in 40.4% of 1,968 patients (mild, 30.1%; moderate, 9.1%; severe, 1.2%). Compared to an ACE-27 score ⩽1, patients with an ACE-27 score >1 in the training set had shorter overall survival (OS) and disease-free survival (DFS) (both P < 0.001); similar results were obtained in the other sets (P < 0.05). In multivariate analysis, the ACE-27 score was a significant independent prognostic factor for OS and DFS. The combined risk score model including ACE-27 had superior prognostic value to TNM stage alone in the internal validation set (0.70 vs. 0.66; P = 0.02), independent set (0.73 vs. 0.67; P = 0.002) and all patients (0.71 vs. 0.67; P < 0.001). Conclusions: Comorbidity significantly affects prognosis, especially in stages II and III, and should be incorporated into the TNM staging system for NPC. Assessment of comorbidity may improve outcome prediction and help tailor individualized treatment

  11. Refinement, Validation and Benchmarking of a Model for E-Government Service Quality

    Science.gov (United States)

    Magoutas, Babis; Mentzas, Gregoris

    This paper presents the refinement and validation of a model for Quality of e-Government Services (QeGS). We built upon our previous work, where a conceptualized model was identified, and focused on the confirmatory phase of the model development process in order to arrive at a valid and reliable QeGS model. The validated model, which benchmarked very positively against similar models found in the literature, can be used to measure QeGS in a reliable and valid manner. This will form the basis for a continuous quality improvement process, unleashing the full potential of e-government services for both citizens and public administrations.

  12. Development, external validation and clinical usefulness of a practical prediction model for radiation-induced dysphagia in lung cancer patients

    International Nuclear Information System (INIS)

    Dehing-Oberije, Cary; De Ruysscher, Dirk; Petit, Steven; Van Meerbeeck, Jan; Vandecasteele, Katrien; De Neve, Wilfried; Dingemans, Anne Marie C.; El Naqa, Issam; Deasy, Joseph; Bradley, Jeff; Huang, Ellen; Lambin, Philippe

    2010-01-01

    Introduction: Acute dysphagia is a distressing dose-limiting toxicity occurring frequently during concurrent chemo-radiation or high-dose radiotherapy for lung cancer. It can lead to treatment interruptions and thus jeopardize survival. Although a number of predictive factors have been identified, it is still not clear how these could offer assistance for treatment decision making in daily clinical practice. Therefore, we have developed and validated a nomogram to predict this side-effect. In addition, clinical usefulness was assessed by comparing model predictions to physicians' predictions. Materials and methods: Clinical data from 469 inoperable lung cancer patients, treated with curative intent, were collected prospectively. A prediction model for acute radiation-induced dysphagia was developed. Model performance was evaluated by the c-statistic and assessed using bootstrapping as well as two external datasets. In addition, a prospective study was conducted comparing model to physicians' predictions in 138 patients. Results: The final multivariate model consisted of age, gender, WHO performance status, mean esophageal dose (MED), maximum esophageal dose (MAXED) and overall treatment time (OTT). The c-statistic, assessed by bootstrapping, was 0.77. External validation yielded an AUC of 0.94 on the Ghent data and 0.77 on the Washington University St. Louis data for dysphagia ≥ grade 3. Comparing model predictions to the physicians' predictions resulted in an AUC of 0.75 versus 0.53, respectively. Conclusions: The proposed model performed well was successfully validated and demonstrated the ability to predict acute severe dysphagia remarkably better than the physicians. Therefore, this model could be used in clinical practice to identify patients at high or low risk.
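    The c-statistic reported above (0.77 under bootstrapping) is the probability that the model ranks a patient who developed severe dysphagia above one who did not. A minimal Python sketch of that evaluation, with purely illustrative data and function names (this is not the authors' code):

    ```python
    import random

    def c_statistic(scores, labels):
        """Fraction of (event, non-event) pairs the model ranks correctly;
        ties count as half. Equivalent to the AUC for a binary outcome."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        pairs = len(pos) * len(neg)
        concordant = sum(1 for p in pos for n in neg if p > n)
        ties = sum(1 for p in pos for n in neg if p == n)
        return (concordant + 0.5 * ties) / pairs

    def bootstrap_c(scores, labels, n_boot=200, seed=1):
        """Internal validation: resample patients with replacement and
        recompute the c-statistic on each bootstrap replicate."""
        rng = random.Random(seed)
        n = len(scores)
        stats = []
        while len(stats) < n_boot:
            idx = [rng.randrange(n) for _ in range(n)]
            ys = [labels[i] for i in idx]
            if 0 < sum(ys) < n:  # need both outcomes in the resample
                stats.append(c_statistic([scores[i] for i in idx], ys))
        stats.sort()
        return stats[n_boot // 2]  # median over bootstrap replicates
    ```

    External validation would apply `c_statistic` unchanged to an independent cohort, which is how the Ghent and St. Louis AUCs above would be obtained.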

  13. Attempted development and cross-validation of predictive models of individual-level and organizational-level turnover of nuclear power operators

    International Nuclear Information System (INIS)

    Vasa-Sideris, S.J.

    1989-01-01

    Nuclear power accounts for about 20% of the electric power generated in the U.S. by 107 nuclear plants, which employ over 8,700 operators. Operator turnover is significant to utilities from an economic point of view, since it costs almost three hundred thousand dollars to train and qualify one operator, and because turnover affects plant operability and therefore plant safety. The study purpose was to develop and cross-validate individual-level and organizational-level models of turnover of nuclear power plant operators. Data were obtained by questionnaires and from published data for 1983 and 1984 on a number of individual, organizational, and environmental predictors. Plants had been in operation for two or more years. Questionnaires were returned by 29 out of 50 plants on over 1,600 operators. The objectives were to examine the reliability of the turnover criterion, to determine the classification accuracy of the multivariate predictive models and of categories of predictors (individual, organizational, and environmental), and to determine whether a homology existed between the individual-level and organizational-level models. The method was to examine the shrinkage that occurred between foldback design (in which the predictive models were reapplied to the data used to develop them) and cross-validation. The results did not support the hypotheses. Turnover data were accurate but not stable between the two years. No significant differences were detected between the low and high turnover groups at the organizational or individual level in cross-validation. Lack of stability in the criterion, restriction of range, and small sample size at the organizational level were serious limitations of this study. The results did, however, support the methods: considerable shrinkage occurred between foldback and cross-validation of the models

  14. Development and validation of sodium fire analysis code ASSCOPS

    International Nuclear Information System (INIS)

    Ohno, Shuji

    2001-01-01

    Version 2.1 of the ASSCOPS sodium fire analysis code was developed to evaluate the thermal consequences of a sodium leak and consequent fire in LMFBRs. This report describes the computational models and the validation studies performed with the code. ASSCOPS calculates sodium droplet and pool fires and the consequent heat and mass transfer behavior. Analyses of sodium pool and spray fire experiments confirmed that the code and the parameters used in the validation studies give valid results on the thermal consequences of sodium leaks and fires. (author)

  15. Intrapreneurial competencies: development and validation of a measurement scale

    Directory of Open Access Journals (Sweden)

    Tomás Vargas-Halabí

    2017-07-01

    Purpose - Few models have attempted to explain intrapreneurial behavior from the perspective of competencies. Therefore, the purpose of this paper is to contribute along this line by developing and validating a scale to measure intrapreneurial competencies for a Costa Rican organizational context. Design/methodology/approach - A three-stage process was followed. The first stage comprised literature review, expert judgment, cognitive interviews, and back-translation. In the second stage, the questionnaire was administered to a sample of 543 university professionals who worked mainly in private organizations in Costa Rica. In the third stage, the proposed scale's psychometric properties were evaluated: exploratory factor analysis performed with SPSS 19; confirmatory factor analysis by means of structural equation modeling using EQS 6.2; and finally a linear regression model, performed with SPSS 19, to obtain evidence of external criterion-related validity. Findings - This study provides evidence of five sub-dimensions of employee attributes, i.e., "opportunity promoter", "proactivity", "flexibility", "drive", and "risk taking", that constitute a higher-level construct called intrapreneurial competencies. The scale provided evidence of convergent, discriminant, and criterion-related validity - the latter using an employee innovative behavior scale. Originality/value - The model offers a first step towards studies that aim at developing a robust model of intrapreneurial competencies. The potential predictive capacity of an instrument of this nature would be useful for the business sector, particularly as a diagnostic instrument to strengthen processes of staff development in areas that promote innovation and the creation of new businesses for the company.

  16. Development and External Validation of Prognostic Model for 2-Year Survival of Non-Small-Cell Lung Cancer Patients Treated With Chemoradiotherapy

    International Nuclear Information System (INIS)

    Dehing-Oberije, Cary; Yu Shipeng; De Ruysscher, Dirk; Meersschout, Sabine; Van Beek, Karen; Lievens, Yolande; Van Meerbeeck, Jan; De Neve, Wilfried; Rao, Bharat Ph.D.; Weide, Hiska van der; Lambin, Philippe

    2009-01-01

    Purpose: Radiotherapy, combined with chemotherapy, is the treatment of choice for a large group of non-small-cell lung cancer (NSCLC) patients. Recent developments in the treatment of these patients have led to improved survival. However, the clinical TNM stage is highly inaccurate for the prediction of survival, and alternatives are lacking. The objective of this study was to develop and validate a prediction model for survival of NSCLC patients, treated with chemoradiotherapy. Patients and Methods: The clinical data from 377 consecutive inoperable NSCLC patients, Stage I-IIIB, treated radically with chemoradiotherapy were collected. A prognostic model for 2-year survival was developed, using 2-norm support vector machines. The performance of the model was expressed as the area under the curve of the receiver operating characteristic and assessed using leave-one-out cross-validation, as well as two external data sets. Results: The final multivariate model consisted of gender, World Health Organization performance status, forced expiratory volume in 1 s, number of positive lymph node stations, and gross tumor volume. The area under the curve, assessed by leave-one-out cross-validation, was 0.74, and application of the model to the external data sets yielded an area under the curve of 0.75 and 0.76. A high- and low-risk group could be clearly identified using a risk score based on the model. Conclusion: The multivariate model performed very well and was able to accurately predict the 2-year survival of NSCLC patients treated with chemoradiotherapy. The model could support clinicians in the treatment decision-making process.
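    Leave-one-out cross-validation, used above to assess the prognostic model, holds out one patient at a time and retrains on the rest. A schematic Python sketch, with a toy nearest-centroid scorer standing in for the paper's 2-norm support vector machine (all names and data here are assumptions for illustration):

    ```python
    def loocv_accuracy(features, labels, fit, predict):
        """Leave-one-out cross-validation: train on n-1 samples, test on
        the single held-out sample, and repeat for every sample."""
        n = len(features)
        correct = 0
        for i in range(n):
            train_x = features[:i] + features[i + 1:]
            train_y = labels[:i] + labels[i + 1:]
            model = fit(train_x, train_y)
            if predict(model, features[i]) == labels[i]:
                correct += 1
        return correct / n

    def fit_centroids(xs, ys):
        """Toy stand-in for the paper's SVM: per-class mean of a 1-D feature."""
        groups = {}
        for x, y in zip(xs, ys):
            groups.setdefault(y, []).append(x)
        return {y: sum(v) / len(v) for y, v in groups.items()}

    def predict_centroid(model, x):
        """Assign the class whose centroid is nearest."""
        return min(model, key=lambda y: abs(x - model[y]))
    ```

    With n = 377 patients, LOOCV fits the model 377 times; the AUC over the held-out predictions is the 0.74 figure quoted in the abstract.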

  17. Development and validation of a prediction model for measurement variability of lung nodule volumetry in patients with pulmonary metastases.

    Science.gov (United States)

    Hwang, Eui Jin; Goo, Jin Mo; Kim, Jihye; Park, Sang Joon; Ahn, Soyeon; Park, Chang Min; Shin, Yeong-Gil

    2017-08-01

    To develop a prediction model for the variability range of lung nodule volumetry and validate the model in detecting nodule growth. For model development, 50 patients with metastatic nodules were prospectively included. Two consecutive CT scans were performed to assess volumetry for 1,586 nodules. Nodule volume, surface voxel proportion (SVP), attachment proportion (AP) and absolute percentage error (APE) were calculated for each nodule, and quantile regression analyses were performed to model the 95th percentile of APE. For validation, 41 patients who underwent metastasectomy were included. After volumetry of resected nodules, sensitivity and specificity for the diagnosis of metastatic nodules were compared between two different thresholds of nodule growth determination: a uniform 25% volume change threshold and an individualized threshold calculated from the model (estimated 95th percentile APE). SVP and AP were included in the final model: estimated 95th percentile APE = 37.82 · SVP + 48.60 · AP - 10.87. In the validation session, the individualized threshold showed significantly higher sensitivity for the diagnosis of metastatic nodules than the uniform 25% threshold (75.0% vs. 66.0%, P = 0.004). Conclusion: The estimated 95th percentile APE as an individualized threshold of nodule growth showed greater sensitivity in diagnosing metastatic nodules than a global 25% threshold. • The 95th percentile APE of a particular nodule can be predicted. • The estimated 95th percentile APE can be utilized as an individualized threshold. • More sensitive diagnosis of metastasis can be made with an individualized threshold. • Tailored nodule management can be provided during nodule growth follow-up.
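    The published regression can be applied directly to obtain a per-nodule growth threshold. A small Python sketch using the coefficients quoted in the abstract (the function names are ours; the inputs are illustrative):

    ```python
    def estimated_95th_percentile_ape(svp, ap):
        """Predicted variability range (in %) of a nodule's volumetry, from
        surface voxel proportion (SVP) and attachment proportion (AP),
        per the regression reported in the abstract."""
        return 37.82 * svp + 48.60 * ap - 10.87

    def nodule_growth(volume_change_pct, svp, ap):
        """Flag growth when the measured volume change exceeds the nodule's
        individualized variability threshold rather than a uniform 25%."""
        threshold = estimated_95th_percentile_ape(svp, ap)
        return volume_change_pct > threshold
    ```

    For example, a nodule with SVP = 0.8 and AP = 0.2 has an estimated variability of about 29.1%, so a measured 25% volume change would still fall within measurement noise for that nodule.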

  18. Model validation and calibration based on component functions of model output

    International Nuclear Information System (INIS)

    Wu, Danqing; Lu, Zhenzhou; Wang, Yanping; Cheng, Lei

    2015-01-01

    The target of this work is to validate the component functions of model output between physical observations and the computational model with the area metric. Based on the theory of high dimensional model representations (HDMR) of independent input variables, conditional expectations are component functions of model output, and these conditional expectations reflect partial information of the model output. Therefore, model validation of the conditional expectations measures the discrepancy between the partial information of the computational model output and that of the observations. A calibration of the conditional expectations is then carried out to reduce the value of the model validation metric. After that, the model validation metric of the model output is recalculated with the calibrated model parameters, and the result shows that reducing the discrepancy in the conditional expectations helps decrease the difference in model output. Finally, several examples are employed to demonstrate the rationality and necessity of the methodology in cases of both a single validation site and multiple validation sites. - Highlights: • A validation metric of conditional expectations of model output is proposed. • HDMR explains the relationship between conditional expectations and model output. • An improved approach of parameter calibration updates the computational models. • Validation and calibration are applied at a single site and at multiple sites. • The validation and calibration process shows superiority over existing methods.
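    The area metric referred to here is commonly computed as the area between the empirical CDF of the model output samples and that of the observations. A brute-force Python illustration (uniform-grid integration; a sketch of the standard metric, not the authors' implementation):

    ```python
    def area_metric(model_samples, obs_samples, grid_size=1000):
        """Area between the empirical CDFs of model output and observations,
        approximated by midpoint integration on a common grid."""
        def ecdf(samples, x):
            return sum(1 for v in samples if v <= x) / len(samples)

        lo = min(min(model_samples), min(obs_samples))
        hi = max(max(model_samples), max(obs_samples))
        dx = (hi - lo) / grid_size
        area = 0.0
        for i in range(grid_size):
            x = lo + (i + 0.5) * dx  # midpoint of grid cell i
            area += abs(ecdf(model_samples, x) - ecdf(obs_samples, x)) * dx
        return area
    ```

    The metric is zero when the two distributions coincide; a constant offset between model and observations shows up as an area roughly equal to that offset.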

  19. The development and validation of a numerical integration method for non-linear viscoelastic modeling

    Science.gov (United States)

    Ramo, Nicole L.; Puttlitz, Christian M.

    2018-01-01

    Compelling evidence that many biological soft tissues display both strain- and time-dependent behavior has led to the development of fully non-linear viscoelastic modeling techniques to represent the tissue’s mechanical response under dynamic conditions. Since the current stress state of a viscoelastic material is dependent on all previous loading events, numerical analyses are complicated by the requirement of computing and storing the stress at each step throughout the load history. This requirement quickly becomes computationally expensive, and in some cases intractable, for finite element models. Therefore, we have developed a strain-dependent numerical integration approach for capturing non-linear viscoelasticity that enables calculation of the current stress from a strain-dependent history state variable stored from the preceding time step only, which improves both fitting efficiency and computational tractability. This methodology was validated based on its ability to recover non-linear viscoelastic coefficients from simulated stress-relaxation (six strain levels) and dynamic cyclic (three frequencies) experimental stress-strain data. The model successfully fit each data set with average errors in recovered coefficients of 0.3% for stress-relaxation fits and 0.1% for cyclic. The results support the use of the presented methodology to develop linear or non-linear viscoelastic models from stress-relaxation or cyclic experimental data of biological soft tissues. PMID:29293558
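    The core idea above, carrying the entire load history in a single state variable that is updated each time step, is easiest to see in the classical linear single-term Prony recursion; the paper generalizes this with strain-dependent coefficients. A hedged Python sketch of the standard linear recursion (textbook form, not the authors' non-linear formulation):

    ```python
    import math

    def stress_history(strains, dt, e_inf, g, tau):
        """Linear viscoelastic stress via a recursive Prony-series update:
        the full convolution integral is carried in one state variable h,
        updated from the previous step only."""
        h = 0.0          # history state variable (viscous stress)
        prev = 0.0       # strain at the previous step
        decay = math.exp(-dt / tau)
        half = math.exp(-dt / (2.0 * tau))  # midpoint rule for the increment
        stresses = []
        for eps in strains:
            h = decay * h + g * half * (eps - prev)
            stresses.append(e_inf * eps + h)
            prev = eps
        return stresses
    ```

    Under a step strain held constant, the computed stress relaxes from roughly e_inf + g toward the equilibrium value e_inf, the expected stress-relaxation behavior.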

  20. Construct validity of the Moral Development Scale for Professionals (MDSP)

    Directory of Open Access Journals (Sweden)

    Söderhamn O

    2011-05-01

    Olle Söderhamn¹,², John Olav Bjørnestad¹, Anne Skisland¹, Christina Cliffordson²; ¹Faculty of Health and Sport Sciences, University of Agder, Grimstad and Kristiansand, Norway; ²Department of Nursing, Health and Culture, University West, Trollhättan, Sweden. Abstract: The aim of this study was to investigate the construct validity of the Moral Development Scale for Professionals (MDSP) using structural equation modeling. The instrument is a 12-item self-report instrument, developed in the Scandinavian cultural context and based on Kohlberg's theory. A hypothesized simplex structure model underlying the MDSP was tested through structural equation modeling. Validity was also tested as the proportion of respondents older than 20 years that reached the highest moral level, which according to the theory should be small. A convenience sample of 339 nursing students with a mean age of 25.3 years participated. Results confirmed the simplex model structure, indicating that the MDSP reflects a moral construct empirically organized from low to high. A minority of respondents >20 years of age (13.5%) scored more than 80% on the highest moral level. The findings support the construct validity of the MDSP and the stages and levels in Kohlberg's theory. Keywords: Kohlberg, scale testing, simplex structure model, structural equation modeling

  1. Development and validation of a prognostic model incorporating texture analysis derived from standardised segmentation of PET in patients with oesophageal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Foley, Kieran G. [Cardiff University, Division of Cancer and Genetics, Cardiff (United Kingdom); Hills, Robert K. [Cardiff University, Haematology Clinical Trials Unit, Cardiff (United Kingdom); Berthon, Beatrice; Marshall, Christopher [Wales Research and Diagnostic PET Imaging Centre, Cardiff (United Kingdom); Parkinson, Craig; Spezi, Emiliano [Cardiff University, School of Engineering, Cardiff (United Kingdom); Lewis, Wyn G. [University Hospital of Wales, Department of Upper GI Surgery, Cardiff (United Kingdom); Crosby, Tom D.L. [Department of Oncology, Velindre Cancer Centre, Cardiff (United Kingdom); Roberts, Stuart Ashley [University Hospital of Wales, Department of Clinical Radiology, Cardiff (United Kingdom)

    2018-01-15

    This retrospective cohort study developed a prognostic model incorporating PET texture analysis in patients with oesophageal cancer (OC). Internal validation of the model was performed. Consecutive OC patients (n = 403) were chronologically separated into development (n = 302, September 2010-September 2014, median age = 67.0, males = 227, adenocarcinomas = 237) and validation cohorts (n = 101, September 2014-July 2015, median age = 69.0, males = 78, adenocarcinomas = 79). Texture metrics were obtained using a machine-learning algorithm for automatic PET segmentation. A Cox regression model including age, radiological stage, treatment and 16 texture metrics was developed. Patients were stratified into quartiles according to a prognostic score derived from the model. A p-value < 0.05 was considered statistically significant. The primary outcome was overall survival (OS). Six variables were significantly and independently associated with OS: age [HR = 1.02 (95% CI 1.01-1.04), p < 0.001], radiological stage [1.49 (1.20-1.84), p < 0.001], treatment [0.34 (0.24-0.47), p < 0.001], log(TLG) [5.74 (1.44-22.83), p = 0.013], log(Histogram Energy) [0.27 (0.10-0.74), p = 0.011] and Histogram Kurtosis [1.22 (1.04-1.44), p = 0.017]. The prognostic score demonstrated significant differences in OS between quartiles in both the development (χ² = 143.14, df = 3, p < 0.001) and validation cohorts (χ² = 20.621, df = 3, p < 0.001). This prognostic model can risk-stratify patients and demonstrates the additional benefit of PET texture analysis in OC staging. (orig.)
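    A Cox model of this kind scores each patient by a linear predictor: the sum of each covariate value times the log of its hazard ratio, with quartiles of the score defining the risk groups. An illustrative Python sketch using the hazard ratios quoted above (the variable encodings and helper names are assumptions, not the published model):

    ```python
    import math

    # Hazard ratios reported in the abstract; ln(HR) is the Cox coefficient.
    HAZARD_RATIOS = {
        "age": 1.02,
        "radiological_stage": 1.49,
        "treatment": 0.34,
        "log_tlg": 5.74,
        "log_histogram_energy": 0.27,
        "histogram_kurtosis": 1.22,
    }

    def prognostic_score(patient):
        """Cox linear predictor: sum over covariates of coefficient * value."""
        return sum(math.log(hr) * patient[k] for k, hr in HAZARD_RATIOS.items())

    def quartile(score, cohort_scores):
        """Risk quartile (1 = lowest risk) relative to a reference cohort."""
        n = len(cohort_scores)
        rank = sum(1 for s in cohort_scores if s <= score)
        return min(4, 1 + (4 * rank) // (n + 1))
    ```

    Because ln(HR) is positive for HR > 1 and negative for HR < 1, a higher stage or kurtosis raises the score while curative treatment or higher histogram energy lowers it, matching the directions reported above.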

  2. Non-isothermal processes during the drying of bare soil: Model Development and Validation

    Science.gov (United States)

    Sleep, B.; Talebi, A.; O'Carrol, D. M.

    2017-12-01

    Several coupled liquid water, water vapor, and heat transfer models have been developed either to study non-isothermal processes in the subsurface immediately below the ground surface, or to predict the evaporative flux from the ground surface. Equilibrium phase change between water and gas phases is typically assumed in these models. Recently, a few studies have questioned this assumption and proposed coupled models considering kinetic phase change. However, none of these models were validated against real field data. In this study, a non-isothermal coupled model incorporating kinetic phase change was developed and examined against measured data from a green roof test module. The model also incorporated a new surface boundary condition for water vapor transport at the ground surface. The measured field data included soil moisture content and temperature at different depths down to 15 cm below the ground surface. Lysimeter data were collected to determine the evaporation rates. Short- and long-wave radiation, wind velocity, ambient air temperature and relative humidity were measured and used as model input. Field data were collected for a period of three months during the warm seasons in southeastern Canada. The model was calibrated using one drying period, and several other drying periods were then simulated. In general, the model underestimated the evaporation rates in the early stage of the drying period; however, the cumulative evaporation was in good agreement with the field data. The model predicted the trends in temperature and moisture content at the different depths in the green roof module. The simulated temperature was lower than the measured temperature for most of the simulation time, with a maximum difference of 5 °C. The simulated moisture content changes had the same temporal trend as the lysimeter data for the events simulated.

  3. Validation and comparison of dispersion models of RTARC DSS

    International Nuclear Information System (INIS)

    Duran, J.; Pospisil, M.

    2004-01-01

    RTARC DSS (Real Time Accident Release Consequences - Decision Support System) is a computer code developed at VUJE Trnava, Inc. (Stubna, M. et al., 1993). The code calculations include atmospheric transport and diffusion, dose assessment, evaluation and display of the affected zones, evaluation of the early health effects, concentration and dose rate time dependence at the selected sites, etc. The simulation of protective measures (sheltering, iodine administration) is included. The aim of this paper is to present the process of validation of the RTARC dispersion models. RTARC includes models for calculating releases over very short (Monte Carlo method - MEMOC), short (Gaussian straight-line model) and long distances (puff trajectory model - PTM). Validation of the RTARC code was performed using the results of the comparisons and experiments summarized in Table 1 (experiment or comparison / distance / model): wind tunnel experiments (Universitaet der Bundeswehr, Muenchen) / area of the NPP / Monte Carlo method; INEL (Idaho National Engineering Laboratory) multi-tracer atmospheric experiment / short and medium distances / Gaussian model and PTM; Model Validation Kit / short distances / Gaussian model; STEP II.b 'Realistic Case Studies' / long distances / PTM; ENSEMBLE comparison / long distances / PTM. (orig.)

  4. e-Learning quality: Scale development and validation in Indian context

    Directory of Open Access Journals (Sweden)

    Arun Kumar Agariya

    2012-12-01

    The aim of this paper is to develop reliable and valid e-learning quality measurement scales from the learner as well as faculty perspectives in the Indian context. Exploratory factor analysis followed by confirmatory factor analysis was done, presented in two forms: the covariance model and the structural model. The covariance model shows that the factors collaboration, industry acceptance and value addition are important from the learner's point of view, whereas transparency in assessment, technical know-how and engagement (from students) are important from the faculty point of view. The factors course content and design structures (technology/website design) are found equally important from the learner's as well as the faculty's perspective. The structural models validate the previously extracted factors along with their indicators. The findings of this study validate the long-held belief that e-learning quality is a multidimensional construct and serves as a critical success factor. The proposed scale will help in identifying issues that contribute towards e-learning quality in the Indian context and thereby in formulating strategies accordingly, resulting in efficient (in terms of cost) and effective (in terms of outcomes) e-learning practices, which is the necessity of the hour for the economic development of the country. A fair amount of literature on e-learning has dealt with identifying factors explaining the constructs of quality, perceived value and satisfaction, but there is a paucity of research pertaining to e-learning quality scale development and validation from the learner as well as faculty perspective. This study is an attempt to bridge this gap in the existing literature.

  5. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach of software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling makes it possible to give a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation is sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. The properties to be validated range from structural and semantic requirements of the models (pre- and post-conditions) to properties of the transformation itself (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  6. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    Science.gov (United States)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from earlier studies were used for the validation. The results showed that the correlation coefficients between the simulated and experimental water contents at different soil depths were between 0.83 and 0.92, and those between the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. Given this accuracy, the developed model can be used reliably to predict water contents at different soil depths and temperatures.

  7. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    Science.gov (United States)

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than its reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohort for one of the three tested models [area under the receiver operating characteristic curve (AUC) of the cohort differences model: 0.85], signaling that the validation leans towards transferability. Two out of three models had a lower AUC on validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.

  8. Cross-validation of an employee safety climate model in Malaysia.

    Science.gov (United States)

    Bahari, Siti Fatimah; Clarke, Sharon

    2013-06-01

    Whilst substantial research has investigated the nature of safety climate, and its importance as a leading indicator of organisational safety, much of this research has been conducted with Western industrial samples. The current study focuses on the cross-validation of a safety climate model in the non-Western industrial context of Malaysian manufacturing. The first-order factorial validity of Cheyne et al.'s (1998) [Cheyne, A., Cox, S., Oliver, A., Tomas, J.M., 1998. Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255-271] model was tested, using confirmatory factor analysis, in a Malaysian sample. Results showed that the model fit indices were below accepted levels, indicating that the original Cheyne et al. (1998) safety climate model was not supported. An alternative three-factor model was developed using exploratory factor analysis. Although these findings are not consistent with previously reported cross-validation studies, we argue that previous studies have focused on validation across Western samples, and that the current study demonstrates the need to take account of cultural factors in the development of safety climate models intended for use in non-Western contexts. The results have important implications for the transferability of existing safety climate models across cultures (for example, in global organisations) and highlight the need for future research to examine cross-cultural issues in relation to safety climate. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  9. Development and validation of a weight-bearing finite element model for total knee replacement.

    Science.gov (United States)

    Woiczinski, M; Steinbrück, A; Weber, P; Müller, P E; Jansson, V; Schröder, Ch

    2016-01-01

    Total knee arthroplasty (TKA) is a successful procedure for osteoarthritis. However, some patients (19%) report pain after surgery. A finite element model was developed based on the boundary conditions of a knee rig. A 3D model of an anatomical full leg was generated from magnetic resonance image data, and a total knee prosthesis was implanted without patella resurfacing. In the finite element model, a restarting procedure was programmed to hold the ground reaction force constant by adapting the quadriceps muscle force during a squat from 20° to 105° of flexion. Knee rig experimental data were used to validate the numerical model in the patellofemoral and femorotibial joints. Furthermore, sensitivity analyses of the Young's modulus of the patella cartilage, the posterior cruciate ligament (PCL) stiffness, and the patella tendon origin were performed. Pearson's correlations for retropatellar contact area, pressure, patella flexion, and femorotibial AP movement were close to 1. The lowest root mean square errors for retropatellar pressure, patella flexion, and femorotibial AP movement were found for the baseline model setup with a Young's modulus of 5 MPa for the patella cartilage, a PCL stiffness downscaled to 25% of the value given in the literature, and an anatomical origin of the patella tendon. The results of the finite element model are comparable with the experimental results. Therefore, the finite element model developed in this study can be used for further clinical investigations and will help to better understand the clinical aspects after TKA with an unresurfaced patella.

  10. Development and validation of a numerical model of the swine head subjected to open-field blasts

    Science.gov (United States)

    Kalra, A.; Zhu, F.; Feng, K.; Saif, T.; Kallakuri, S.; Jin, X.; Yang, K.; King, A.

    2017-11-01

    A finite element model of the head of a 55-kg Yucatan pig was developed to calculate the incident pressure and corresponding intracranial pressure due to the explosion of 8 lb (3.63 kg) of C4 at three different distances. The results from the model were validated by comparing findings with experimentally obtained data from five pigs at three different blast overpressure levels: low (150 kPa), medium (275 kPa), and high (400 kPa). The peak values of intracranial pressures from numerical model at different locations of the brain such as the frontal, central, left temporal, right temporal, parietal, and occipital regions were compared with experimental values. The model was able to predict the peak pressure with reasonable percentage differences. The differences for peak incident and intracranial pressure values between the simulation results and the experimental values were found to be less than 2.2 and 29.3%, respectively, at all locations other than the frontal region. Additionally, a series of parametric studies shows that the intracranial pressure was very sensitive to sensor locations, the presence of air bubbles, and reflections experienced during the experiments. Further efforts will be undertaken to correlate the different biomechanical response parameters, such as the intracranial pressure gradient, stress, and strain results obtained from the validated model with injured brain locations once the histology data become available.

  11. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources

  12. Stochastic modeling of oligodendrocyte generation in cell culture: model validation with time-lapse data

    Directory of Open Access Journals (Sweden)

    Noble Mark

    2006-05-01

    Full Text Available Abstract Background The purpose of this paper is two-fold. The first objective is to validate the assumptions behind a stochastic model developed earlier by these authors to describe oligodendrocyte generation in cell culture. The second is to generate time-lapse data that may help biomathematicians to build stochastic models of cell proliferation and differentiation under other experimental scenarios. Results Using time-lapse video recording it is possible to follow the individual evolutions of different cells within each clone. This experimental technique is very laborious and cannot replace model-based quantitative inference from clonal data. However, it is unrivalled in validating the structure of a stochastic model intended to describe cell proliferation and differentiation at the clonal level. In this paper, such data are reported and analyzed for oligodendrocyte precursor cells cultured in vitro. Conclusion The results strongly support the validity of the most basic assumptions underpinning the previously proposed model of oligodendrocyte development in cell culture. However, there are some discrepancies; the most important is that the contribution of progenitor cell death to cell kinetics in this experimental system has been underestimated.
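The clonal structure such models describe can be illustrated with a minimal discrete-time branching simulation, in which each progenitor independently divides, differentiates into an oligodendrocyte, or dies at every time step. The fate probabilities below are invented for illustration and are not the paper's estimates:

```python
import random

random.seed(42)
# Hypothetical per-step fate probabilities for a progenitor cell
P_DIVIDE, P_DIFFERENTIATE, P_DIE = 0.55, 0.35, 0.10

def simulate_clone(steps=8):
    """Follow one clone founded by a single progenitor, much as time-lapse
    recording follows the descendants of one plated cell."""
    progenitors, oligodendrocytes = 1, 0
    for _ in range(steps):
        next_gen = 0
        for _ in range(progenitors):
            r = random.random()
            if r < P_DIVIDE:
                next_gen += 2          # symmetric division
            elif r < P_DIVIDE + P_DIFFERENTIATE:
                oligodendrocytes += 1  # terminal differentiation
            # else: cell death -- the progenitor is simply removed
        progenitors = next_gen
    return progenitors, oligodendrocytes

clones = [simulate_clone() for _ in range(2000)]
extinct = sum(1 for p, o in clones if p == 0 and o == 0)
```

Comparing the simulated distribution of clone compositions against observed clonal data is the kind of model-structure check that time-lapse recording makes possible; underestimating P_DIE, as the paper found for progenitor death, would show up as simulated clones that are systematically too large.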

  13. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    Van Emmerik, T.H.M.; Mulder, G.; Eilander, D.; Piet, M.; Savenije, H.H.G.

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent.

  14. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent.

  15. Validation of heat transfer models for gap cooling

    International Nuclear Information System (INIS)

    Okano, Yukimitsu; Nagae, Takashi; Murase, Michio

    2004-01-01

    For severe accident assessment of a light water reactor, models of heat transfer in a narrow annular gap between overheated core debris and a reactor pressure vessel are important for evaluating vessel integrity and accident management. The authors developed and improved the models of heat transfer. However, validation was not sufficient for applicability of the gap heat flux correlation to the debris cooling in the vessel lower head and applicability of the local boiling heat flux correlations to the high-pressure conditions. Therefore, in this paper, we evaluated the validity of the heat transfer models and correlations by analyses for ALPHA and LAVA experiments where molten aluminum oxide (Al2O3) at about 2700 K was poured into the high pressure water pool in a small-scale simulated vessel lower head. In the heating process of the vessel wall, the calculated heating rate and peak temperature agreed well with the measured values, and the validity of the heat transfer models and gap heat flux correlation was confirmed. In the cooling process of the vessel wall, the calculated cooling rate was compared with the measured value, and the validity of the nucleate boiling heat flux correlation was confirmed. The peak temperatures of the vessel wall in ALPHA and LAVA experiments were lower than the temperature at the minimum heat flux point between film boiling and transition boiling, so the minimum heat flux correlation could not be validated. (author)

  16. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    Full Text Available The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  17. Validation of Hydrodynamic Numerical Model of a Pitching Wave Energy Converter

    DEFF Research Database (Denmark)

    López, Maria del Pilar Heras; Thomas, Sarah; Kramer, Morten Mejlhede

    2017-01-01

    Validation of numerical models is essential in the development of new technologies. Commercial software and codes available for simulating wave energy converters (WECs) have not yet been proved to work for all available and upcoming technologies. The present paper presents the first stages of the validation process of a hydrodynamic numerical model for a pitching wave energy converter. The development of dry tests, wave flume and wave basin experiments is explained, lessons learned are shared and results are presented.

  18. Preliminary validation of a Monte Carlo model for IMRT fields

    International Nuclear Information System (INIS)

    Wright, Tracy; Lye, Jessica; Mohammadi, Mohammad

    2011-01-01

    Full text: A Monte Carlo model of an Elekta linac, validated for medium to large (10-30 cm) symmetric fields, has been investigated for small, irregular and asymmetric fields suitable for IMRT treatments. The model has been validated against field segments using radiochromic film in solid water, and the modelled positions of the multileaf collimator (MLC) leaves have been validated using EBT film. In the model, electrons with a narrow energy spectrum are incident on the target, and all components of the linac head are included. The MLC is modelled using the EGSnrc MLCE component module. For the validation, a number of single complex IMRT segments with dimensions of approximately 1-8 cm were delivered to film in solid water (see Fig. 1). The same segments were modelled in EGSnrc by adjusting the MLC leaf positions in the model validated for 10 cm symmetric fields. Dose distributions along the centre of each MLC leaf as determined by both methods were compared. A picket fence test was also performed to confirm the MLC leaf positions. 95% of the points in the modelled dose distribution along the leaf axis agree with the film measurement to within 1%/1 mm for dose difference and distance to agreement. The areas of greatest deviation occur in the penumbra region. A system has been developed to calculate the MLC leaf positions in the model for any planned field size.

  19. Development, validation and application of multi-point kinetics model in RELAP5 for analysis of asymmetric nuclear transients

    Energy Technology Data Exchange (ETDEWEB)

    Pradhan, Santosh K., E-mail: santosh@aerb.gov.in [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India); Obaidurrahman, K. [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India); Iyer, Kannan N. [Department of Mechanical Engineering, IIT Bombay, Mumbai 400076 (India); Gaikwad, Avinash J. [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India)

    2016-04-15

    Highlights: • A multi-point kinetics model is developed for the RELAP5 system thermal-hydraulics code. • The model is validated against an extensive 3D kinetics code. • The RELAP5 multi-point kinetics formulation is used to investigate the critical break for LOCA in a PHWR. - Abstract: The point kinetics approach in the system code RELAP5 limits its use for many reactivity-induced transients that involve asymmetric core behaviour. Development of fully coupled 3D core kinetics with system thermal-hydraulics is the ultimate requirement in this regard; however, coupling and validating a 3D kinetics module with a system code is cumbersome and also requires access to the source code. An intermediate approach with multi-point kinetics is appropriate and relatively easy to implement for the analysis of several asymmetric transients in large cores. The multi-point kinetics formulation is based on dividing the entire core into several regions and solving the ODEs describing the kinetics in each region. These regions are interconnected by spatial coupling coefficients, which are estimated from a diffusion theory approximation. This model offers the advantage that the associated ordinary differential equations (ODEs) governing the multi-point kinetics formulation can be solved numerically to the desired level of accuracy, and thus allows a formulation based on user-defined control variables, i.e., without disturbing the source code and hence avoiding the associated coupling issues. Euler's method has been used in the present formulation to solve the several coupled ODEs internally at each time step. The results have been verified against the inbuilt point-kinetics model of RELAP5 and validated against the 3D kinetics code TRIKIN. The model was used to identify the critical break in the RIH of a typical large PHWR core. The neutronic asymmetry produced in the core by the system-induced transient was effectively handled by the multi-point kinetics model, overcoming the limitation of the built-in point kinetics model.
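The scheme the highlights describe (regional kinetics ODEs coupled by spatial coefficients and advanced with Euler's method) can be sketched for two core regions with one delayed-neutron group. All parameter values, including the coupling coefficient, are illustrative and are not taken from RELAP5 or the paper:

```python
import numpy as np

LAMBDA = 1e-4   # prompt neutron generation time (s) -- assumed
beta   = 0.0065 # delayed neutron fraction -- assumed
lam    = 0.08   # precursor decay constant (1/s) -- assumed
alpha  = 0.05   # spatial coupling coefficient between regions (1/s) -- assumed

def euler_step(n, c, rho, dt):
    """One explicit Euler step for regional powers n and precursors c."""
    n_new, c_new = np.empty(2), np.empty(2)
    for i in range(2):
        j = 1 - i  # the neighbouring region
        dn = ((rho[i] - beta) / LAMBDA) * n[i] + lam * c[i] \
             + alpha * (n[j] - n[i])          # inter-region coupling
        dc = (beta / LAMBDA) * n[i] - lam * c[i]
        n_new[i] = n[i] + dt * dn
        c_new[i] = c[i] + dt * dc
    return n_new, c_new

# Start at steady state, then insert an asymmetric reactivity perturbation
n = np.array([1.0, 1.0])
c = (beta / (LAMBDA * lam)) * n           # equilibrium precursor levels
rho = np.array([+0.0005, -0.0005])        # asymmetric insertion (sub-prompt)
dt = 1e-5
for _ in range(10000):                    # simulate 0.1 s
    n, c = euler_step(n, c, rho, dt)
```

After the prompt jump, region 1 settles above its initial power and region 2 below it, the kind of asymmetric response a single-point kinetics model cannot represent. Implementing this through user-defined control variables, as the paper does, avoids touching the host code's source.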

  20. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    Science.gov (United States)

    2014-01-01

    Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in data that were not used to develop the model (referred to as external validation). We critically appraised the methodological conduct and reporting of external validation studies of multivariable prediction models. Methods We conducted a systematic review of articles describing some form of external validation of one or more multivariable prediction models indexed in PubMed core clinical journals published in 2010. Study data were extracted in duplicate on design, sample size, handling of missing data, reference to the original study developing the prediction models and predictive performance measures. Results 11,826 articles were identified and 78 were included for full review, which described the evaluation of 120 prediction models in participant data that were not used to develop the model. Thirty-three articles described both the development of a prediction model and an evaluation of its performance on a separate dataset, and 45 articles described only the evaluation of an existing published prediction model on another dataset. Fifty-seven percent of the prediction models were presented and evaluated as simplified scoring systems. Sixteen percent of articles failed to report the number of outcome events in the validation datasets. Fifty-four percent of studies made no explicit mention of missing data. Sixty-seven percent did not report evaluating model calibration, whilst most studies evaluated model discrimination. It was often unclear whether the reported performance measures were for the full regression model or for the simplified models. Conclusions The vast majority of studies describing some form of external validation of a multivariable prediction model were poorly reported, with key details frequently not presented. The validation studies were characterised by poor design and inappropriate handling of missing data.
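Calibration, the performance measure the review found most often omitted, asks whether predicted risks match observed event rates. A minimal sketch on synthetic data (the predicted risks and outcomes below are simulated purely to illustrate the comparison; they are not from any reviewed study):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Simulated risk predictions from a "published model" for an external cohort
p_hat = rng.uniform(0.01, 0.80, n)
# Simulated outcomes: the true risk is only 70% of the predicted risk,
# i.e. the model systematically over-predicts in this new population
y = rng.random(n) < 0.7 * p_hat

# Calibration-in-the-large: mean predicted risk vs observed event rate
mean_pred, obs_rate = p_hat.mean(), y.mean()

# Grouped calibration: observed event rate within deciles of predicted risk
edges = np.quantile(p_hat, np.linspace(0, 1, 11))
groups = np.digitize(p_hat, edges[1:-1])   # decile index 0..9
for g in range(10):
    m = groups == g
    print(f"decile {g}: predicted {p_hat[m].mean():.2f}, "
          f"observed {y[m].mean():.2f}")
```

In a well-calibrated model the two columns track each other; here every decile's observed rate falls below its mean prediction, the signature of over-prediction that discrimination measures such as the AUC cannot detect.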

  1. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  2. Political Representation and Gender Inequalities Testing the Validity of Model Developed for Pakistan using a Data Set of Malaysia

    OpenAIRE

    Najeebullah Khan; Adnan Hussein; Zahid Awan; Bakhtiar Khan

    2012-01-01

    This study measured the impacts of six independent variables (political rights, election system type, political quota, literacy rate, labor force participation and GDP per capita at current prices in US dollars) on the dependent variable (percentage of women's representation in the national legislature) using multiple linear regression models. As a first step, we developed and tested the model on out-of-sample data for Pakistan. For model construction and validation, ten years of data from the year 1999 a...

  3. Evaluation factors for verification and validation of low-level waste disposal site models

    International Nuclear Information System (INIS)

    Moran, M.S.; Mezga, L.J.

    1982-01-01

    The purpose of this paper is to identify general evaluation factors to be used to verify and validate LLW disposal site performance models in order to assess their site-specific applicability and to determine their accuracy and sensitivity. It is intended that the information contained in this paper be employed by model users involved with LLW site performance model verification and validation. It should not be construed as providing protocols, but rather as providing a framework for the preparation of specific protocols or procedures. A brief description of each evaluation factor is provided. The factors have been categorized according to recommended use during either the model verification or the model validation process. The general responsibilities of the developer and user are provided. In many cases it is difficult to separate the responsibilities of the developer and user, but the user is ultimately accountable for both verification and validation processes. 4 refs

  4. Development and validation of prediction models for endometrial cancer in postmenopausal bleeding.

    Science.gov (United States)

    Wong, Alyssa Sze-Wai; Cheung, Chun Wai; Fung, Linda Wen-Ying; Lao, Terence Tzu-Hsi; Mol, Ben Willem J; Sahota, Daljit Singh

    2016-08-01

    To develop and assess the accuracy of risk prediction models to diagnose endometrial cancer in women with postmenopausal bleeding (PMB). A retrospective cohort study of 4383 women in a one-stop PMB clinic at a university teaching hospital in Hong Kong. Clinical risk factors, transvaginal ultrasonic measurement of endometrial thickness (ET) and endometrial histology were obtained from consecutive women between 2002 and 2013. Two models to predict the risk of endometrial cancer were developed and assessed: one based on patient characteristics alone and a second incorporating ET with patient characteristics. Endometrial histology was used as the reference standard. Split-sample internal validation and bootstrapping techniques were adopted. The optimal threshold for prediction of endometrial cancer by the final models was determined using a receiver-operating characteristic (ROC) curve and the Youden index. The diagnostic gain was compared to a reference strategy of measuring ET only by comparing the AUCs using the DeLong test. Of the 4383 women with PMB, 168 (3.8%) were diagnosed with endometrial cancer. ET alone had an area under the curve (AUC) of 0.92 (95% confidence interval [CI] 0.89-0.94). In the patient-characteristics-only model, independent predictors of cancer were age at presentation, age at menopause, body mass index, nulliparity and recurrent vaginal bleeding. The AUC and Youden index of the patient-characteristics-only model were 0.73 (95% CI 0.67-0.80) and 0.72, respectively (sensitivity = 66.5%; specificity = 68.9%; +ve LR = 2.14; -ve LR = 0.49). ET, age at presentation, nulliparity and recurrent vaginal bleeding were independent predictors in the patient characteristics plus ET model. The AUC and Youden index of the patient characteristics plus ET model were 0.92 (95% CI 0.88-0.96) and 0.71, respectively (sensitivity = 82.7%; specificity = 88.3%; +ve LR = 6.38; -ve LR = 0.2). Comparison of the AUCs indicated that the history-alone model was inferior to a model using ET alone.
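The threshold selection described here (maximising Youden's J over the ROC curve) can be sketched on synthetic data. The thickness distributions below are invented and do not reproduce the study's cohort:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic endometrial thickness (mm): benign vs cancer (values assumed)
et = np.concatenate([rng.normal(4.0, 2.0, 500),    # benign cases
                     rng.normal(12.0, 4.0, 40)])   # cancer cases
label = np.concatenate([np.zeros(500), np.ones(40)])

# AUC via the Mann-Whitney rank formulation, which equals the area under
# the empirical ROC curve when there are no ties
rank = np.empty(len(et))
rank[np.argsort(et)] = np.arange(1, len(et) + 1)
n1, n0 = int(label.sum()), int((1 - label).sum())
auc = (rank[label == 1].sum() - n1 * (n1 + 1) / 2) / (n0 * n1)

# Youden's J = sensitivity + specificity - 1, maximised over thresholds
thresholds = np.unique(et)
sens = np.array([(et[label == 1] >= t).mean() for t in thresholds])
spec = np.array([(et[label == 0] < t).mean() for t in thresholds])
j = sens + spec - 1
best_t = thresholds[np.argmax(j)]   # "optimal" operating threshold
```

Note that Youden's J weights sensitivity and specificity equally; for a cancer diagnosis, a clinic may deliberately operate at a more sensitive threshold instead.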

  5. Novel intrinsic-based submodel for char particle gasification in entrained-flow gasifiers: Model development, validation and illustration

    International Nuclear Information System (INIS)

    Schulze, S.; Richter, A.; Vascellari, M.; Gupta, A.; Meyer, B.; Nikrityuk, P.A.

    2016-01-01

    Highlights: • Model resolving intra-particle species transport for char conversion was formulated. • TGA experiments of char particle conversion in gas flow were conducted. • The experimental results for char conversion validated the model. • CFD simulations of endothermic reactor with developed model were carried out. - Abstract: The final carbon conversion rate is of critical importance in the efficiency of gasifiers. Therefore, comprehensive modeling of char particle conversion is of primary interest for designing new gasifiers. This work presents a novel intrinsic-based submodel for the gasification of a char particle moving in a hot flue gas environment considering CO2 and H2O as inlet species. The first part of the manuscript describes the model and its derivation. Validations against experiments carried out in this work for German lignite char are reported in the second part. The comparison between submodel predictions and experimental data shows good agreement. The importance of char porosity change during gasification is demonstrated. The third part presents the results of CFD simulations using the new submodel and a surface-based submodel for a generic endothermic gasifier. The focus of the CFD simulations is to demonstrate the crucial role of intrinsic-based heterogeneous reactions in the adequate prediction of carbon conversion rates.

  6. Development and validation of the short-form Adolescent Health Promotion Scale.

    Science.gov (United States)

    Chen, Mei-Yen; Lai, Li-Ju; Chen, Hsiu-Chih; Gaete, Jorge

    2014-10-26

    Health-promoting lifestyle choices of adolescents are closely related to current and subsequent health status. However, parsimonious yet reliable and valid screening tools are scarce. The original 40-item adolescent health promotion (AHP) scale was developed by our research team and has been applied to measure adolescent health-promoting behaviors worldwide. The aim of our study was to examine the psychometric properties of a newly developed short-form version of the AHP (AHP-SF) including tests of its reliability and validity. The study was conducted in nine middle and high schools in southern Taiwan. Participants were 814 adolescents randomly divided into two subgroups with equal size and homogeneity of baseline characteristics. The first subsample (calibration sample) was used to modify and shorten the factorial model while the second subsample (validation sample) was utilized to validate the result obtained from the first one. The psychometric testing of the AHP-SF included internal reliability of McDonald's omega and Cronbach's alpha, convergent validity, discriminant validity, and construct validity with confirmatory factor analysis (CFA). The results of the CFA supported a six-factor model and 21 items were retained in the AHP-SF with acceptable model fit. For the discriminant validity test, results indicated that adolescents with lower AHP-SF scores were more likely to be overweight or obese, skip breakfast, and spend more time watching TV and playing computer games. The AHP-SF also showed excellent internal consistency with a McDonald's omega of 0.904 (Cronbach's alpha 0.905) in the calibration group. The current findings suggest that the AHP-SF is a valid and reliable instrument for the evaluation of adolescent health-promoting behaviors. Primary health care providers and clinicians can use the AHP-SF to assess these behaviors and evaluate the outcome of health promotion programs in the adolescent population.
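Internal-consistency figures like the McDonald's omega and Cronbach's alpha reported above come from simple variance decompositions. Cronbach's alpha, for instance, can be computed directly; the item responses below are simulated around a single latent trait, purely to illustrate the formula, and are unrelated to the AHP-SF data:

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total score
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Simulated responses: 6 items all driven by one latent trait plus noise
rng = np.random.default_rng(7)
trait = rng.normal(0.0, 1.0, 300)
items = trait[:, None] + rng.normal(0.0, 0.5, (300, 6))
alpha = cronbach_alpha(items)
```

When items share a strong common factor, as simulated here, alpha approaches 1; values above roughly 0.7 are conventionally taken as acceptable for a scale.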

  7. Validation Hydrodynamic Models of Three Topological Models of Secondary Facultative Ponds

    OpenAIRE

    Aponte-Reyes Alxander

    2014-01-01

    A methodology was developed to analyze the boundary conditions, mesh size and turbulence settings of a CFD mathematical model intended to explain the hydrodynamic behavior of facultative stabilization ponds (FSP) built at pilot scale: a conventional pond (CP), a baffled pond (BP), and a baffled-mesh pond (BMP). Dispersion studies were performed in the field for model validation, taking samples at the inlet and outlet of the FSP; this information was used to carry out CFD model simulations of the three topologies. ...

  8. Model-based verification and validation of the SMAP uplink processes

    Science.gov (United States)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  9. Modelling the fate of sulphur-35 in crops. 2. Development and validation of the CROPS-35 model

    International Nuclear Information System (INIS)

    Collins, Chris; Cunningham, Nathan

    2005-01-01

    Gas-cooled nuclear power plants in the UK release sulphur-35 during their routine operation, which can be readily assimilated by vegetation. It is therefore necessary to be able to model the uptake of such releases in order to quantify any potential contamination of the food chain. A model is described which predicts the concentration of ³⁵S in crop components following an aerial gaseous release. Following deposition, the allocation to crop components is determined by an export function from a labile pool, the leaves, to those components growing most actively post exposure. The growth rates are determined by crop growth data, which are also used to determine the concentration. The loss of activity is controlled by radioactive decay only. The paper describes the calibration and the validation of the model. To improve the model, further experimental work is required, particularly on the export kinetics of ³⁵S. It may be possible to adapt such a modelling approach to the prediction of crop content for gaseous releases of ³H and ¹⁴C from nuclear facilities. - The calibration and validation of a model for the prediction of the fate of ³⁵S in vegetation is described
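
    Since loss of activity in the model is controlled by radioactive decay only, the decay term reduces to the standard exponential law. A short illustrative sketch (the half-life of sulphur-35 is approximately 87.4 days; this is not a reproduction of CROPS-35 itself):

```python
import math

HALF_LIFE_S35_DAYS = 87.4  # approximate half-life of sulphur-35

def remaining_activity(a0, t_days, half_life=HALF_LIFE_S35_DAYS):
    """Activity after t_days with decay-only loss: A(t) = A0 * exp(-ln2 * t / T_half)."""
    return a0 * math.exp(-math.log(2) * t_days / half_life)

# After one half-life, half the initial activity remains.
print(round(remaining_activity(100.0, 87.4), 2))  # -> 50.0
```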

  10. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for an extended period of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
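
    The core calculation behind any hydrostatic column model is summing the pressure contributions of the stacked fluid columns between the wellhead and the depth of interest. A minimal sketch under assumed fluid densities and column heights (the values are invented for illustration, not SPR data):

```python
G = 9.81  # gravitational acceleration, m/s^2

def bottomhole_pressure(wellhead_pa, columns):
    """Pressure at the base of stacked fluid columns.

    columns: list of (density_kg_m3, height_m) pairs, top to bottom.
    """
    return wellhead_pa + sum(G * rho * h for rho, h in columns)

# Illustrative stack: nitrogen cap over crude oil over brine (made-up values).
stack = [(180.0, 100.0), (850.0, 400.0), (1200.0, 500.0)]
p = bottomhole_pressure(2.0e6, stack)
print(round(p / 1e6, 3))  # pressure in MPa -> 11.398
```

    A leak would show up as a slow change in one of the column heights (an interface movement), which in turn shifts the predicted wellhead pressure; comparing that prediction against measured pressure is the essence of the nitrogen test interpretation described above.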

  11. Development and validation of a facial expression database based on the dimensional and categorical model of emotions.

    Science.gov (United States)

    Fujimura, Tomomi; Umemura, Hiroyuki

    2018-01-15

    The present study describes the development and validation of a facial expression database comprising five different horizontal face angles in dynamic and static presentations. The database includes twelve expression types portrayed by eight Japanese models. This database was inspired by the dimensional and categorical model of emotions: surprise, fear, sadness, anger with open mouth, anger with closed mouth, disgust with open mouth, disgust with closed mouth, excitement, happiness, relaxation, sleepiness, and neutral (static only). The expressions were validated using emotion classification and Affect Grid rating tasks [Russell, Weiss, & Mendelsohn, 1989. Affect Grid: A single-item scale of pleasure and arousal. Journal of Personality and Social Psychology, 57(3), 493-502]. The results indicate that most of the expressions were recognised as the intended emotions and could systematically represent affective valence and arousal. Furthermore, face angle and facial motion information influenced emotion classification and valence and arousal ratings. Our database will be available online at the following URL: https://www.dh.aist.go.jp/database/face2017/.

  12. NRPB models for calculating the transfer of radionuclides through the environment. Verification and validation

    International Nuclear Information System (INIS)

    Attwood, C.; Barraclough, I.; Brown, J.

    1998-06-01

    There is a wide range of models available at NRPB to predict the transfer of radionuclides through the environment. Such models form an essential part of assessments of the radiological impact of releases of radionuclides into the environment. These models cover: the atmosphere; the aquatic environment; the geosphere; the terrestrial environment including foodchains. It is important that the models used for radiological impact assessments are robust, reliable and suitable for the assessment being undertaken. During model development it is, therefore, important that the model is both verified and validated. Verification of a model involves ensuring that it has been implemented correctly, while validation consists of demonstrating that the model is an adequate representation of the real environment. The extent to which a model can be verified depends on its complexity and whether similar models exist. For relatively simple models verification is straightforward, but for more complex models verification has to form part of the development, coding and testing of the model within quality assurance procedures. Validation of models should ideally consist of comparisons between the results of the models and experimental or environmental measurement data that were not used to develop the model. This is more straightforward for some models than for others depending on the quantity and type of data available. Validation becomes increasingly difficult for models which are intended to predict environmental transfer at long times or at great distances. It is, therefore, necessary to adopt qualitative validation techniques to ensure that the model is an adequate representation of the real environment. This report summarises the models used at NRPB to predict the transfer of radionuclides through the environment as part of a radiological impact assessment. It outlines the work carried out to verify and validate the models. The majority of these models are not currently available

  13. Development and validation of a catalytic recombiner model for the containment code RALOC MOD4.0

    International Nuclear Information System (INIS)

    Rohde, J.; Klein-Hebling, W.; Chakraborty, A.K.

    1997-01-01

    This paper reports on the development of a catalytic recombiner model for the containment code RALOC MOD4.0 /KLH 95, KLH 96/ and the detailed validation work carried out at GRS. The model was qualified using the results of medium and large scale experiments performed in Germany /KAN 91/. The comparison of measured data with the calculations demonstrates that this new model is suitable for real plant applications to investigate the overall effectiveness of a catalytic recombiner system under severe accident conditions for large dry containments of German PWR design. The results of such investigations will serve as the basis to work out guidance for determining the system capacity needed and an optimal positioning of such devices in containments. (author)

  14. Identification of patients at high risk for Clostridium difficile infection : Development and validation of a risk prediction model in hospitalized patients treated with antibiotics

    NARCIS (Netherlands)

    van Werkhoven, C. H.; van der Tempel, J.; Jajou, R.; Thijsen, S. F T; Diepersloot, R. J A; Bonten, M. J M; Postma, D. F.; Oosterheert, J. J.

    2015-01-01

    To develop and validate a prediction model for Clostridium difficile infection (CDI) in hospitalized patients treated with systemic antibiotics, we performed a case-cohort study in a tertiary (derivation) and secondary care hospital (validation). Cases had a positive Clostridium test and were

  15. Development and validation of models for bubble coalescence and breakup

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Yiaxiang

    2013-10-08

    A generalized model for bubble coalescence and breakup has been developed, which is based on a comprehensive survey of existing theories and models. One important feature of the model is that all important mechanisms leading to bubble coalescence and breakup in a turbulent gas-liquid flow are considered. The new model is tested extensively in a 1D Test Solver and a 3D CFD code ANSYS CFX for the case of vertical gas-liquid pipe flow under adiabatic conditions, respectively. Two kinds of extensions of the standard multi-fluid model, i.e. the discrete population model and the inhomogeneous MUSIG (multiple-size group) model, are available in the two solvers, respectively. These extensions with suitable closure models such as those for coalescence and breakup are able to predict the evolution of bubble size distribution in dispersed flows and to overcome the mono-dispersed flow limitation of the standard multi-fluid model. For the validation of the model the high quality database of the TOPFLOW L12 experiments for air-water flow in a vertical pipe was employed. A wide range of test points, which cover the bubbly flow, turbulent-churn flow as well as the transition regime, is involved in the simulations. The comparison between the simulated results such as bubble size distribution, gas velocity and volume fraction and the measured ones indicates a generally good agreement for all selected test points. As the superficial gas velocity increases, bubble size distribution evolves via coalescence dominant regimes first, then breakup-dominant regimes and finally turns into a bimodal distribution. The tendency of the evolution is well reproduced by the model. However, the tendency is almost always overestimated, i.e. too much coalescence in the coalescence dominant case while too much breakup in breakup dominant ones. The reason of this problem is discussed by studying the contribution of each coalescence and breakup mechanism at different test points. The redistribution of the
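
    The discrete population model mentioned in the abstract tracks number densities of bubble size groups, with coalescence transferring bubbles to larger groups and breakup returning them to smaller ones. A deliberately minimal two-group sketch in Python (illustrative rate constants and explicit Euler stepping, not the MUSIG closures used in the study):

```python
def step(n1, n2, c=0.1, b=0.05, dt=0.01):
    """One explicit Euler step of a toy two-size-group population balance.

    n1: number density of small bubbles (volume v)
    n2: number density of large bubbles (volume 2v)
    Coalescence: two small -> one large; breakup: one large -> two small.
    """
    dn1 = -c * n1 * n1 + 2.0 * b * n2
    dn2 = 0.5 * c * n1 * n1 - b * n2
    return n1 + dt * dn1, n2 + dt * dn2

n1, n2 = 10.0, 0.0
for _ in range(1000):
    n1, n2 = step(n1, n2)
# Total gas volume n1*v + n2*2v (with v = 1) is conserved by construction.
print(round(n1 + 2.0 * n2, 6))  # -> 10.0
```

    Balancing the source and sink terms this way is what guarantees mass conservation across size groups; the full models add physically derived coalescence and breakup kernels in place of the constant rates assumed here.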

  16. Development and validation of models for bubble coalescence and breakup

    International Nuclear Information System (INIS)

    Liao, Yiaxiang

    2013-01-01

    A generalized model for bubble coalescence and breakup has been developed, which is based on a comprehensive survey of existing theories and models. One important feature of the model is that all important mechanisms leading to bubble coalescence and breakup in a turbulent gas-liquid flow are considered. The new model is tested extensively in a 1D Test Solver and a 3D CFD code ANSYS CFX for the case of vertical gas-liquid pipe flow under adiabatic conditions, respectively. Two kinds of extensions of the standard multi-fluid model, i.e. the discrete population model and the inhomogeneous MUSIG (multiple-size group) model, are available in the two solvers, respectively. These extensions with suitable closure models such as those for coalescence and breakup are able to predict the evolution of bubble size distribution in dispersed flows and to overcome the mono-dispersed flow limitation of the standard multi-fluid model. For the validation of the model the high quality database of the TOPFLOW L12 experiments for air-water flow in a vertical pipe was employed. A wide range of test points, which cover the bubbly flow, turbulent-churn flow as well as the transition regime, is involved in the simulations. The comparison between the simulated results such as bubble size distribution, gas velocity and volume fraction and the measured ones indicates a generally good agreement for all selected test points. As the superficial gas velocity increases, bubble size distribution evolves via coalescence dominant regimes first, then breakup-dominant regimes and finally turns into a bimodal distribution. The tendency of the evolution is well reproduced by the model. However, the tendency is almost always overestimated, i.e. too much coalescence in the coalescence dominant case while too much breakup in breakup dominant ones. The reason of this problem is discussed by studying the contribution of each coalescence and breakup mechanism at different test points. The redistribution of the

  17. A broad view of model validation

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1989-10-01

    The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as much as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing the modeling process step by step and bringing out the need to validate every step of this process. Model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models, as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature for public scrutiny is also emphasized. 16 refs

  18. Development and Validity of a Silicone Renal Tumor Model for Robotic Partial Nephrectomy Training.

    Science.gov (United States)

    Monda, Steven M; Weese, Jonathan R; Anderson, Barrett G; Vetter, Joel M; Venkatesh, Ramakrishna; Du, Kefu; Andriole, Gerald L; Figenshau, Robert S

    2018-04-01

    To provide a training tool to address the technical challenges of robot-assisted laparoscopic partial nephrectomy, we created silicone renal tumor models using 3-dimensional printed molds of a patient's kidney with a mass. In this study, we assessed the face, content, and construct validity of these models. Surgeons of different training levels completed 4 simulations on silicone renal tumor models. Participants were surveyed on the usefulness and realism of the model as a training tool. Performance was measured using operation-specific metrics, self-reported operative demands (NASA Task Load Index [NASA TLX]), and blinded expert assessment (Global Evaluative Assessment of Robotic Surgeons [GEARS]). Twenty-four participants included attending urologists, endourology fellows, urology residents, and medical students. Post-training surveys of expert participants yielded mean results of 79.2 on the realism of the model's overall feel and 90.2 on the model's overall usefulness for training. Renal artery clamp times and GEARS scores were significantly better in surgeons further in training (P ≤.005 and P ≤.025). Renal artery clamp times, preserved renal parenchyma, positive margins, NASA TLX, and GEARS scores were all found to improve across trials (P <.001, P = .025, P = .024, P ≤.020, and P ≤.006, respectively). Face, content, and construct validity were demonstrated in the use of a silicone renal tumor model in a cohort of surgeons of different training levels. Expert participants deemed the model useful and realistic. Surgeons of higher training levels performed better than less experienced surgeons in various study metrics, and improvements within individuals were observed over sequential trials. Future studies should aim to assess model predictive validity, namely, the association between model performance improvements and improvements in live surgery. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software that expresses scientific models.
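
    A model validation test of the kind described above can be written with ordinary unit-testing machinery: the assertion compares model output against reference observations within a stated tolerance rather than for exact equality. A generic sketch (the model and the reference values are invented placeholders, not OpenWorm code):

```python
import math
import unittest

def model_membrane_potential(t_ms):
    # Hypothetical stand-in model: exponential relaxation toward -65 mV.
    return -65.0 + 15.0 * math.exp(-t_ms / 10.0)

class ModelValidationTest(unittest.TestCase):
    """A validation test: model predictions vs. reference observations."""

    REFERENCE = {0.0: -50.0, 10.0: -59.5, 100.0: -65.0}  # invented data, mV
    TOLERANCE = 1.0  # mV

    def test_matches_reference_within_tolerance(self):
        for t, observed in self.REFERENCE.items():
            self.assertAlmostEqual(
                model_membrane_potential(t), observed, delta=self.TOLERANCE)
```

    Run with `python -m unittest` as usual. The point of the pattern is that the reference data and the tolerance are an explicit, versioned statement of what "valid" means for the model, so a regression in scientific behavior fails the build just like any other bug.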

  20. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields

    Science.gov (United States)

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin

    2015-01-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. PMID:26590280
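
    The odds ratios reported above follow from the standard 2x2 contingency-table calculation comparing L. monocytogenes-positive and -negative samples collected near versus far from a risk factor. A minimal sketch with invented counts (not the study's data):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:

              positive  negative
    near          a         b
    far           c         d
    """
    return (a * d) / (b * c)

# Invented counts for illustration: 30/120 positive near water, 12/120 far.
print(odds_ratio(30, 90, 12, 108))  # -> 3.0
```

    In the study itself the ORs come from logistic regression, which generalizes this calculation and allows adjustment for the other land cover factors mentioned in the abstract.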

  1. Development and validation of health service management competencies.

    Science.gov (United States)

    Liang, Zhanming; Howard, Peter F; Leggat, Sandra; Bartram, Timothy

    2018-04-09

    Purpose The importance of managerial competencies in monitoring and improving the performance of organisational leaders and managers is well accepted. Different processes have been used to identify and develop competency frameworks or models for healthcare managers around the world to meet different contextual needs. The purpose of the paper is to introduce a validated process for management competency identification and development applied in Australia - a process leading to a management competency framework with associated behavioural items that can be used to measure core management competencies of health service managers. Design/methodology/approach The management competency framework development study incorporated both qualitative and quantitative methods, implemented in four stages, including job description analysis, focus group discussions and online surveys. Findings The study confirmed that the four-stage process could identify management competencies and that the framework developed is reliable and valid for developing a management competency assessment tool that can measure management competence amongst managers in health organisations. In addition, supervisors of health service managers could use the framework to distinguish perceived superior and average performers among managers in health organisations. Practical implications Developing the core competencies of health service managers is important for management performance improvement and talent management. The six core management competencies identified can be used to guide the design of professional development activities for health service managers. Originality/value The validated management competency identification and development process can be applied in other countries and different industrial contexts to identify core management competency requirements.

  2. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  3. Predicting Overall Survival After Stereotactic Ablative Radiation Therapy in Early-Stage Lung Cancer: Development and External Validation of the Amsterdam Prognostic Model

    Energy Technology Data Exchange (ETDEWEB)

    Louie, Alexander V., E-mail: Dr.alexlouie@gmail.com [Department of Radiation Oncology, VU University Medical Center, Amsterdam (Netherlands); Department of Radiation Oncology, London Regional Cancer Program, University of Western Ontario, London, Ontario (Canada); Department of Epidemiology, Harvard School of Public Health, Harvard University, Boston, Massachusetts (United States); Haasbeek, Cornelis J.A. [Department of Radiation Oncology, VU University Medical Center, Amsterdam (Netherlands); Mokhles, Sahar [Department of Cardio-Thoracic Surgery, Erasmus University Medical Center, Rotterdam (Netherlands); Rodrigues, George B. [Department of Radiation Oncology, London Regional Cancer Program, University of Western Ontario, London, Ontario (Canada); Stephans, Kevin L. [Department of Radiation Oncology, Taussig Cancer Institute, Cleveland Clinic, Cleveland, Ohio (United States); Lagerwaard, Frank J. [Department of Radiation Oncology, VU University Medical Center, Amsterdam (Netherlands); Palma, David A. [Department of Radiation Oncology, London Regional Cancer Program, University of Western Ontario, London, Ontario (Canada); Videtic, Gregory M.M. [Department of Radiation Oncology, Taussig Cancer Institute, Cleveland Clinic, Cleveland, Ohio (United States); Warner, Andrew [Department of Radiation Oncology, London Regional Cancer Program, University of Western Ontario, London, Ontario (Canada); Takkenberg, Johanna J.M. [Department of Cardio-Thoracic Surgery, Erasmus University Medical Center, Rotterdam (Netherlands); Reddy, Chandana A. [Department of Radiation Oncology, Taussig Cancer Institute, Cleveland Clinic, Cleveland, Ohio (United States); Maat, Alex P.W.M. [Department of Cardio-Thoracic Surgery, Erasmus University Medical Center, Rotterdam (Netherlands); Woody, Neil M. [Department of Radiation Oncology, Taussig Cancer Institute, Cleveland Clinic, Cleveland, Ohio (United States); Slotman, Ben J.; Senan, Suresh [Department of Radiation Oncology, VU University Medical Center, Amsterdam (Netherlands)

    2015-09-01

    Purpose: A prognostic model for 5-year overall survival (OS), consisting of recursive partitioning analysis (RPA) and a nomogram, was developed for patients with early-stage non-small cell lung cancer (ES-NSCLC) treated with stereotactic ablative radiation therapy (SABR). Methods and Materials: A primary dataset of 703 ES-NSCLC SABR patients was randomly divided into a training (67%) and an internal validation (33%) dataset. In the former group, 21 unique parameters consisting of patient, treatment, and tumor factors were entered into an RPA model to predict OS. Univariate and multivariate models were constructed for RPA-selected factors to evaluate their relationship with OS. A nomogram for OS was constructed based on factors significant in multivariate modeling and validated with calibration plots. Both the RPA and the nomogram were externally validated in independent surgical (n=193) and SABR (n=543) datasets. Results: RPA identified 2 distinct risk classes based on tumor diameter, age, World Health Organization performance status (PS) and Charlson comorbidity index. This RPA had moderate discrimination in SABR datasets (c-index range: 0.52-0.60) but was of limited value in the surgical validation cohort. The nomogram predicting OS included smoking history in addition to RPA-identified factors. In contrast to RPA, validation of the nomogram performed well in internal validation (r²=0.97) and external SABR (r²=0.79) and surgical cohorts (r²=0.91). Conclusions: The Amsterdam prognostic model is the first externally validated prognostication tool for OS in ES-NSCLC treated with SABR available to individualize patient decision making. The nomogram retained strong performance across surgical and SABR external validation datasets. RPA performance was poor in surgical patients, suggesting that 2 different distinct patient populations are being treated with these 2 effective modalities.

  4. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  5. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. At first, we illustrate a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for the assessment of the performance of a given model by using an example taken from a management study.
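
    Two quantitative performance measures commonly used when validating a logistic regression model are discrimination (the c-statistic: the probability that a randomly chosen event is assigned a higher predicted risk than a randomly chosen non-event) and calibration-in-the-large (the observed:expected ratio). A minimal sketch with invented predictions, not data from the paper:

```python
def c_statistic(probs, outcomes):
    """Concordance: pairwise comparison of event vs non-event predictions."""
    events = [p for p, y in zip(probs, outcomes) if y == 1]
    nonevents = [p for p, y in zip(probs, outcomes) if y == 0]
    pairs = concordant = 0.0
    for pe in events:
        for pn in nonevents:
            pairs += 1
            concordant += 1.0 if pe > pn else (0.5 if pe == pn else 0.0)
    return concordant / pairs

def observed_expected_ratio(probs, outcomes):
    """Calibration-in-the-large: observed events / sum of predicted risks."""
    return sum(outcomes) / sum(probs)

probs = [0.9, 0.7, 0.6, 0.4, 0.2, 0.1]   # invented predicted risks
outcomes = [1, 1, 0, 1, 0, 0]
print(round(c_statistic(probs, outcomes), 3))            # -> 0.889
print(round(observed_expected_ratio(probs, outcomes), 2))  # -> 1.03
```

    A c-statistic of 0.5 is no better than chance and 1.0 is perfect separation; an observed:expected ratio near 1 indicates the model's average predicted risk matches the event rate in the validation data.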

  6. The validity of world class business criteria across developed and developing countries

    Directory of Open Access Journals (Sweden)

    Andre J. Parker

    2010-11-01

    Research purpose: To assess the validity of the general assumption in the literature that world class criteria are equally applicable worldwide. Motivation for research: The possibility exists that developing countries require an adjusted mix of world class criteria and practices to become globally competitive. Research design, approach and method: A quantitative field survey research approach was adopted. A web-enabled questionnaire was designed, covering 35 world class practices grouped under 7 world class criteria. A cross-section of senior management from organisations in 14 developing and 20 developed countries partook in the study. Main findings: It was empirically confirmed that the majority of world class practices posited in the literature are used by participating organisations; that world class criteria do not apply equally across developed and developing countries; and that more important than country location is the deliberate choice by an organisation's leadership to become world class. An empirically based model of ascending to world class was proposed. Practical/managerial implications: Regardless of country location, the leadership of an organisation can make their organisation world class by applying the proposed world class model. Contribution/value add: A reliable web-enabled instrument was designed that can be used to assess an organisation's world class standing; the assumption that world class criteria are equally valid across developing and developed countries was proven partially incorrect, since becoming or being world class is also a leadership choice regardless of location.

  7. Validation of Inhibition Effect in the Cellulose Hydrolysis: a Dynamic Modelling Approach

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Tsai, Chien-Tai; Meyer, Anne S.

    2011-01-01

    Enzymatic hydrolysis is one of the main steps in the processing of bioethanol from lignocellulosic raw materials. However, complete understanding of the underlying phenomena is still under development. Hence, this study has focused on validation of the inhibition effects in the cellulosic biomass...... for parameter estimation (calibration) and validation purposes. The model predictions using calibrated parameters have shown good agreement with the validation data sets, which provides credibility to the model structure and the parameter values....

  8. Establishing model credibility involves more than validation

    International Nuclear Information System (INIS)

    Kirchner, T.

    1991-01-01

    One widely used definition of validation is that the quantitative test of the performance of a model through the comparison of model predictions to independent sets of observations from the system being simulated. The ability to show that the model predictions compare well with observations is often thought to be the most rigorous test that can be used to establish credibility for a model in the scientific community. However, such tests are only part of the process used to establish credibility, and in some cases may be either unnecessary or misleading. Naylor and Finger extended the concept of validation to include the establishment of validity for the postulates embodied in the model and the test of assumptions used to select postulates for the model. Validity of postulates is established through concurrence by experts in the field of study that the mathematical or conceptual model contains the structural components and mathematical relationships necessary to adequately represent the system with respect to the goals for the model. This extended definition of validation provides for consideration of the structure of the model, not just its performance, in establishing credibility. Evaluation of a simulation model should establish the correctness of the code and the efficacy of the model within its domain of applicability. (24 refs., 6 figs.)

  9. Accounting for treatment use when validating a prognostic model: a simulation study.

    Science.gov (United States)

    Pajouheshnia, Romin; Peelen, Linda M; Moons, Karel G M; Reitsma, Johannes B; Groenwold, Rolf H H

    2017-07-14

    Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance and present the uses and limitations of available analytical methods to account for this using simulated data. We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and of inverse probability weighting (IPW) on the estimated model discrimination (c-index) and calibration (observed:expected ratio and calibration plots) in scenarios with different patterns and effects of treatment use. Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk, provided the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals, and should not be ignored. When treatment use is random, treated
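
    The weighting step described above can be sketched as follows. This is a minimal illustration with an invented cohort and an assumed treatment model, not the simulation design used in the paper:

```python
import random

random.seed(1)

# Hypothetical validation cohort: (predicted_risk, treated) pairs. The
# treatment model P(treated | risk) = 0.2 + 0.5 * risk is invented for
# this sketch; it makes treatment moderately associated with risk.
def p_treat(risk):
    return 0.2 + 0.5 * risk

cohort = []
for _ in range(5000):
    risk = random.random()
    cohort.append((risk, random.random() < p_treat(risk)))

# IPW followed by exclusion of the treated: each untreated individual is
# weighted by 1 / P(untreated | risk), so the remaining untreated sample
# again represents the full target population.
untreated = [(r, 1.0 / (1.0 - p_treat(r))) for r, treated in cohort if not treated]

weighted_mean = sum(r * w for r, w in untreated) / sum(w for _, w in untreated)
naive_mean = sum(r for r, _ in untreated) / len(untreated)

# Unweighted, the untreated under-represent high-risk individuals (they
# are treated more often); IPW recovers the overall mean risk (~0.5).
print(round(naive_mean, 3), round(weighted_mean, 3))
```

    The same reweighting idea underlies the corrected c-index and observed:expected estimates discussed in the abstract.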

  10. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    International Nuclear Information System (INIS)

    Apostolakis, J; Burkhardt, H; Ivanchenko, V N; Asai, M; Bagulya, A; Grichine, V; Brown, J M C; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Jacquemier, J; Guatelli, S; Incerti, S; Kadri, O; Maire, M; Urban, L; Pandola, L; Sawkey, D; Toshito, T; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed. (paper)

  11. Development and validation of a prediction model for long-term sickness absence based on occupational health survey variables.

    Science.gov (United States)

    Roelen, Corné; Thorsen, Sannie; Heymans, Martijn; Twisk, Jos; Bültmann, Ute; Bjørner, Jakob

    2018-01-01

    The purpose of this study is to develop and validate a prediction model for identifying employees at increased risk of long-term sickness absence (LTSA), by using variables commonly measured in occupational health surveys. Based on the literature, 15 predictor variables were retrieved from the DAnish National working Environment Survey (DANES) and included in a model predicting incident LTSA (≥4 consecutive weeks) during 1-year follow-up in a sample of 4000 DANES participants. The 15-predictor model was reduced by backward stepwise statistical techniques and then validated in a sample of 2524 DANES participants, not included in the development sample. Identification of employees at increased LTSA risk was investigated by receiver operating characteristic (ROC) analysis; the area-under-the-ROC-curve (AUC) reflected discrimination between employees with and without LTSA during follow-up. The 15-predictor model was reduced to a 9-predictor model including age, gender, education, self-rated health, mental health, prior LTSA, work ability, emotional job demands, and recognition by the management. Discrimination by the 9-predictor model was significant (AUC = 0.68; 95% CI 0.61-0.76), but not practically useful. A prediction model based on occupational health survey variables identified employees with an increased LTSA risk, but should be further developed into a practically useful tool to predict the risk of LTSA in the general working population. Implications for rehabilitation Long-term sickness absence risk predictions would enable healthcare providers to refer high-risk employees to rehabilitation programs aimed at preventing or reducing work disability. A prediction model based on health survey variables discriminates between employees at high and low risk of long-term sickness absence, but discrimination was not practically useful. Health survey variables provide insufficient information to determine long-term sickness absence risk profiles. There is a need for
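
    The reported AUC of 0.68 has a direct probabilistic reading: it is the chance that a randomly chosen employee who developed LTSA received a higher predicted risk than a randomly chosen employee who did not. A minimal rank-based (Mann-Whitney) computation, with invented scores:

```python
# AUC as the probability that a randomly chosen case outranks a randomly
# chosen non-case; ties count one half.
def auc(case_scores, control_scores):
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Illustrative predicted risks only, not data from the study.
cases = [0.9, 0.8, 0.6, 0.55]
controls = [0.7, 0.5, 0.4, 0.3]
print(auc(cases, controls))  # 0.875
```

    An AUC of 0.5 corresponds to chance-level discrimination, which is why the study judges 0.68 statistically significant but not yet practically useful.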

  12. Validation of a Hot Water Distribution Model Using Laboratory and Field Data

    Energy Technology Data Exchange (ETDEWEB)

    Backman, C.; Hoeschele, M.

    2013-07-01

    Characterizing the performance of hot water distribution systems is a critical step in developing best practice guidelines for the design and installation of high performance hot water systems. Developing and validating simulation models is critical to this effort, as well as collecting accurate input data to drive the models. In this project, the ARBI team validated the newly developed TRNSYS Type 604 pipe model against both detailed laboratory and field distribution system performance data. Validation efforts indicate that the model performs very well in handling different pipe materials, insulation cases, and varying hot water load conditions. Limitations of the model include the complexity of setting up the input file and long simulation run times. In addition to completing validation activities, this project looked at recent field hot water studies to better understand use patterns and potential behavioral changes as homeowners convert from conventional storage water heaters to gas tankless units. Based on these datasets, we conclude that the current Energy Factor test procedure overestimates typical use and underestimates the number of hot water draws. This has implications for both equipment and distribution system performance. Gas tankless water heaters were found to impact how people use hot water, but the data does not necessarily suggest an increase in usage. Further study in hot water usage and patterns is needed to better define these characteristics in different climates and home vintages.

  13. Development and internal validation of a prognostic model to predict recurrence free survival in patients with adult granulosa cell tumors of the ovary

    NARCIS (Netherlands)

    van Meurs, Hannah S.; Schuit, Ewoud; Horlings, Hugo M.; van der Velden, Jacobus; van Driel, Willemien J.; Mol, Ben Willem J.; Kenter, Gemma G.; Buist, Marrije R.

    2014-01-01

    Models to predict the probability of recurrence free survival exist for various types of malignancies, but a model for recurrence free survival in individuals with an adult granulosa cell tumor (GCT) of the ovary is lacking. We aimed to develop and internally validate such a prognostic model. We

  14. Development and validation of the primary care team dynamics survey.

    Science.gov (United States)

    Song, Hummy; Chien, Alyna T; Fisher, Josephine; Martin, Julia; Peters, Antoinette S; Hacker, Karen; Rosenthal, Meredith B; Singer, Sara J

    2015-06-01

    To develop and validate a survey instrument designed to measure team dynamics in primary care. We studied 1,080 physician and nonphysician health care professionals working at 18 primary care practices participating in a learning collaborative aimed at improving team-based care. We developed a conceptual model and administered a cross-sectional survey addressing team dynamics, and we assessed reliability and discriminant validity of survey factors and the overall survey's goodness-of-fit using structural equation modeling. We administered the survey between September 2012 and March 2013. Overall response rate was 68 percent (732 respondents). Results support a seven-factor model of team dynamics, suggesting that conditions for team effectiveness, shared understanding, and three supportive processes are associated with acting and feeling like a team and, in turn, perceived team effectiveness. This model demonstrated adequate fit (goodness-of-fit index: 0.91), scale reliability (Cronbach's alphas: 0.71-0.91), and discriminant validity (average factor correlations: 0.49). It is possible to measure primary care team dynamics reliably using a 29-item survey. This survey may be used in ambulatory settings to study teamwork and explore the effect of efforts to improve team-based care. Future studies should demonstrate the importance of team dynamics for markers of team effectiveness (e.g., work satisfaction, care quality, clinical outcomes). © Health Research and Educational Trust.
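
    Scale reliability of the kind reported here (Cronbach's alphas of 0.71-0.91) can be computed directly from item responses. A minimal sketch with invented Likert data:

```python
# Cronbach's alpha for one scale:
#   alpha = k / (k - 1) * (1 - sum(item variances) / variance(total score))
def cronbach_alpha(items):
    """items: one list of responses per item, aligned across respondents."""
    def var(xs):  # sample variance (n - 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Toy responses (1-5 Likert), three items, four respondents; invented data.
items = [[2, 4, 3, 5],
         [3, 5, 4, 5],
         [2, 5, 3, 4]]
print(round(cronbach_alpha(items), 2))
```

    In practice each of the seven factors would be scored separately, with alpha computed per subscale.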

  15. Development and Validation of Personality Disorder Spectra Scales for the MMPI-2-RF.

    Science.gov (United States)

    Sellbom, Martin; Waugh, Mark H; Hopwood, Christopher J

    2018-01-01

    The purpose of this study was to develop and validate a set of MMPI-2-RF (Ben-Porath & Tellegen, 2008/2011) personality disorder (PD) spectra scales. These scales could serve the purpose of assisting with DSM-5 PD diagnosis and help link categorical and dimensional conceptions of personality pathology within the MMPI-2-RF. We developed and provided initial validity results for scales corresponding to the 10 PD constructs listed in the DSM-5 using data from student, community, clinical, and correctional samples. Initial validation efforts indicated good support for criterion validity with an external PD measure as well as with dimensional personality traits included in the DSM-5 alternative model for PDs. Construct validity results using psychosocial history and therapists' ratings in a large clinical sample were generally supportive as well. Overall, these brief scales provide clinicians using MMPI-2-RF data with estimates of DSM-5 PD constructs that can support cross-model connections between categorical and dimensional assessment approaches.

  16. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been
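
    The report itself contains no code, but the mechanism of attenuated Bloom filters can be sketched as follows: layer 0 summarizes a node's local context sources and layer i summarizes sources believed reachable in i hops, so a lookup returns a minimum-hop estimate with possible false positives but no false negatives. All parameters and method names below are illustrative assumptions:

```python
import hashlib

class BloomFilter:
    """Fixed-size Bloom filter over an integer bitmask."""
    def __init__(self, m=256, k=3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        return all(self.bits >> p & 1 for p in self._positions(item))

    def merge(self, other):
        self.bits |= other.bits


class AttenuatedBloomFilter:
    """Layer 0 summarizes local context sources; layer i summarizes
    sources believed reachable in i hops."""
    def __init__(self, depth=3):
        self.layers = [BloomFilter() for _ in range(depth)]

    def add_local(self, service):
        self.layers[0].add(service)

    def incorporate(self, neighbour):
        # A neighbour's advertisement is attenuated: its layer i - 1
        # becomes part of our layer i.
        for i in range(1, len(self.layers)):
            self.layers[i].merge(neighbour.layers[i - 1])

    def lookup(self, service):
        # Smallest hop count at which the service may be reachable.
        for hops, layer in enumerate(self.layers):
            if service in layer:
                return hops
        return None


node_a = AttenuatedBloomFilter()
node_a.add_local("printer")
node_b = AttenuatedBloomFilter()
node_b.incorporate(node_a)  # node_b receives node_a's advertisement
print(node_a.lookup("printer"), node_b.lookup("printer"))  # 0 1
```

    The filter sizes, hash count, and propagation depth would be the parameters such a performance model explores.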

  17. Recent validation studies for two NRPB environmental transfer models

    International Nuclear Information System (INIS)

    Brown, J.; Simmonds, J.R.

    1991-01-01

    The National Radiological Protection Board (NRPB) developed a dynamic model for the transfer of radionuclides through terrestrial food chains some years ago. This model, now called FARMLAND, predicts both instantaneous and time integrals of concentration of radionuclides in a variety of foods. The model can be used to assess the consequences of both accidental and routine releases of radioactivity to the environment; and results can be obtained as a function of time. A number of validation studies have been carried out on FARMLAND. In these the model predictions have been compared with a variety of sets of environmental measurement data. Some of these studies will be outlined in the paper. A model to predict external radiation exposure from radioactivity deposited on different surfaces in the environment has also been developed at NRPB. This model, called EXPURT (EXPosure from Urban Radionuclide Transfer), can be used to predict radiation doses as a function of time following deposition in a variety of environments, ranging from rural to inner-city areas. This paper outlines validation studies and future extensions to be carried out on EXPURT. (12 refs., 4 figs.)

  18. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches for the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
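
    The permutation test recommended above can be sketched as follows. In the NTCP workflow the LASSO model would be refit inside each permutation; in this illustration the predicted scores are held fixed and only the outcome labels are shuffled, which tests whether the observed AUC could have arisen by chance:

```python
import random

# Rank-based AUC of scores against binary labels (ties count one half).
def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Permutation test: shuffle the outcomes, recompute performance, and
# report the fraction of permutations doing at least as well as the
# observed value (with an add-one correction).
def permutation_p_value(scores, labels, n_perm=2000, seed=0):
    rng = random.Random(seed)
    observed = auc(scores, labels)
    perm = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(perm)
        if auc(scores, perm) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Invented scores that separate the classes yield a small p-value.
scores = [0.9, 0.85, 0.8, 0.75, 0.7, 0.3, 0.25, 0.2, 0.15, 0.1]
labels = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
print(permutation_p_value(scores, labels))
```

    A large p-value would indicate that the apparent performance is indistinguishable from noise, which is the failure mode the authors warn about.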

  19. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van 't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.

  20. Validation of the Continuum of Care Conceptual Model for Athletic Therapy

    Directory of Open Access Journals (Sweden)

    Mark R. Lafave

    2015-01-01

    Full Text Available Utilization of conceptual models in field-based emergency care currently borrows from existing standards of the medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events, including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury through to an athlete’s return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the AT profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached the a priori 80% consensus on three domains of interest: (1) heading descriptors; (2) the order of the model; and (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline.

  1. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields.

    Science.gov (United States)

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin; Strawn, Laura K

    2016-02-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
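
    The odds ratios reported above (OR 3.0 for proximity to water, OR 2.9 for pasture) come from a standard 2x2 comparison. A minimal sketch, with counts invented only to reproduce the reported magnitude, not taken from the study's raw data:

```python
# Odds ratio from a 2x2 table:
#                   L. monocytogenes +   L. monocytogenes -
#   near water              a                    b
#   far from water          c                    d
def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Invented counts chosen to give OR = 3.0, matching the magnitude
# reported for proximity to water.
print(odds_ratio(30, 100, 12, 120))  # 3.0
```

    In the study itself the ORs come from logistic regression, which additionally provides confidence intervals and adjustment for covariates.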

  2. Development and preliminary validation of the Opioid Abuse Risk Screener

    Directory of Open Access Journals (Sweden)

    Patricia Henrie-Barrus

    2016-05-01

    Full Text Available Prescription opioid drug abuse has reached epidemic proportions. Individuals with chronic pain represent a large population at considerable risk of abusing opioids. The Opioid Abuse Risk Screener was developed as a comprehensive self-administered measure of potential risk that includes a wide range of critical elements noted in the literature to be relevant to opioid risk. The creation, refinement, and preliminary modeling of the item pool, establishment of preliminary concurrent validity, and the determination of the factor structure are presented. The initial development and validation of the Opioid Abuse Risk Screener shows promise for effective risk stratification.

  3. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  4. Biotrickling filter modeling for styrene abatement. Part 1: Model development, calibration and validation on an industrial scale.

    Science.gov (United States)

    San-Valero, Pau; Dorado, Antonio D; Martínez-Soria, Vicente; Gabaldón, Carmen

    2018-01-01

    A three-phase dynamic mathematical model based on mass balances describing the main processes in biotrickling filtration (convection, mass transfer, diffusion, and biodegradation) was calibrated and validated for the simulation of an industrial styrene-degrading biotrickling filter. The model considered the key features of industrial biotrickling filter operation, namely variable loading conditions and intermittent irrigation, by switching between the mathematical descriptions of periods with and without irrigation. The model was calibrated with steady-state data from a laboratory biotrickling filter treating inlet loads of 13-74 g C m⁻³ h⁻¹ at empty bed residence times from 30 down to 15 s. The model predicted the dynamic emission at the outlet of the biotrickling filter, simulating the small concentration peaks occurring during irrigation. The validation of the model was performed using data from a pilot on-site biotrickling filter treating styrene installed in a fiber-reinforced facility. The model predicted the performance of the biotrickling filter working under highly oscillating emissions at an inlet load in the range of 5-23 g C m⁻³ h⁻¹ and an empty bed residence time of 31 s for more than 50 days, with a goodness of fit of 0.84. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Community pharmacist attitudes towards collaboration with general practitioners: development and validation of a measure and a model

    Directory of Open Access Journals (Sweden)

    Van Connie

    2012-09-01

    Full Text Available Abstract Background Community pharmacists and general practitioners (GPs) are increasingly being encouraged to adopt more collaborative approaches to health care delivery, as collaboration in primary care has been shown to be effective in improving patient outcomes. However, little is known about pharmacist attitudes towards collaborating with their GP counterparts and the variables that influence this interprofessional collaboration. This study aims to develop and validate (1) an instrument to measure pharmacist attitudes towards collaboration with GPs and (2) a model that illustrates how pharmacist attitudes (and other variables) influence collaborative behaviour with GPs. Methods A questionnaire containing the newly developed “Attitudes Towards Collaboration Instrument for Pharmacists” (ATCI-P) and a previously validated behavioural measure, the “Frequency of Interprofessional Collaboration Instrument for Pharmacists” (FICI-P), was administered to a sample of 1215 Australian pharmacists. The ATCI-P was developed based on existing literature and qualitative interviews with GPs and community pharmacists. Principal component analysis was used to assess the structure of the ATCI-P, and the Cronbach’s alpha coefficient was used to assess the internal consistency of the instrument. Structural equation modelling was used to determine how pharmacist attitudes (as measured by the ATCI-P) and other variables influence collaborative behaviour (as measured by the FICI-P). Results Four hundred and ninety-two surveys were completed and returned for a response rate of 40%. Principal component analysis revealed the ATCI-P consisted of two factors: ‘interactional determinants’ and ‘practitioner determinants’, both with good internal consistency (Cronbach’s alpha = .90 and .93, respectively). The model demonstrated adequate fit (χ²/df = 1.89, CFI = .955, RMSEA = .062, 90% CI [.049-.074]) and illustrated that ‘interactional determinants’ was

  6. Development and validation of a mathematical model for growth of pathogens in cut melons.

    Science.gov (United States)

    Li, Di; Friedrich, Loretta M; Danyluk, Michelle D; Harris, Linda J; Schaffner, Donald W

    2013-06-01

    Many outbreaks of foodborne illness associated with the consumption of fresh-cut melons have been reported. The objective of our research was to develop a mathematical model that predicts the growth rate of Salmonella on fresh-cut cantaloupe over a range of storage temperatures and to validate that model using Salmonella and Escherichia coli O157:H7 on cantaloupe, honeydew, and watermelon, with both new data and data from published studies. The growth of Salmonella on honeydew and watermelon and of E. coli O157:H7 on cantaloupe, honeydew, and watermelon was monitored at temperatures of 4 to 25°C. The Ratkowsky (or square-root) model was used to describe Salmonella growth on cantaloupe as a function of storage temperature. Our results show that the levels of Salmonella on fresh-cut cantaloupe with an initial load of 3 log CFU/g can reach over 7 log CFU/g at 25°C within 24 h. No growth was observed at 4°C. A linear correlation was observed between the square root of the Salmonella growth rate and temperature, such that √growth rate = 0.026 × (T - 5.613), R² = 0.9779. The model was generally suitable for predicting the growth of both Salmonella and E. coli O157:H7 on cantaloupe, honeydew, and watermelon, for both new data and data from the published literature. When compared with existing models for growth of Salmonella, the new model predicts a theoretical minimum growth temperature similar to those of the ComBase Predictive Models and Pathogen Modeling Program models but lower than those of other food-specific models. The ComBase Predictive Models results are very similar to those of the model developed in this study. Our research confirms that Salmonella can grow quickly and reach high concentrations when cut cantaloupe is stored at ambient temperatures, without visual signs of spoilage. Our model provides a fast and cost-effective method to estimate the effects of storage temperature on fresh-cut melon safety and could also be used in subsequent quantitative microbial risk
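
    The fitted square-root relation can be turned directly into a small predictor. The sketch below uses the slope (0.026) and theoretical minimum growth temperature (5.613°C) quoted in the abstract; the units of the resulting rate are those of the original fit, which the abstract does not state:

```python
# Square-root (Ratkowsky) model from the abstract:
#   sqrt(growth rate) = b * (T - Tmin), with b = 0.026 and Tmin = 5.613 C.
B = 0.026      # fitted slope reported in the abstract
T_MIN = 5.613  # theoretical minimum growth temperature (deg C)

def growth_rate(temp_c):
    """Predicted growth rate; zero at or below the theoretical minimum."""
    if temp_c <= T_MIN:
        return 0.0
    return (B * (temp_c - T_MIN)) ** 2

# Consistent with the abstract: no growth at 4 C, rapid growth near 25 C.
for t in (4.0, 10.0, 25.0):
    print(t, round(growth_rate(t), 4))
```

    The quadratic form follows from squaring the linear relation between √rate and temperature.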

  7. Development and initial validation of the determinants of physical activity questionnaire.

    Science.gov (United States)

    Taylor, Natalie; Lawton, Rebecca; Conner, Mark

    2013-06-11

    Physical activity interventions are more likely to be effective if they target causal determinants of behaviour change. Targeting requires accurate identification of specific theoretical determinants of physical activity. Two studies were undertaken to develop and validate the Determinants of Physical Activity Questionnaire. In Study 1, 832 male and female university staff and students were recruited from 49 universities across the UK and completed the 66-item measure, which is based on the Theoretical Domains Framework. Confirmatory factor analysis (CFA) was undertaken on a calibration sample to generate the model, resulting in the removal of 31 items. A validation sample was used to cross-validate the model. Twenty new items were then added, and Study 2 tested the revised model in a sample of 466 male and female university students together with a physical activity measure. The final model consisted of 11 factors and 34 items, and CFA produced a reasonable fit, χ²(472) = 852.3, p < .001, CFI = .933, SRMR = .105, RMSEA = .042 (CI = .037-.046), as well as generally acceptable levels of discriminant validity, internal consistency, and test-retest reliability. Eight subscales significantly differentiated between high and low exercisers, indicating that those who exercise less report more barriers to physical activity. A theoretically underpinned measure of determinants of physical activity has been developed with reasonable reliability and validity. Further work is required to test the measure in a more representative sample. This study provides an innovative approach to identifying potential barriers to physical activity. This approach illustrates a method for moving from diagnosing implementation difficulties to designing and evaluating interventions.

  8. Development and validation of a prediction model for tube feeding dependence after curative (chemo)radiation in head and neck cancer.

    Directory of Open Access Journals (Sweden)

    Kim Wopken

    Full Text Available BACKGROUND: Curative radiotherapy or chemoradiation for head and neck cancer (HNC) may result in severe acute and late side effects, including tube feeding dependence. The purpose of this prospective cohort study was to develop a prediction model for tube feeding dependence 6 months (TUBEM6) after curative (chemo)radiotherapy in HNC patients. PATIENTS AND METHODS: Tube feeding dependence was scored prospectively. To develop the multivariable model, a group LASSO analysis was carried out with TUBEM6 as the primary endpoint (n = 427). The model was then validated in a test cohort (n = 183). The training cohort was divided into three groups based on the risk of TUBEM6 to test whether the model could be extrapolated to later time points (12, 18 and 24 months). RESULTS: The most important predictors for TUBEM6 were weight loss prior to treatment, advanced T-stage, positive N-stage, bilateral neck irradiation, accelerated radiotherapy and chemoradiation. Model performance was good, with an area under the curve of 0.86 in the training cohort and 0.82 in the test cohort. The TUBEM6-based risk groups were significantly associated with tube feeding dependence at later time points (p < 0.001). CONCLUSION: We established an externally validated predictive model for tube feeding dependence after curative radiotherapy or chemoradiation, which can be used to predict TUBEM6.

  9. Development and Validation of the Sorokin Psychosocial Love Inventory for Divorced Individuals

    Science.gov (United States)

    D'Ambrosio, Joseph G.; Faul, Anna C.

    2013-01-01

    Objective: This study describes the development and validation of the Sorokin Psychosocial Love Inventory (SPSLI) measuring love actions toward a former spouse. Method: Classical measurement theory and confirmatory factor analysis (CFA) were utilized with an a priori theory and factor model to validate the SPSLI. Results: A 15-item scale…

  10. Validation Data and Model Development for Fuel Assembly Response to Seismic Loads

    International Nuclear Information System (INIS)

    Bardet, Philippe; Ricciardi, Guillaume

    2016-01-01

    Vibrations are inherently present in nuclear reactors, especially in the cores and steam generators of pressurized water reactors (PWR). They can have significant effects on local heat transfer and on wear and tear in the reactor, and often set safety margins. The simulation of these multiphysics phenomena from first principles requires the coupling of several codes, which is one of the most challenging tasks in modern computer simulation. Here an ambitious multiphysics, multidisciplinary validation campaign was conducted. It relied on an integrated team of experimentalists and code developers to acquire benchmark and validation data for fluid-structure interaction codes. Data are focused on PWR fuel bundle behavior during seismic transients.

  11. Validation Data and Model Development for Fuel Assembly Response to Seismic Loads

    Energy Technology Data Exchange (ETDEWEB)

    Bardet, Philippe [George Washington Univ., Washington, DC (United States)]; Ricciardi, Guillaume [Atomic Energy Commission (CEA) (France)]

    2016-01-31

    Vibrations are inherently present in nuclear reactors, especially in cores and steam generators of pressurized water reactors (PWR). They can have significant effects on local heat transfer and wear and tear in the reactor and often set safety margins. The simulation of these multiphysics phenomena from first principles requires the coupling of several codes, which is one of the most challenging tasks in modern computer simulation. Here an ambitious multiphysics multidisciplinary validation campaign is conducted. It relied on an integrated team of experimentalists and code developers to acquire benchmark and validation data for fluid-structure interaction codes. Data are focused on PWR fuel bundle behavior during seismic transients.

  12. Validation and calibration of structural models that combine information from multiple sources.

    Science.gov (United States)

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
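
    The distinction drawn here — calibration viewed as a formal estimation procedure when the goal is quantitative inference — can be made concrete with a toy structural model whose free parameter is chosen to minimize disagreement with observed targets. The model form, data, and grid search are all invented for illustration:

```python
def model_output(rate, exposures):
    """Toy structural model: predicted event count rises linearly with exposure."""
    return [rate * e for e in exposures]

def calibrate(observed, exposures, candidate_rates):
    """Pick the parameter value whose outputs best match the observed data
    (least squares) -- i.e., calibration as an estimation procedure."""
    def loss(rate):
        return sum((m - o) ** 2
                   for m, o in zip(model_output(rate, exposures), observed))
    return min(candidate_rates, key=loss)
```

    In this framing, questions of parameter identifiability become concrete: if two candidate rates produced indistinguishable outputs for every available target, the data could not discriminate between them.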

  13. Development and initial validation of the volition in exercise questionnaire (VEQ)

    DEFF Research Database (Denmark)

    Elsborg, Peter; Wikman, Johan Michael; Nielsen, Glen

    2017-01-01

    The present study describes the development and validation of an instrument to measure volition in the exercise context. Volition describes an individual’s self-regulatory mental processes that are responsible for taking and maintaining a desirable action (e.g., exercising regularly). The scale...... questionnaire with strong model fit and good internal consistency. In addition, the Volition in Exercise Questionnaire showed convergent validity because it was able to predict exercise participation. It showed incremental validity by explaining additional variance to the Sport Motivation Scale’s well...

  14. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹ TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs
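
    The validation step described — comparing model-predicted contaminant concentrations against lysimeter measurements at matched time/depth points — can be sketched minimally as an error metric plus an acceptance test. The data values and the 20% tolerance are invented for illustration:

```python
def rmse(predicted, observed):
    """Root-mean-square error between paired predictions and measurements."""
    n = len(predicted)
    return (sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n) ** 0.5

def acceptably_close(predicted, observed, rel_tol=0.2):
    """Crude acceptance test: RMSE within rel_tol of the mean observed level."""
    mean_obs = sum(observed) / len(observed)
    return rmse(predicted, observed) <= rel_tol * mean_obs
```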

  15. Validation of the Colorado Retinopathy of Prematurity Screening Model.

    Science.gov (United States)

    McCourt, Emily A; Ying, Gui-Shuang; Lynch, Anne M; Palestine, Alan G; Wagner, Brandie D; Wymore, Erica; Tomlinson, Lauren A; Binenbaum, Gil

    2018-04-01

    The Colorado Retinopathy of Prematurity (CO-ROP) model uses birth weight, gestational age, and weight gain at the first month of life (WG-28) to predict risk of severe retinopathy of prematurity (ROP). In previous validation studies, the model performed very well, predicting virtually all cases of severe ROP and potentially reducing the number of infants who need ROP examinations, warranting validation in a larger, more diverse population. To validate the performance of the CO-ROP model in a large multicenter cohort. This study is a secondary analysis of data from the Postnatal Growth and Retinopathy of Prematurity (G-ROP) Study, a retrospective multicenter cohort study conducted in 29 hospitals in the United States and Canada between January 2006 and June 2012 of 6351 premature infants who received ROP examinations. Sensitivity and specificity for severe (early treatment of ROP [ETROP] type 1 or 2) ROP, and reduction in infants receiving examinations. The CO-ROP model was applied to the infants in the G-ROP data set with all 3 data points (infants would have received examinations if they met all 3 criteria: birth weight, large validation cohort. The model requires all 3 criteria to be met to signal a need for examinations, but some infants with a birth weight or gestational age above the thresholds developed severe ROP. Most of these infants who were not detected by the CO-ROP model had obvious deviation in expected weight trajectories or nonphysiologic weight gain. These findings suggest that the CO-ROP model needs to be revised before considering implementation into clinical practice.
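
    The structure of the CO-ROP rule — an infant is flagged for examination only if all three criteria (birth weight, gestational age, WG-28) are met, and validation then reports sensitivity and specificity for severe ROP — can be sketched as below. The cutoff values are placeholders, not the published thresholds:

```python
def flag_for_exam(bw_g, ga_weeks, wg28_g, bw_cut=1500, ga_cut=30, wg_cut=650):
    """CO-ROP-style rule: screen only if ALL THREE criteria are met.
    Cutoffs here are illustrative placeholders."""
    return bw_g <= bw_cut and ga_weeks <= ga_cut and wg28_g <= wg_cut

def sensitivity_specificity(cohort):
    """cohort: iterable of (flagged, severe_rop) boolean pairs."""
    tp = sum(1 for f, s in cohort if f and s)
    fn = sum(1 for f, s in cohort if not f and s)
    tn = sum(1 for f, s in cohort if not f and not s)
    fp = sum(1 for f, s in cohort if f and not s)
    return tp / (tp + fn), tn / (tn + fp)
```

    The toy cohort in the test illustrates the failure mode the abstract reports: an infant above the birth-weight threshold who nonetheless develops severe ROP is missed by the all-three-criteria rule.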

  16. Power-based electric vehicle energy consumption model: Model development and validation

    International Nuclear Information System (INIS)

    Fiori, Chiara; Ahn, Kyoungho; Rakha, Hesham A.

    2016-01-01

    Highlights: • The study developed an instantaneous energy consumption model (VT-CPEM) for EVs. • The model captures instantaneous braking energy regeneration. • The model can be used for transportation modeling and vehicle applications (e.g. eco-routing). • The proposed model can be easily calibrated using publicly available EV data. • Usage of air conditioning and heating systems increases EV energy consumption by up to 10% and 24%, respectively. - Abstract: The limited drive range (the maximum distance that an EV can travel) of Electric Vehicles (EVs) is one of the major challenges that EV manufacturers are attempting to overcome. To this end, a simple, accurate, and efficient energy consumption model is needed to develop real-time eco-driving and eco-routing systems that can enhance the energy efficiency of EVs and thus extend their travel range. Although numerous publications have focused on the modeling of EV energy consumption levels, these studies are limited to measuring energy consumption of an EV’s control algorithm, macro-project evaluations, or simplified well-to-wheels analyses. Consequently, this paper addresses this need by developing a simple EV energy model that computes an EV’s instantaneous energy consumption using second-by-second vehicle speed, acceleration and roadway grade data as input variables. In doing so, the model estimates the instantaneous braking energy regeneration. The proposed model can be easily implemented in the following applications: in-vehicle, Smartphone eco-driving, eco-routing and transportation simulation software to quantify the network-wide energy consumption levels for a fleet of EVs. One of the main advantages of EVs is their ability to recover energy while braking using a regenerative braking system. State-of-the-art vehicle energy consumption models consider an average constant regenerative braking energy efficiency or regenerative braking factors that are mainly dependent on the vehicle’s average
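
    The core idea — instantaneous power computed from second-by-second speed, acceleration and grade, with negative (braking) power discounted by a regenerative efficiency — can be sketched with generic longitudinal dynamics. The parameter values below are typical placeholders, not the calibrated VT-CPEM constants:

```python
def instantaneous_power_kw(v_ms, accel_ms2, grade=0.0,
                           mass_kg=1500.0, c_rolling=0.015, c_drag=0.28,
                           frontal_area_m2=2.3, rho_air=1.2256, g=9.8066,
                           eta_driveline=0.90, eta_regen=0.65):
    """Tractive power at one time step. Positive power is drawn from the
    battery (divided by driveline efficiency); negative power is recovered
    at the regenerative-braking efficiency."""
    f_inertia = mass_kg * accel_ms2
    f_grade = mass_kg * g * grade  # grade as a small-angle fraction
    f_rolling = mass_kg * g * c_rolling if v_ms > 0 else 0.0
    f_aero = 0.5 * rho_air * c_drag * frontal_area_m2 * v_ms ** 2
    p_wheel_w = (f_inertia + f_grade + f_rolling + f_aero) * v_ms
    if p_wheel_w >= 0:
        return p_wheel_w / eta_driveline / 1000.0
    return p_wheel_w * eta_regen / 1000.0
```

    Integrating this power over a drive cycle's time steps yields trip energy; the asymmetry between the driveline and regeneration efficiencies is what makes the braking-energy term worth modeling instantaneously rather than as a constant average factor.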

  17. Validation by simulation of a clinical trial model using the standardized mean and variance criteria.

    Science.gov (United States)

    Abbas, Ismail; Rovira, Joan; Casanovas, Josep

    2006-12-01

    To develop and validate a model of a clinical trial that evaluates the changes in cholesterol level as a surrogate marker for lipodystrophy in HIV subjects under alternative antiretroviral regimes, i.e., treatment with Protease Inhibitors vs. a combination of nevirapine and other antiretroviral drugs. Five simulation models were developed based on different assumptions, on treatment variability and pattern of cholesterol reduction over time. The last recorded cholesterol level, the difference from the baseline, the average difference from the baseline and level evolution, are the considered endpoints. Specific validation criteria based on a standardized distance in means and variances within ±10% were used to compare the real and the simulated data. The validity criterion was met by all models for the considered endpoints. However, only two models met the validity criterion when all endpoints were considered. The model based on the assumption that within-subjects variability of cholesterol levels changes over time is the one that minimizes the validity criterion, standardized distance within ±1%. Simulation is a useful technique for calibration, estimation, and evaluation of models, which allows us to relax the often overly restrictive assumptions regarding parameters required by analytical approaches. The validity criterion can also be used to select the preferred model for design optimization, until additional data are obtained allowing an external validation of the model.
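
    The validity criterion described — standardized distances in means and variances kept within a ±10% band — can be written down directly. This is a pure-Python sketch; the exact standardization used in the paper may differ:

```python
from statistics import mean, pstdev, pvariance

def standardized_distances(real, simulated):
    """Distance in means scaled by the real SD, and distance in variances
    scaled by the real variance."""
    d_mean = abs(mean(simulated) - mean(real)) / pstdev(real)
    d_var = abs(pvariance(simulated) - pvariance(real)) / pvariance(real)
    return d_mean, d_var

def meets_criterion(real, simulated, tol=0.10):
    """±tol acceptance band on both standardized distances."""
    d_mean, d_var = standardized_distances(real, simulated)
    return d_mean <= tol and d_var <= tol
```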

  18. Development and validation of a low-frequency modeling code for high-moment transmitter rod antennas

    Science.gov (United States)

    Jordan, Jared Williams; Sternberg, Ben K.; Dvorak, Steven L.

    2009-12-01

    The goal of this research is to develop and validate a low-frequency modeling code for high-moment transmitter rod antennas to aid in the design of future low-frequency TX antennas with high magnetic moments. To accomplish this goal, a quasi-static modeling algorithm was developed to simulate finite-length, permeable-core, rod antennas. This quasi-static analysis is applicable for low frequencies where eddy currents are negligible, and it can handle solid or hollow cores with winding insulation thickness between the antenna's windings and its core. The theory was programmed in Matlab, and the modeling code has the ability to predict the TX antenna's gain, maximum magnetic moment, saturation current, series inductance, and core series loss resistance, provided the user enters the corresponding complex permeability for the desired core magnetic flux density. In order to utilize the linear modeling code to model the effects of nonlinear core materials, it is necessary to use the correct complex permeability for a specific core magnetic flux density. In order to test the modeling code, we demonstrated that it can accurately predict changes in the electrical parameters associated with variations in the rod length and the core thickness for antennas made out of low carbon steel wire. These tests demonstrate that the modeling code was successful in predicting the changes in the rod antenna characteristics under high-current nonlinear conditions due to changes in the physical dimensions of the rod provided that the flux density in the core was held constant in order to keep the complex permeability from changing.
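
    The quantities the code predicts (series inductance, magnetic moment) follow from the effective permeability of a finite permeable rod, which is sharply reduced from the material permeability by demagnetization. A minimal sketch using the textbook prolate-ellipsoid approximation — a stand-in for the paper's full quasi-static algorithm, not a reproduction of it:

```python
import math

MU0 = 4.0e-7 * math.pi  # permeability of free space, H/m

def effective_rod_permeability(mu_r, length_m, diameter_m):
    """Prolate-ellipsoid approximation of the demagnetizing factor for a
    long rod; reasonable only for length/diameter >> 1."""
    m = length_m / diameter_m
    n_d = (math.log(2.0 * m) - 1.0) / m ** 2
    return mu_r / (1.0 + n_d * (mu_r - 1.0))

def rod_inductance_h(n_turns, mu_r, length_m, diameter_m):
    """Series inductance of a winding spanning the full rod length."""
    area = math.pi * (diameter_m / 2.0) ** 2
    mu_rod = effective_rod_permeability(mu_r, length_m, diameter_m)
    return MU0 * mu_rod * n_turns ** 2 * area / length_m

def magnetic_moment_am2(n_turns, current_a, mu_r, length_m, diameter_m):
    """Moment of the core-loaded winding: the bare N*I*A moment boosted by
    the rod's effective permeability."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return (effective_rod_permeability(mu_r, length_m, diameter_m)
            * n_turns * current_a * area)
```

    This also illustrates why, as the abstract notes, the complex permeability must be supplied at the operating flux density: in the nonlinear regime `mu_r` itself depends on drive level, and the linear formulas above only hold once that value is fixed.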

  19. Investigating the construct validity of a development assessment centre

    Directory of Open Access Journals (Sweden)

    Nadia M. Brits

    2013-11-01

    Research purpose: The aim of this study was to determine the construct validity of a one-day development assessment centre (DAC) using a convenience sample of 202 managers in a large South African banking institution. Motivation for the study: Although the AC method is popular, it has been widely criticised as to whether it predominantly measures the dimensions it is designed to measure. Research design, approach and method: The fit of the measurement models implied by the dimensions measured was analysed in a quantitative study using an ex post facto correlation design and structural equation modelling. Main findings: Bi-factor confirmatory factor analysis was used to assess the relative contribution of higher-order exercise and dimension effects. Empirical under-identification stemming from the small number of exercises designed to reflect designated latent dimensions restricted the number of DAC dimensions that could be evaluated. Ultimately, only one global dimension had enough measurement points and was analysed. The results suggested that dimension effects explained the majority of variance in the post-exercise dimension ratings. Practical/managerial implications: Candidates’ proficiency on each dimension was used as the basis for development reports. The validity of inferences holds important implications for candidates’ career development and growth. Contribution/value-add: The authors found only one study on construct validity of AC dimensions in the South African context. The present study is the first to use the bi-factor approach. This study will consequently contribute to the scarce AC literature in South Africa.

  20. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  1. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented; a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burgers' equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented including error band based, multivariate, sum of squares of residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made
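
    The Monte Carlo propagation the tutorial describes can be sketched on the same damped spring-mass example: sample the uncertain stiffness and damping, push each draw through the model, and summarize the spread of the predicted displacement. The distributions and nominal values below are invented for illustration:

```python
import math
import random

def displacement(t, mass, k, c, x0=1.0):
    """Free response of an underdamped spring-mass-damper released from x0."""
    omega_d = math.sqrt(k / mass - (c / (2.0 * mass)) ** 2)
    return x0 * math.exp(-c * t / (2.0 * mass)) * math.cos(omega_d * t)

def propagate_uncertainty(t, n_samples=5000, seed=42):
    """Monte Carlo: sample k and c, return mean and SD of displacement(t)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_samples):
        k = rng.gauss(100.0, 5.0)  # stiffness, N/m
        c = rng.gauss(1.0, 0.1)    # damping, N*s/m
        out.append(displacement(t, 1.0, k, c))
    mu = sum(out) / n_samples
    sd = (sum((x - mu) ** 2 for x in out) / (n_samples - 1)) ** 0.5
    return mu, sd
```

    The resulting mean and SD give the error band against which an experimental observation at time `t` would be tested in the validation step.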

  2. Results from the Savannah River Laboratory model validation workshop

    International Nuclear Information System (INIS)

    Pepper, D.W.

    1981-01-01

    To evaluate existing and newly developed air pollution models used in DOE-funded laboratories, the Savannah River Laboratory sponsored a model validation workshop. The workshop used Kr-85 measurements and meteorology data obtained at SRL during 1975 to 1977. Individual laboratories used models to calculate daily, weekly, monthly or annual test periods. Cumulative integrated air concentrations were reported at each grid point and at each of the eight sampler locations

  3. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    D.M. Jolley

    2001-12-18

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  4. In-Drift Microbial Communities Model Validation Calculations

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-09-24

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  5. In-Drift Microbial Communities Model Validation Calculation

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-10-31

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  6. In-Drift Microbial Communities Model Validation Calculations

    International Nuclear Information System (INIS)

    Jolley, D.M.

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data

  7. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    International Nuclear Information System (INIS)

    D.M. Jolley

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data

  8. Accounting for treatment use when validating a prognostic model: a simulation study

    Directory of Open Access Journals (Sweden)

    Romin Pajouheshnia

    2017-07-01

    Full Text Available Abstract Background Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance and present the uses and limitations of available analytical methods to account for this using simulated data. Methods We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and the use of inverse probability weighting (IPW) on the estimated model discrimination (c-index) and calibration (observed:expected ratio and calibration plots) in scenarios with different patterns and effects of treatment use. Results Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk when the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. Conclusions When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals, and
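
    The IPW correction described — weight each untreated individual by the inverse probability of remaining untreated, exclude the treated, then recompute performance measures on the weighted sample — can be sketched as below. The propensities would come from a fitted treatment model; here they are just inputs:

```python
def ipw_untreated_weights(treated, propensity):
    """Weight each untreated individual by 1 / P(untreated); treated
    individuals are excluded (weight zero), as in the abstract's approach."""
    return [0.0 if t else 1.0 / (1.0 - p)
            for t, p in zip(treated, propensity)]

def weighted_oe_ratio(events, predicted_risks, weights):
    """Observed:expected calibration ratio on the weighted sample."""
    observed = sum(w * e for w, e in zip(weights, events))
    expected = sum(w * p for w, p in zip(weights, predicted_risks))
    return observed / expected
```

    The abstract's caveats map directly onto this sketch: non-positivity means some untreated individuals have propensity near 1, making `1/(1 - p)` explode, and an unobserved confounder means the propensities themselves are wrong.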

  9. The Perceived Leadership Communication Questionnaire (PLCQ): Development and Validation.

    Science.gov (United States)

    Schneider, Frank M; Maier, Michaela; Lovrekovic, Sara; Retzbach, Andrea

    2015-01-01

    The Perceived Leadership Communication Questionnaire (PLCQ) is a short, reliable, and valid instrument for measuring leadership communication from both perspectives of the leader and the follower. Drawing on a communication-based approach to leadership and following a theoretical framework of interpersonal communication processes in organizations, this article describes the development and validation of a one-dimensional 6-item scale in four studies (total N = 604). Results from Study 1 and 2 provide evidence for the internal consistency and factorial validity of the PLCQ's self-rating version (PLCQ-SR), a version for measuring how leaders perceive their own communication with their followers. Results from Study 3 and 4 show internal consistency, construct validity, and criterion validity of the PLCQ's other-rating version (PLCQ-OR), a version for measuring how followers perceive the communication of their leaders. Cronbach's α had an average of .80 over the four studies. All confirmatory factor analyses yielded good to excellent model fit indices. Convergent validity was established by average positive correlations of .69 with subdimensions of transformational leadership and leader-member exchange scales. Furthermore, nonsignificant correlations with socially desirable responding indicated discriminant validity. Last, criterion validity was supported by a moderately positive correlation with job satisfaction (r = .31).
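
    The internal-consistency figure reported (α averaging .80) is Cronbach's alpha, which for a k-item scale compares the summed item variances to the variance of the total score. A minimal computation on toy data:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one list of respondent scores per item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = len(item_scores)
    totals = [sum(person) for person in zip(*item_scores)]
    sum_item_var = sum(pvariance(item) for item in item_scores)
    return (k / (k - 1)) * (1.0 - sum_item_var / pvariance(totals))
```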

  10. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for the systems when it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).
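
    The ARR idea can be illustrated with three redundant sensors that should agree: each pairwise redundancy relation yields a boolean residual, and the pattern of violated residuals logically singles out the faulty sensor without any failure probabilities. The tolerance and signature table below are illustrative, not taken from the paper:

```python
def residual_pattern(s1, s2, s3, tol=0.5):
    """Boolean residuals of the three pairwise redundancy relations."""
    return (abs(s1 - s2) > tol, abs(s2 - s3) > tol, abs(s1 - s3) > tol)

# Fault signatures: which single-sensor fault explains each residual pattern.
SIGNATURES = {
    (False, False, False): set(),   # all relations hold: no fault inferred
    (True, False, True): {"s1"},
    (True, True, False): {"s2"},
    (False, True, True): {"s3"},
}

def diagnose(s1, s2, s3, tol=0.5):
    """Logically infer the faulty sensor (if any) from the residual pattern."""
    pattern = residual_pattern(s1, s2, s3, tol)
    # patterns outside the table are not explainable by a single fault
    return SIGNATURES.get(pattern, {"s1", "s2", "s3"})
```

    Note the contrast with the probabilistic approach the abstract criticizes: the diagnosis here is a logical consequence of the model and the readings, not a most-probable candidate.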

  11. Issues in developing valid assessments of speech pathology students' performance in the workplace.

    Science.gov (United States)

    McAllister, Sue; Lincoln, Michelle; Ferguson, Alison; McAllister, Lindy

    2010-01-01

    Workplace-based learning is a critical component of professional preparation in speech pathology. A validated assessment of this learning is seen as 'the gold standard', but it is difficult to develop because of design and validation issues. These issues include the role and nature of judgement in assessment, challenges in measuring quality, and the relationship between assessment and learning. Valid assessment of workplace-based performance needs to capture the development of competence over time and account for both occupation-specific and generic competencies. This paper reviews important conceptual issues in the design of valid and reliable workplace-based assessments of competence, including assessment content, process, impact on learning, measurement issues, and validation strategies. It then goes on to share what has been learned about quality assessment and validation of a workplace-based performance assessment using competency-based ratings. The outcomes of a four-year national development and validation of an assessment tool are described. A literature review of issues in conceptualizing, designing, and validating workplace-based assessments was conducted. Key factors to consider in the design of a new tool were identified and built into the cycle of design, trialling, and data analysis in the validation stages of the development process. This paper provides an accessible overview of factors to consider in the design and validation of workplace-based assessment tools. It presents strategies used in the development and national validation of a tool, COMPASS, used in every speech pathology programme in Australia, New Zealand, and Singapore. The paper also describes Rasch analysis, a model-based statistical approach that is useful for establishing the validity and reliability of assessment tools. Through careful attention to conceptual and design issues in the development and trialling of workplace-based assessments, it has been possible to develop the
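    The Rasch analysis mentioned above rests on a simple probabilistic core: in the dichotomous Rasch model, the chance that a person succeeds on an item depends only on the difference between person ability θ and item difficulty b. A minimal sketch, with invented θ and b values:

```python
import math

def rasch_probability(theta, difficulty):
    """Dichotomous Rasch model: P(success) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

# A student whose ability equals the item difficulty has an even chance;
# the probability rises monotonically with ability.
print(rasch_probability(0.0, 0.0))                # → 0.5
print(round(rasch_probability(2.0, 0.0), 3))      # → 0.881
```

    Fitting the model estimates θ and b jointly from observed ratings, which is what allows item difficulties and student competencies to be placed on one common scale.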

  12. Model Validation Using Coordinate Distance with Performance Sensitivity

    Directory of Open Access Journals (Sweden)

    Jiann-Shiun Lew

    2008-01-01

    This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector, corresponding to the identified principal directions, of the validation system is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.
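    The coordinate-distance metric described above can be sketched as follows: once the principal directions are identified, the interval model gives bounds per coordinate, and a validation system scores zero if its coordinates fall inside all intervals. The coordinates and bounds below are invented, not the beam-structure data of the paper.

```python
def coordinate_distance(coords, intervals):
    """Distance from a validation system's principal coordinates to an interval model.

    coords:    coordinates of the validation system along the identified
               principal directions of parameter change.
    intervals: (low, high) bounds of the interval model per coordinate.
    A coordinate inside its interval contributes zero; the metric is the
    Euclidean norm of the out-of-bound excesses.
    """
    sq = 0.0
    for c, (lo, hi) in zip(coords, intervals):
        if c < lo:
            sq += (lo - c) ** 2
        elif c > hi:
            sq += (c - hi) ** 2
    return sq ** 0.5

# Hypothetical: two coordinates inside their intervals, one 0.3 beyond its bound
d = coordinate_distance([0.1, -0.4, 1.3], [(-0.5, 0.5), (-0.5, 0.5), (-1.0, 1.0)])
print(round(d, 2))  # → 0.3
```

    A nonzero distance flags how far the validation system lies outside the uncertainty envelope captured by the identified interval model.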

  13. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  14. Development and validation of an Eulerian model towards the simulation of fuel injection in internal combustion engines; Developpement et validation d'un modele eulerien en vue de la simulation des jets de carburants dans les moteurs a combustion interne

    Energy Technology Data Exchange (ETDEWEB)

    Truchot, B.

    2005-12-15

    The objective of this work is to develop an Eulerian two-phase model to improve the prediction of fuel injection in internal combustion engines, particularly in the dense liquid zone close to the nozzle. Lagrangian models, usually used in engine simulations, are based on the assumption of dispersed two-phase flow with low liquid volume fraction, which is not fulfilled in direct-injection engines. Different Eulerian approaches are available in the literature. The physical phenomena that occur near the nozzle and the characteristics of each model lead to the choice of a two-fluid, two-pressure model. Several open terms appear in the equations of the model: exchanges between the two phases and turbulent correlations. Closures of the exchange terms are based on the spherical-droplet hypothesis, while a RANS approach is adopted to close the turbulent correlations. This model has been integrated into the IFP CFD code, IFP-C3D. Several numerical tests and analytical validations (for single- and two-phase flows) were then carried out to check the correct implementation of the equations and the predictivity of the model and closures. Modifications in the turbulence model of the gas required validations in both the gas phase (flow behind a sudden enlargement) and the liquid phase (pure liquid injection). A two-phase mixing layer was then used to validate the whole model. Finally, injection tests were performed under realistic conditions (similar to those encountered in automotive engines) to check the feasibility of engine computations using the developed Eulerian approach. These tests also confirmed the compatibility of this approach with the specificities of engine simulations (especially mesh movement). (author)

  15. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software, like Geant4 and GeantV, that performs data simulation using physics models and Monte Carlo techniques. As experimental physicists, we face the problem of comparing the results obtained from simulations with what the experiments actually observed. One way to address the problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library that will be added to the automated validation process on the Geant Validation Portal.
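    A common form of such a consistency test is a Pearson chi-square comparison of binned simulated and measured distributions. A minimal sketch (the binned counts are invented, and a real test would also compare the statistic to a chi-square quantile for the number of bins):

```python
def chi_square_statistic(observed, expected):
    """Pearson chi-square statistic comparing a measured histogram with a
    simulated (expected) one, bin by bin; small values indicate consistency."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

# Hypothetical binned counts: experiment vs. Monte Carlo prediction
experiment = [52, 48, 31, 19]
simulation = [50, 45, 35, 20]
print(round(chi_square_statistic(experiment, simulation), 3))  # → 0.787
```

    The statistic is then judged against the chi-square distribution with (number of bins − 1) degrees of freedom to accept or reject agreement at a chosen significance level.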

  16. Validation analysis of probabilistic models of dietary exposure to food additives.

    Science.gov (United States)

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group, and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent of brands or the per cent of eating occasions within a food group that contained an additive. Since each of the three model components assumed two possible modes of input, the validity of eight (2³) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of the full conceptual models. While the distributions of intake estimates from the models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
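    The structure of such a probabilistic intake model, combining a food-intake draw, a presence probability, and a concentration draw, can be sketched as below. All inputs are invented placeholders, not values from the reference database, and real models would draw from fitted (e.g. lognormal) distributions rather than resample raw data.

```python
import random

def simulate_additive_intake(n_sims, food_intakes, presence_prob, concentrations):
    """Monte Carlo estimate of a food-additive intake distribution.

    Each iteration draws a food intake (g/day) and a concentration (mg/kg)
    from the input data, and counts the additive only with the probability
    that it is actually present in the food group.
    """
    intakes = []
    for _ in range(n_sims):
        if random.random() < presence_prob:
            grams = random.choice(food_intakes)
            mg_per_kg = random.choice(concentrations)
            intakes.append(grams / 1000.0 * mg_per_kg)  # mg/day
        else:
            intakes.append(0.0)
    return intakes

random.seed(1)
# Hypothetical inputs: daily intakes of a food group and additive concentrations
result = simulate_additive_intake(10000, [150, 220, 310], 0.4, [20, 35, 50])
mean_intake = sum(result) / len(result)
print(round(mean_intake, 2))
```

    The resulting distribution, rather than a single conservative point estimate, is what gets compared against the MPL-based estimate and the brand-level 'true' intakes.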

  17. Perception of competence in middle school physical education: instrument development and validation.

    Science.gov (United States)

    Scrabis-Fletcher, Kristin; Silverman, Stephen

    2010-03-01

    Perception of Competence (POC) has been studied extensively in physical activity (PA) research, with similar instruments adapted for physical education (PE) research. Such instruments do not account for the unique PE learning environment. Therefore, an instrument was developed and its scores validated to measure POC in middle school PE. A multiphase design was used, consisting of an intensive theoretical review, an elicitation study, a prepilot study, a pilot study, a content validation study, and a final validation study (N = 1281). Data analysis included a multistep iterative process to identify the best model fit. A three-factor model for POC was tested and resulted in root mean square error of approximation = .09, root mean square residual = .07, goodness-of-fit index = .90, and adjusted goodness-of-fit index = .86, values in the acceptable range (Hu & Bentler, 1999). A two-factor model was also tested and resulted in a good fit (two-factor fit index values = .05, .03, .98, .97, respectively). The results of this study suggest that an instrument using a three- or two-factor model provides reliable and valid scores of POC measurement in middle school PE.
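    One of the fit indices reported above, the root mean square error of approximation (RMSEA), has a closed form given the model chi-square, its degrees of freedom, and the sample size. A minimal sketch; the chi-square and df values below are invented, not taken from this study.

```python
import math

def rmsea(chi_square, df, n):
    """Root mean square error of approximation for a factor model.
    Values at or below about .06 are conventionally taken as good fit
    (Hu & Bentler, 1999)."""
    return math.sqrt(max(chi_square - df, 0.0) / (df * (n - 1)))

# Hypothetical chi-square for a model fit to N = 1281 responses
print(round(rmsea(chi_square=180.0, df=64, n=1281), 3))  # → 0.038
```

    Because the excess of chi-square over df is divided by N − 1, RMSEA rewards parsimonious models and penalizes misfit per degree of freedom rather than raw misfit.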

  18. Development and Validation of an Older Occupant Finite Element Model of a Mid-Sized Male for Investigation of Age-related Injury Risk.

    Science.gov (United States)

    Schoell, Samantha L; Weaver, Ashley A; Urban, Jillian E; Jones, Derek A; Stitzel, Joel D; Hwang, Eunjoo; Reed, Matthew P; Rupp, Jonathan D; Hu, Jingwen

    2015-11-01

    The aging population is a growing concern, as the increased fragility and frailty of the elderly result in an elevated incidence of injury as well as an increased risk of mortality and morbidity. To assess elderly injury risk, age-specific computational models can be developed to directly calculate biomechanical metrics for injury. The first objective was to develop an older occupant Global Human Body Models Consortium (GHBMC) average male model (M50) representative of a 65 year old (YO) and to perform regional validation tests to investigate predicted fractures and injury severity with age. Development of the GHBMC M50 65 YO model involved implementing geometric, cortical thickness, and material property changes with age. Regional validation tests included a chest impact, a lateral impact, a shoulder impact, a thoracoabdominal impact, an abdominal bar impact, a pelvic impact, and a lateral sled test. The second objective was to investigate age-related injury risks by performing a frontal US NCAP simulation test with the GHBMC M50 65 YO and the GHBMC M50 v4.2 models. Simulation results were compared to the GHBMC M50 v4.2 to evaluate the effect of age on occupant response and risk for head injury, neck injury, thoracic injury, and lower extremity injury. Overall, the GHBMC M50 65 YO model predicted higher probabilities of AIS 3+ injury for the head and thorax.

  19. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation

    Science.gov (United States)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana

    2017-06-01

    A simulation model of a parabolic-trough solar collector developed in the Modelica® language is calibrated and validated. The calibration is performed in order to approximate the behavior of the solar collector model to that of a real one, given the uncertainty in some of the system parameters; that is, measured data are used during the calibration process. Afterwards, the validation of this calibrated model is performed. During the validation, the results obtained from the model are compared to those obtained during real operation of a collector at the Plataforma Solar de Almeria (PSA).

  20. Development and Validation of Triarchic Construct Scales from the Psychopathic Personality Inventory

    Science.gov (United States)

    Hall, Jason R.; Drislane, Laura E.; Patrick, Christopher J.; Morano, Mario; Lilienfeld, Scott O.; Poythress, Norman G.

    2014-01-01

    The Triarchic model of psychopathy describes this complex condition in terms of distinct phenotypic components of boldness, meanness, and disinhibition. Brief self-report scales designed specifically to index these psychopathy facets have thus far demonstrated promising construct validity. The present study sought to develop and validate scales for assessing facets of the Triarchic model using items from a well-validated existing measure of psychopathy—the Psychopathic Personality Inventory (PPI). A consensus rating approach was used to identify PPI items relevant to each Triarchic facet, and the convergent and discriminant validity of the resulting PPI-based Triarchic scales were evaluated in relation to multiple criterion variables (i.e., other psychopathy inventories, antisocial personality disorder features, personality traits, psychosocial functioning) in offender and non-offender samples. The PPI-based Triarchic scales showed good internal consistency and related to criterion variables in ways consistent with predictions based on the Triarchic model. Findings are discussed in terms of implications for conceptualization and assessment of psychopathy. PMID:24447280

  1. Thermal hydraulic model validation for HOR mixed core fuel management

    International Nuclear Information System (INIS)

    Gibcus, H.P.M.; Vries, J.W. de; Leege, P.F.A. de

    1997-01-01

    A thermal-hydraulic core management model has been developed for the Hoger Onderwijsreactor (HOR), a 2 MW pool-type university research reactor. The model was adopted for safety analysis purposes in the framework of HEU/LEU core conversion studies. It is applied in the thermal-hydraulic computer code SHORT (Steady-state HOR Thermal-hydraulics), which is presently in use for designing core configurations and for in-core fuel management. An elaborate measurement program was performed to establish the core hydraulic characteristics for a variety of conditions. The hydraulic data were obtained with a dummy fuel element with special equipment allowing, among other things, direct measurement of the true core flow rate. Using these data, the thermal-hydraulic model was validated experimentally. The model, experimental tests, and model validation are discussed. (author)

  2. Numerical studies and metric development for validation of magnetohydrodynamic models on the HIT-SI experiment

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, C., E-mail: hansec@uw.edu [PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Columbia University, New York, New York 10027 (United States); Victor, B.; Morgan, K.; Hossack, A.; Sutherland, D. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); Jarboe, T.; Nelson, B. A. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Marklin, G. [PSI-Center, University of Washington, Seattle, Washington 98195 (United States)

    2015-05-15

    We present application of three scalar metrics derived from the Biorthogonal Decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high performance operation with different injector frequencies are presented, illustrating application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.
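    The core of a BD-based metric like those above is a correlation between corresponding mode structures extracted from experiment and simulation. A minimal sketch of one such measure, a normalized inner product between two spatial modes; the probe-array mode shapes below are invented, not HIT-SI data.

```python
def mode_correlation(mode_a, mode_b):
    """Normalized inner product between two spatial mode structures
    (e.g., magnetic-probe topos from a biorthogonal decomposition);
    1.0 means identical shape, 0.0 means orthogonal."""
    dot = sum(a * b for a, b in zip(mode_a, mode_b))
    norm_a = sum(a * a for a in mode_a) ** 0.5
    norm_b = sum(b * b for b in mode_b) ** 0.5
    return abs(dot) / (norm_a * norm_b)

# Hypothetical probe-array mode shapes from experiment and simulation
experiment_mode = [0.1, 0.5, 0.9, 0.5, 0.1]
simulation_mode = [0.12, 0.48, 0.95, 0.52, 0.08]
print(round(mode_correlation(experiment_mode, simulation_mode), 3))  # → 0.999
```

    In the full technique, the modes themselves come from a singular value decomposition of the probes-by-time data matrix, so agreement can be scored mode by mode without assuming any spatial or temporal structure in advance.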

  3. Validation of PV-RPM Code in the System Advisor Model.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Geoffrey Taylor [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
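    The kind of component failure-and-repair simulation validated above can be sketched in a few lines. This is a deliberately crude illustration with invented parameters (a fixed annual failure probability and fixed repair time), not the PV-RPM distributions, which are fitted to field data.

```python
import random

def simulate_failures(n_components, years, annual_failure_rate, repair_days):
    """Crude reliability sketch: each year, each component fails with a fixed
    probability, and every failure costs `repair_days` of downtime."""
    total_failures = 0
    downtime_days = 0
    for _ in range(n_components):
        for _ in range(years):
            if random.random() < annual_failure_rate:
                total_failures += 1
                downtime_days += repair_days
    return total_failures, downtime_days

random.seed(7)
failures, downtime = simulate_failures(n_components=500, years=25,
                                       annual_failure_rate=0.01, repair_days=14)
print(failures, downtime)
```

    Validating two implementations of such a model amounts to checking that, given the same input distributions, the estimated failure counts and downtimes agree within Monte Carlo noise, which is how the SAM and proof-of-concept results were compared.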

  4. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  5. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods have weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.

  6. Development, description and validation of a Tritium Environmental Release Model (TERM).

    Science.gov (United States)

    Jeffers, Rebecca S; Parker, Geoffrey T

    2014-01-01

    Tritium is a radioisotope of hydrogen that exists naturally in the environment and may also be released through anthropogenic activities. It bonds readily with hydrogen and oxygen atoms to form tritiated water, which then cycles through the hydrosphere. This paper seeks to model the migration of tritiated species throughout the environment, including atmospheric, river, and coastal systems, more comprehensively and more consistently across release scenarios than existing models in the literature. A review of the features and underlying conceptual models of some existing tritium release models was conducted, and an underlying aggregated conceptual process model was defined, which is presented. The new model, dubbed the 'Tritium Environmental Release Model' (TERM), was then tested against multiple validation sets from the literature, including experimental data and reference tests for tritium models. TERM has been shown to be capable of providing reasonable results that are broadly comparable with atmospheric HTO release models from the literature, spanning both continuous and discrete release conditions. TERM also performed well when compared with atmospheric data. TERM is believed to be a useful tool for examining discrete and continuous atmospheric releases or combinations thereof. TERM also includes further capabilities (e.g. river and coastal release scenarios) that may be applicable to certain scenarios that atmospheric models alone may not handle well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Development of boiling transition analysis code TCAPE-INS/B based on mechanistic methods for BWR fuel bundles. Models and validations with boiling transition experimental data

    International Nuclear Information System (INIS)

    Ishida, Naoyuki; Utsuno, Hideaki; Kasahara, Fumio

    2003-01-01

    The boiling transition (BT) analysis code TCAPE-INS/B, based on mechanistic methods coupled with subchannel analysis, has been developed for evaluating the integrity of Boiling Water Reactor (BWR) fuel rod bundles under abnormal operations. The objective of the development is to evaluate BT without the empirical BT and rewetting correlations that current analysis methods require for each bundle design. TCAPE-INS/B consists mainly of a drift-flux model, a film flow model, a cross-flow model, a thermal conductivity model, and heat transfer correlations. These models were validated systematically against experimental data. In these validations, the accuracy of the predictions of the steady-state Critical Heat Flux (CHF) and of the transient fuel rod surface temperature after the occurrence of BT was evaluated. Calculations of experiments with a single tube and with bundles were carried out to validate the models incorporated in the code. The results showed that the steady-state CHF was predicted within about 6% average error. In the transient calculations, the BT timing and the fuel rod surface temperature agreed well with experimental results, but rewetting was predicted late. Therefore, the modelling of heat transfer phenomena during post-BT is under modification. (author)
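    The drift-flux model named above relates the void fraction to the gas and liquid superficial velocities through a distribution parameter C0 and a drift velocity Vgj. A minimal sketch; the C0 and Vgj values are generic illustrative round-tube numbers, not the correlations used in TCAPE-INS/B.

```python
def drift_flux_void_fraction(j_gas, j_liquid, c0=1.13, v_gj=0.24):
    """Drift-flux relation: alpha = j_g / (C0 * (j_g + j_l) + Vgj).

    c0 (distribution parameter) and v_gj (drift velocity, m/s) are
    illustrative placeholder values.
    """
    j_total = j_gas + j_liquid
    return j_gas / (c0 * j_total + v_gj)

# Hypothetical superficial velocities in m/s
alpha = drift_flux_void_fraction(j_gas=1.5, j_liquid=1.0)
print(round(alpha, 3))  # → 0.489
```

    Because C0 > 1 and Vgj > 0 account for non-uniform void distribution and relative phase motion, the predicted void fraction is lower than the homogeneous value j_g/j, which matters directly for film dryout and hence BT prediction.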

  8. Flight Testing an Iced Business Jet for Flight Simulation Model Validation

    Science.gov (United States)

    Ratvasky, Thomas P.; Barnhart, Billy P.; Lee, Sam; Cooper, Jon

    2007-01-01

    A flight test of a business jet aircraft with various ice accretions was performed to obtain data to validate flight simulation models developed through wind tunnel tests. Three types of ice accretions were tested: pre-activation roughness, runback shapes that form downstream of the thermal wing ice protection system, and a wing ice protection system failure shape. The high-fidelity flight simulation models of this business jet aircraft were validated using a software tool called "Overdrive." Through comparisons of flight-extracted aerodynamic forces and moments to simulation-predicted forces and moments, the simulation models were successfully validated. Only minor adjustments in the simulation database were required to obtain an adequate match, signifying that the process used to develop the simulation models was successful. The simulation models were implemented in the NASA Ice Contamination Effects Flight Training Device (ICEFTD) to enable company pilots to evaluate the flight characteristics of the simulation models. By and large, the pilots confirmed good similarity to the flight characteristics of the real airplane. However, pilots noted pitch-up tendencies at stall with the flaps extended that were not representative of the airplane, and identified some differences in pilot forces. The elevator hinge moment model and the implementation of the control forces on the ICEFTD were identified as drivers in the pitch-ups and control-force issues, and will be an area for future work.

  9. MATSIM -The Development and Validation of a Numerical Voxel Model based on the MATROSHKA Phantom

    Science.gov (United States)

    Beck, Peter; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Latocha, Marcin; Vana, Norbert; Zechner, Andrea; Reitz, Guenther

    The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center. The aim of the project is to develop a voxel-based model of the MATROSHKA anthropomorphic torso used at the International Space Station (ISS) as a foundation for performing Monte Carlo high-energy particle transport simulations for different irradiation conditions. Funded by the Austrian Space Applications Programme (ASAP), MATSIM is a co-investigation with the European Space Agency (ESA) ELIPS project MATROSHKA, an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. The MATROSHKA facility is designed to determine the radiation exposure of an astronaut onboard the ISS, especially during an extravehicular activity. The numerical model developed in the frame of MATSIM is validated by reference measurements. In this report we give an overview of the model development and compare photon and neutron irradiations of the detector-equipped phantom torso with Monte Carlo simulations using FLUKA. Exposure to Co-60 photons was realized in the standard irradiation laboratory at Seibersdorf, while investigations with neutrons were performed at the thermal column of the Vienna TRIGA Mark-II reactor. The phantom was loaded with passive thermoluminescence dosimeters. In addition, first results of the calculated dose distribution within the torso are presented for a simulated exposure in low-Earth orbit.

  10. Validating EHR clinical models using ontology patterns.

    Science.gov (United States)

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
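    The closed-world cardinality checking that makes SHACL suitable here can be illustrated without an RDF toolchain. The sketch below validates a dict-based record against a shape of (min, max) counts; the shape, property names, and record are invented stand-ins for a SHACL shape and a CIMI model instance, and under the CWA a missing property is a violation rather than merely unknown.

```python
def validate_cardinality(record, shape):
    """Closed-world check in the spirit of a SHACL shape: every property in the
    shape must appear in the record with a count inside [min, max]. Missing
    properties are violations (closed world), unlike OWL's open-world reading."""
    violations = []
    for prop, (min_count, max_count) in shape.items():
        count = len(record.get(prop, []))
        if not (min_count <= count <= max_count):
            violations.append((prop, count))
    return violations

# Hypothetical shape for a clinical-model node: exactly one code, 1..2 values
shape = {"code": (1, 1), "value": (1, 2)}
record = {"code": ["loinc:8480-6"]}         # has a code, but no value
print(validate_cardinality(record, shape))  # → [('value', 0)]
```

    An OWL reasoner would not flag the missing value (it might simply exist unstated elsewhere); a SHACL processor reports it, which is exactly the behaviour needed to catch modelling and binding errors in clinical models.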

  11. The development and validation of the Blended Socratic Method of Teaching (BSMT): An instructional model to enhance critical thinking skills of undergraduate business students

    Directory of Open Access Journals (Sweden)

    Eugenia Arazo Boa

    2018-01-01

    Enhancing critical thinking skills is one of the paramount goals of many educational institutions. This study presents the development and validation of the Blended Socratic Method of Teaching (BSMT), a teaching model intended to foster the critical thinking skills of business students at the undergraduate level. The main objectives of the study were (1) to survey the critical thinking skills of undergraduate business students, and (2) to develop and validate the BSMT model designed to enhance critical thinking skills. The research procedure comprised two phases corresponding to the two research objectives: (1) surveying the critical thinking skills of 371 undergraduate business students at Naresuan University International College, focusing on the three critical thinking competencies of the RED model (recognize assumptions, evaluate arguments, and draw conclusions) and determining the level of their critical thinking; and (2) developing the instructional model, followed by validation of the model by five experts. The results of the study were: (1) the undergraduate business students have deficient critical thinking based on the RED model competencies, as they scored "below average" on the critical thinking appraisal; and (2) the developed model comprises six elements: focus, syntax, principles of reaction, the social system, the support system, and application. The experts were in complete agreement that the model is "highly appropriate" for improving the critical thinking skills of business students. The main essence of the model is the syntax, comprising five steps: group assignment, analysis, and writing of case studies; group presentation of the business case analysis in class; Socratic discussion/questioning in class; posting of the case study on the class Facebook account; and online Socratic discussion/questioning. The BSMT model is an authentic and comprehensive model combining the Socratic method of teaching, information and

  12. Development and validation of a CFD model predicting the backfill process of a nuclear waste gallery

    International Nuclear Information System (INIS)

    Gopala, Vinay Ramohalli; Lycklama a Nijeholt, Jan-Aiso; Bakker, Paul; Haverkate, Benno

    2011-01-01

    Research highlights: → This work presents a CFD simulation of the backfill process for Supercontainers with nuclear waste emplaced in a disposal gallery. → The cement-based material used for backfill is grout, and the flow of grout is modelled as a Bingham fluid. → The model is verified against an analytical solution and validated against flowability tests for concrete. → A comparison between a backfill plexiglas experiment and the simulation shows a distinct difference in the filling pattern. → The numerical model needs to be further developed to include segregation effects and the thixotropic behavior of grout. - Abstract: Nuclear waste material may be stored in underground tunnels for long-term storage. The example treated in this article is based on the current Belgian disposal concept for High-Level Waste (HLW), in which the nuclear waste material is packed in concrete-shielded packages, called Supercontainers, which are inserted into these tunnels. After placement of the packages in the underground tunnels, the remaining voids between the packages and the tunnel lining are filled with a cement-based material called grout in order to encase the stored containers in the underground spacing. This encasement of the stored containers inside the tunnels is known as the backfill process. A good backfill process is necessary to stabilize the waste gallery against ground settlements. A numerical model that simulates the backfill process can help to improve and optimize the process by ensuring a homogeneous filling with no air voids, and by optimizing the injection positions to achieve a homogeneous filling. The objective of the present work is to develop such a numerical code that can predict the backfill process well, and to validate the model against the available experiments and analytical solutions. In the present work the rheology of grout is modelled as a Bingham fluid, which is implemented in OpenFOAM - a finite volume-based open source computational fluid
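    The Bingham constitutive law used for the grout is simple to state: below a yield stress τ₀ the material does not flow, and above it the shear stress grows linearly with shear rate through a plastic viscosity μ_p. A minimal sketch; the parameter values are illustrative placeholders, not measured grout properties.

```python
def bingham_stress(shear_rate, yield_stress=10.0, plastic_viscosity=0.05):
    """Bingham constitutive law: tau = tau_0 + mu_p * gamma_dot once the
    yield stress is exceeded; at zero shear rate the material does not flow."""
    if shear_rate == 0.0:
        return 0.0  # no flow: the stress is indeterminate up to tau_0
    return yield_stress + plastic_viscosity * shear_rate

# Hypothetical shear rate of 100 1/s
print(round(bingham_stress(100.0), 2))  # → 15.0 (Pa)
```

    In a CFD implementation this law is usually regularized (e.g. by capping an effective viscosity at low shear rates) so the yield-stress discontinuity does not break the finite-volume solver.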

  13. Analysis of various quality attributes of sunflower and soybean plants by near-infrared reflectance spectroscopy: Development and validation of calibration models

    Science.gov (United States)

    Sunflower and soybean are summer annuals that can be grown as an alternative to corn and may be particularly useful in organic production systems. Rapid and low cost methods of analyzing plant quality would be helpful for crop management. We developed and validated calibration models for Near-infrar...

  14. Tracer travel time and model validation

    International Nuclear Information System (INIS)

    Tsang, Chin-Fu.

    1988-01-01

    The performance assessment of a nuclear waste repository demands much more than the safety evaluation of civil constructions such as dams, or the resource evaluation of a petroleum or geothermal reservoir. It involves estimating the low-probability (low-concentration) transport of radionuclides extrapolated thousands of years into the future. Thus the models used to make these estimates need to be carefully validated. A number of recent efforts have been devoted to the study of this problem. Some general comments on model validation were given by Tsang. The present paper discusses some issues of validation with regard to radionuclide transport. 5 refs

  15. Development and Validation of Multi-Dimensional Personality ...

    African Journals Online (AJOL)

    This study was carried out to establish the scientific processes for the development and validation of Multi-dimensional Personality Inventory (MPI). The process of development and validation occurred in three phases with five components of Agreeableness, Conscientiousness, Emotional stability, Extroversion, and ...

  16. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

    Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, a thermal hydraulic code for the safety analysis of the advanced integral reactor. The applicability of the thermal hydraulic models within the code needs to be validated. Among these models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, calculated results to validate the gap conductance model are demonstrated by comparing with the results of the MARS code with the test case
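The gap conductance couples fuel and cladding temperatures across the gas-filled gap. A simplified illustration of the idea (this is not the actual TASS/SMR-S or MARS correlation, which also includes contact and temperature-jump-distance terms; all numbers are illustrative):

```python
STEFAN_BOLTZMANN = 5.670374419e-8  # W/(m^2*K^4)

def gap_conductance(k_gas, gap_width, t_fuel, t_clad, emissivity=0.8):
    """Simplified fuel-cladding gap conductance (W/m^2-K): gas conduction
    across an open gap plus a radiative term; production fuel-performance
    codes add contact and temperature-jump terms omitted here."""
    h_gas = k_gas / gap_width                      # conduction through the gas
    h_rad = (STEFAN_BOLTZMANN * emissivity         # linearised radiation term
             * (t_fuel**2 + t_clad**2) * (t_fuel + t_clad))
    return h_gas + h_rad
```

The conduction term dominates for thin gaps, which is why the gap width variation with power fraction matters so much in the comparison described above.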

  17. Calibration and validation of coarse-grained models of atomic systems: application to semiconductor manufacturing

    Science.gov (United States)

    Farrell, Kathryn; Oden, J. Tinsley

    2014-07-01

    Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models has arisen. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability to predict key quantities of interest and (2), above all, the determination of the coarse-grained model itself; that is, the characterization of the molecular architecture, the choice of interaction potentials and thus parameters, which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized by an appropriate statistical mechanics framework, in this work by canonical ensembles involving only configurational energies. The all-atom model thus supplies data for Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data. We demonstrate the theory and
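The machinery the abstract names (maximum-entropy priors, Gaussian likelihoods, model plausibilities) can be illustrated with a toy evidence calculation: a uniform prior on a bounded interval is the maximum-entropy choice, and the marginal likelihood (evidence) of each candidate model sets its relative plausibility. Everything below (the data, the rival models, the grid) is illustrative, not from the paper:

```python
import math

def gaussian_loglik(theta, data, model, sigma):
    """Log-likelihood assuming Gaussian parameter-to-observation error."""
    return sum(-0.5 * math.log(2.0 * math.pi * sigma ** 2)
               - (y - model(theta, x)) ** 2 / (2.0 * sigma ** 2)
               for x, y in data)

def log_evidence(data, model, thetas, sigma):
    """Grid approximation of log p(data | model) under a uniform prior over
    the grid `thetas` (uniform = maximum entropy on a bounded interval).
    Comparing evidences of rival models gives their relative plausibilities."""
    liks = [math.exp(gaussian_loglik(t, data, model, sigma)) for t in thetas]
    return math.log(sum(liks) / len(liks))

# Toy "all-atom" data generated by y = 2x; two candidate coarse models
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
linear = lambda theta, x: theta * x      # right structural form
constant = lambda theta, x: theta        # too coarse a model
grid = [i * 0.1 for i in range(41)]      # uniform prior on [0, 4]
```

Here the linear model attains the larger evidence, so it is the more plausible of the two candidates given the data, which is the same logic the paper applies to competing coarse-grained architectures.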

  18. The development and validation of a new collision processor for MONK

    International Nuclear Information System (INIS)

    Connolly, Simon; Grimstone, Malcolm

    2003-01-01

    This paper summarises the progress with a major development project for the MONK Monte Carlo criticality code, namely the development of a new collision processing modelling package and nuclear data library. The development has now reached the final validation stage and further results will become available over the coming months in the period leading up to the release of MONK9. (J.P.N.)

  19. Liver stiffness value-based risk estimation of late recurrence after curative resection of hepatocellular carcinoma: development and validation of a predictive model.

    Directory of Open Access Journals (Sweden)

    Kyu Sik Jung

    Full Text Available Preoperative liver stiffness (LS) measurement using transient elastography (TE) is useful for predicting late recurrence after curative resection of hepatocellular carcinoma (HCC). We developed and validated a novel LS value-based predictive model for late recurrence of HCC. Patients who were due to undergo curative resection of HCC between August 2006 and January 2010 were prospectively enrolled and TE was performed prior to operation per the study protocol. The predictive model of late recurrence was constructed based on a multiple logistic regression model. Discrimination and calibration were used to validate the model. Among a total of 139 patients who were finally analyzed, late recurrence occurred in 44 patients, with a median follow-up of 24.5 months (range, 12.4-68.1). We developed a predictive model for late recurrence of HCC using LS value, activity grade II-III, presence of multiple tumors, and indocyanine green retention rate at 15 min (ICG R15), which showed fairly good discrimination capability with an area under the receiver operating characteristic curve (AUROC) of 0.724 (95% confidence intervals [CIs], 0.632-0.816). In the validation, using a bootstrap method to assess discrimination, the AUROC remained largely unchanged between iterations, with an average AUROC of 0.722 (95% CIs, 0.718-0.724). When we plotted a calibration chart for predicted and observed risk of late recurrence, the predicted risk of late recurrence correlated well with observed risk, with a correlation coefficient of 0.873 (P<0.001). A simple LS value-based predictive model could estimate the risk of late recurrence in patients who underwent curative resection of HCC.
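The internal validation in this record recomputes the AUROC over bootstrap resamples of the cohort. A minimal sketch of that procedure (the labels and scores are synthetic, not the study's data):

```python
import random

def auroc(labels, scores):
    """AUROC via the Mann-Whitney formulation: the probability that a
    randomly chosen positive case outranks a randomly chosen negative one."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_auroc(labels, scores, n_iter=200, seed=42):
    """Resample cases with replacement and recompute the AUROC each time,
    as in the study's internal validation; returns the mean bootstrap AUROC."""
    rng = random.Random(seed)
    n = len(labels)
    aucs = []
    while len(aucs) < n_iter:
        idx = [rng.randrange(n) for _ in range(n)]
        lab = [labels[i] for i in idx]
        if 0 < sum(lab) < n:  # both classes must be present in the resample
            aucs.append(auroc(lab, [scores[i] for i in idx]))
    return sum(aucs) / len(aucs)
```

A bootstrap mean close to the apparent AUROC, as the authors report (0.722 vs 0.724), indicates little optimism in the fitted model.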

  20. Development and validation of a dynamical atmosphere-vegetation-soil HTO transport and OBT formation model

    Energy Technology Data Exchange (ETDEWEB)

    Ota, Masakazu, E-mail: ohta.masakazu@jaea.go.jp [Research Group for Environmental Science, Division of Environment and Radiation, Nuclear Science and Engineering Directorate, Japan Atomic Energy Agency (Japan); Nagai, Haruyasu [Research Group for Environmental Science, Division of Environment and Radiation, Nuclear Science and Engineering Directorate, Japan Atomic Energy Agency (Japan)

    2011-09-15

    A numerical model was developed that simulates the transport of tritiated water (HTO) in the atmosphere-soil-vegetation system and the accumulation of organically bound tritium (OBT) in vegetative leaves. A characteristic of the model is that, for calculating tritium transport, it incorporates a dynamical atmosphere-soil-vegetation model (SOLVEG-II) that calculates the transport of heat and water and the exchange of CO2. The processes included for calculating tissue free water tritium (TFWT) in leaves are HTO exchange between canopy air and leaf cellular water, root uptake of aqueous HTO in soil, photosynthetic assimilation of TFWT into OBT, and TFWT formation from OBT through respiration. Tritium fluxes from the last two processes are input to a carbohydrate compartment model in leaves that calculates OBT translocation from leaves and allocation within them, using the photosynthesis and respiration rates in leaves. The developed model was then validated through a simulation of an existing experiment on acute exposure of grape plants to atmospheric HTO. The calculated TFWT concentration in leaves increased soon after the start of HTO exposure, reaching equilibrium with the atmospheric HTO within a few hours, and then rapidly decreased after the end of the exposure. The calculated non-exchangeable OBT amount in leaves increased linearly during the exposure and, after the exposure, decreased rapidly in daytime and moderately at night. These variations in the calculated TFWT concentrations and OBT amounts, mainly controlled by HTO exchange between canopy air and leaf cellular water and by carbohydrate translocation from leaves, respectively, agreed with the observations within average errors of a factor of two. - Highlights: > TFWT retention and OBT formation in leaves were modeled > The model fairly well calculates TFWT concentration after an acute HTO exposure > The model well assesses OBT formation and attenuation of OBT amount in leaves.
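The leaf TFWT dynamics summarised above (rapid approach to equilibrium with canopy-air HTO, then decay once the exposure ends) are, at their simplest, first-order exchange. A forward-Euler sketch of that behaviour; the rate constant, time step and concentrations are illustrative, not the model's actual parameterisation:

```python
def simulate_tfwt(c_air_series, k_exchange, dt, c0=0.0):
    """Forward-Euler integration of first-order HTO exchange between canopy
    air and leaf tissue free water: dC_leaf/dt = k * (C_air - C_leaf)."""
    c = c0
    out = []
    for c_air in c_air_series:
        c += k_exchange * (c_air - c) * dt
        out.append(c)
    return out

# Illustrative acute exposure: 10 h at an air HTO concentration of 100
# (arbitrary units), then 10 h of clean air; dt = 0.1 h, k = 0.5 per hour.
exposure = [100.0] * 100 + [0.0] * 100
leaf_tfwt = simulate_tfwt(exposure, 0.5, 0.1)
```

The trace reproduces the qualitative shape described in the abstract: a rise toward equilibrium during exposure and a washout afterwards.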

  1. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  2. External Validation of a Prediction Model for Successful External Cephalic Version

    NARCIS (Netherlands)

    de Hundt, Marcella; Vlemmix, Floortje; Kok, Marjolein; van der Steeg, Jan W.; Bais, Joke M.; Mol, Ben W.; van der Post, Joris A.

    2012-01-01

    We sought external validation of a prediction model for the probability of a successful external cephalic version (ECV). We evaluated the performance of the prediction model with calibration and discrimination. For clinical practice, we developed a score chart to calculate the probability of a

  3. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    KAUST Repository

    Miki, Kenji

    2015-10-01

    © 2015 Elsevier Inc. The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.

  4. Animal models of binge drinking, current challenges to improve face validity.

    Science.gov (United States)

    Jeanblanc, Jérôme; Rolland, Benjamin; Gierski, Fabien; Martinetti, Margaret P; Naassila, Mickael

    2018-05-05

    Binge drinking (BD), i.e., consuming a large amount of alcohol in a short period of time, is an increasing public health issue. Though no clear definition has been adopted worldwide, the speed of drinking seems to be a keystone of this behavior. Developing relevant animal models of BD is a priority for gaining a better characterization of the neurobiological and psychobiological mechanisms underlying this dangerous and harmful behavior. Until recently, preclinical research on BD has been conducted mostly using forced administration of alcohol, but more recent studies used scheduled access to alcohol, to model more voluntary excessive intakes, and to achieve signs of intoxication that mimic the human behavior. The main challenges for future research are discussed regarding the need for good face validity, construct validity and predictive validity of animal models of BD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Development and Validation of a Risk-Score Model for Type 2 Diabetes: A Cohort Study of a Rural Adult Chinese Population.

    Directory of Open Access Journals (Sweden)

    Ming Zhang

    Full Text Available Some global models to predict the risk of diabetes may not be applicable to local populations. We aimed to develop and validate a score to predict type 2 diabetes mellitus (T2DM) in a rural adult Chinese population. Data for a cohort of 12,849 participants were randomly divided into derivation (n = 11,564) and validation (n = 1285) datasets. A questionnaire interview and physical and blood biochemical examinations were performed at baseline (July to August 2007 and July to August 2008) and follow-up (July to August 2013 and July to October 2014). A Cox regression model was used to weight each variable in the derivation dataset. For each significant variable, a score was calculated by multiplying β by 100 and rounding to the nearest integer. Age, body mass index, triglycerides and fasting plasma glucose (scores 3, 12, 24 and 76, respectively) were predictors of incident T2DM. The model accuracy was assessed by the area under the receiver operating characteristic curve (AUC), with optimal cut-off value 936. With the derivation dataset, sensitivity, specificity and AUC of the model were 66.7%, 74.0% and 0.768 (95% CI 0.760-0.776), respectively. With the validation dataset, the performance of the model was superior to the Chinese (simple), FINDRISC, Oman and IDRS models of T2DM risk but equivalent to the Framingham model, which is widely applicable in a variety of populations. Our model for predicting 6-year risk of T2DM could be used in a rural adult Chinese population.
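The score construction described (multiply each Cox β by 100, round to the nearest integer, and compare the total against the cut-off of 936) can be sketched as follows. The coefficient values here are back-derived for illustration from the reported points (3, 12, 24, 76) and are not the published coefficients; consult the paper for the real values and units:

```python
def score_points(beta):
    """Convert a Cox regression coefficient to integer points, as described:
    multiply beta by 100 and round to the nearest integer."""
    return round(beta * 100)

# HYPOTHETICAL coefficients back-derived from the reported points
# (age 3, BMI 12, triglycerides 24, fasting plasma glucose 76).
COEFFS = {"age": 0.03, "bmi": 0.12, "triglycerides": 0.24, "fpg": 0.76}
CUTOFF = 936  # reported optimal cut-off for the total score

def total_score(measurements):
    """Sum points times measured value for each predictor."""
    return sum(score_points(COEFFS[k]) * v for k, v in measurements.items())
```

A participant whose total falls below the cut-off would be classified low-risk under this scheme.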

  6. Development and validation of an in vitro–in vivo correlation (IVIVC) model for propranolol hydrochloride extended-release matrix formulations

    Directory of Open Access Journals (Sweden)

    Chinhwa Cheng

    2014-06-01

    Full Text Available The objective of this study was to develop an in vitro–in vivo correlation (IVIVC) model for hydrophilic matrix extended-release (ER) propranolol dosage formulations. The in vitro release characteristics of the drug were determined using USP apparatus I at 100 rpm, in a medium of varying pH (from pH 1.2 to pH 6.8). In vivo plasma concentrations and pharmacokinetic parameters in male beagle dogs were obtained after administering oral ER formulations and immediate-release (IR) commercial products. The similarity factor f2 was used to compare the dissolution data. The IVIVC model was developed using pooled fraction dissolved and fraction absorbed of propranolol ER formulations, ER-F and ER-S, with different release rates. An additional formulation ER-V, with a different release rate of propranolol, was prepared for evaluating the external predictability. The results showed that the percentage prediction error (%PE) values of Cmax and AUC0–∞ were 0.86% and 5.95%, respectively, for the external validation study. The observed low prediction errors for Cmax and AUC0–∞ demonstrated that the propranolol IVIVC model was valid.
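The similarity factor f2 used to compare the dissolution profiles has a standard closed form, f2 = 50*log10(100/sqrt(1 + (1/n)*sum((R_t - T_t)^2))), where R_t and T_t are the reference and test percent dissolved at each time point. A direct implementation (the profile values in the test are illustrative):

```python
import math

def similarity_f2(reference, test):
    """FDA/EMA dissolution similarity factor:
        f2 = 50 * log10(100 / sqrt(1 + mean squared difference))
    f2 >= 50 is conventionally taken to indicate similar profiles."""
    if len(reference) != len(test):
        raise ValueError("profiles must share the same time points")
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))
```

Identical profiles give f2 = 100, and an average point-wise difference of 10% dissolved gives f2 just under 50, the conventional similarity threshold.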

  7. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational-fluid-dynamics (CFD) spectral gas radiation models for air combustion are therefore out of their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review about fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models with high resolution spectral measurements in a gas cell with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example for a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and gives a ranking of the proposed models based on their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 µm of water vapor and carbon dioxide in the temperature range from 727 °C to 1500 °C and at different concentrations were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as a reference model for the

  8. Predictive modeling of infrared radiative heating in tomato dry-peeling process: Part II. Model validation and sensitivity analysis

    Science.gov (United States)

    A predictive mathematical model was developed to simulate heat transfer in a tomato undergoing double sided infrared (IR) heating in a dry-peeling process. The aims of this study were to validate the developed model using experimental data and to investigate different engineering parameters that mos...

  9. Validation of a Hot Water Distribution Model Using Laboratory and Field Data

    Energy Technology Data Exchange (ETDEWEB)

    Backman, C. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States); Hoeschele, M. [Alliance for Residential Building Innovation (ARBI), Davis, CA (United States)

    2013-07-01

    Characterizing the performance of hot water distribution systems is a critical step in developing best practice guidelines for the design and installation of high performance hot water systems. Developing and validating simulation models is critical to this effort, as well as collecting accurate input data to drive the models. In this project, the Building America research team ARBI validated the newly developed TRNSYS Type 604 pipe model against both detailed laboratory and field distribution system performance data. Validation efforts indicate that the model performs very well in handling different pipe materials, insulation cases, and varying hot water load conditions. Limitations of the model include the complexity of setting up the input file and long simulation run times. This project also looked at recent field hot water studies to better understand use patterns and potential behavioral changes as homeowners convert from conventional storage water heaters to gas tankless units. The team concluded that the current Energy Factor test procedure overestimates typical use and underestimates the number of hot water draws, which has implications for both equipment and distribution system performance. Gas tankless water heaters were found to impact how people use hot water, but the data does not necessarily suggest an increase in usage. Further study in hot water usage and patterns is needed to better define these characteristics in different climates and home vintages.
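Distribution-pipe models like the TRNSYS Type 604 model validated above ultimately rest on the steady-flow energy balance for a pipe losing heat to its surroundings, in which the fluid temperature decays exponentially toward ambient. A minimal sketch of that relation (the UA and flow values in the test are illustrative, and this is far simpler than Type 604, which also tracks pipe thermal mass):

```python
import math

def pipe_outlet_temp(t_in, t_ambient, ua, mass_flow, cp=4186.0):
    """Steady-flow outlet temperature of a pipe losing heat to surroundings:
    exponential decay toward ambient, governed by the heat-loss coefficient
    UA (W/K), the mass flow (kg/s) and the fluid specific heat cp (J/kg-K)."""
    return t_ambient + (t_in - t_ambient) * math.exp(-ua / (mass_flow * cp))
```

Higher UA (poor insulation, long runs) or lower flow both pull the delivered temperature toward ambient, which is why insulation cases and load conditions were the focus of the validation.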

  10. Validation of models in an imaging infrared simulation

    CSIR Research Space (South Africa)

    Willers, C

    2007-10-01

    Full Text Available [Figure and reference text garbled in extraction. The recoverable content describes three processes transforming information between the entities of a simulation study: the reality/problem entity, the conceptual model and the computerized model, related through model qualification (analysis and modelling), model verification (computer implementation) and model validation (simulation and experimentation). The garbled references include J.C. Refsgaard, Modelling Guidelines - Terminology and Guiding Principles, Advances in Water Resources, Vol. 27, No. 1, January 2004, pp. 71-82, Elsevier; and N. Oreskes et al., Verification, Validation, and Confirmation of Numerical Models in the Earth Sciences, Science, Vol. 263.]

  11. Transient Model Validation of Fixed-Speed Induction Generator Using Wind Farm Measurements

    DEFF Research Database (Denmark)

    Rogdakis, Georgios; Garcia-Valle, Rodrigo; Arana Aristi, Iván

    2012-01-01

    In this paper, an electromagnetic transient model for fixed-speed wind turbines equipped with induction generators is developed and implemented in PSCAD/EMTDC. The model is comprised by: an induction generator, aerodynamic rotor, and a two-mass representation of the shaft system. Model validation...

  12. Clinical prediction models for bronchopulmonary dysplasia: a systematic review and external validation study

    NARCIS (Netherlands)

    Onland, Wes; Debray, Thomas P.; Laughon, Matthew M.; Miedema, Martijn; Cools, Filip; Askie, Lisa M.; Asselin, Jeanette M.; Calvert, Sandra A.; Courtney, Sherry E.; Dani, Carlo; Durand, David J.; Marlow, Neil; Peacock, Janet L.; Pillow, J. Jane; Soll, Roger F.; Thome, Ulrich H.; Truffert, Patrick; Schreiber, Michael D.; van Reempts, Patrick; Vendettuoli, Valentina; Vento, Giovanni; van Kaam, Anton H.; Moons, Karel G.; Offringa, Martin

    2013-01-01

    Bronchopulmonary dysplasia (BPD) is a common complication of preterm birth. Very different models using clinical parameters at an early postnatal age to predict BPD have been developed with little extensive quantitative validation. The objective of this study is to review and validate clinical

  13. Validation of the dynamic model for a pressurized water reactor

    International Nuclear Information System (INIS)

    Zwingelstein, Gilles.

    1979-01-01

    Dynamic model validation is a necessary procedure to assure that the developed empirical or physical models satisfactorily represent the dynamic behavior of the actual plant during normal or abnormal transients. For small transients, physical models are described that represent an isolated core, an isolated steam generator, and the overall pressurized water reactor. Using data collected during the step power changes that occurred during the startup procedures, comparisons of computed and actual transients are given at 30% and 100% of full power. The agreement between the transients derived from the model and those recorded on the plant indicates that the developed models are well suited for use in functional or control studies

  14. Social anxiety questionnaire (SAQ): Development and preliminary validation.

    Science.gov (United States)

    Łakuta, Patryk

    2018-05-30

    The Social Anxiety Questionnaire (SAQ) was designed to assess five dimensions of social anxiety as posited by the Clark and Wells (1995; Clark, 2001) cognitive model. The development of the SAQ involved generation of an item pool, followed by verification of content validity and the theorized factor structure (Study 1). The final version of the SAQ was then assessed for reliability, temporal stability (test-retest reliability), and construct, criterion-related, and contrasted-group validity (Studies 2, 3, and 4). Following a systematic process, the results provide support for the SAQ as a reliable and both theoretically and empirically valid measure. The five-factor structure of the SAQ, verified and replicated through confirmatory factor analyses, reflects five dimensions of social anxiety: negative self-processing; self-focused attention and self-monitoring; safety behaviours; somatic and cognitive symptoms; and anticipatory and post-event rumination. The results suggest that the SAQ possesses good psychometric properties, while recognizing that additional validation is a required future research direction. It is important to replicate these findings in diverse populations, including a large clinical sample. The SAQ is a promising measure that supports social anxiety as a multidimensional construct and the foundational role of self-focused cognitive processes in the generation and maintenance of social anxiety symptoms. The findings make a significant contribution to the literature; moreover, the SAQ is the first instrument that offers to assess all of the specific cognitive-affective, physiological, attitudinal, and attentional processes related to social anxiety proposed by the Clark-Wells model. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Towards the development of improved tests for negative symptoms of schizophrenia in a validated animal model.

    Science.gov (United States)

    Sahin, Ceren; Doostdar, Nazanin; Neill, Joanna C

    2016-10-01

    Negative symptoms in schizophrenia remain an unmet clinical need. There is no licensed treatment specifically for this debilitating aspect of the disorder and effect sizes of new therapies are too small to make an impact on quality of life and function. Negative symptoms are multifactorial but often considered in terms of two domains, expressive deficit incorporating blunted affect and poverty of speech and avolition incorporating asociality and lack of drive. There is a clear need for improved understanding of the neurobiology of negative symptoms which can be enabled through the use of carefully validated animal models. While there are several tests for assessing sociability in animals, tests for blunted affect in schizophrenia are currently lacking. Two paradigms have recently been developed for assessing negative affect of relevance to depression in rats. Here we assess their utility for studying negative symptoms in schizophrenia using our well validated model for schizophrenia of sub-chronic (sc) treatment with Phencyclidine (PCP) in adult female rats. Results demonstrate that sc PCP treatment produces a significant negative affect bias in response to a high value reward in the optimistic and affective bias tests. Our results are not easily explained by the known cognitive deficits induced by sc PCP and support the hypothesis of a negative affective bias in this model. We suggest that further refinement of these two tests will provide a means to investigate the neurobiological basis of negative affect in schizophrenia, thus supporting the assessment of efficacy of new targets for this currently untreated symptom domain. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Development and validation of an extensive growth and growth boundary model for psychrotolerant Lactobacillus spp. in seafood and meat products

    DEFF Research Database (Denmark)

    Mejlholm, Ole; Dalgaard, Paw

    2013-01-01

    A new and extensive growth and growth boundary model for psychrotolerant Lactobacillus spp. was developed and validated for processed and unprocessed products of seafood and meat. The new model was developed by refitting and expanding an existing cardinal parameter model for growth and the growth boundary of lactic acid bacteria (LAB) in processed seafood (O. Mejlholm and P. Dalgaard, J. Food Prot. 70, 2485-2497, 2007). Initially, to estimate values for the maximum specific growth rate at the reference temperature of 25°C (μref) and the theoretical minimum temperature that prevents growth... of psychrotolerant Lactobacillus spp. was clearly demonstrated. The new model can be used to predict growth of psychrotolerant Lactobacillus spp. in seafood and meat products, e.g. prediction of the time to a critical cell concentration of bacteria is considered useful for establishing the shelf life. In addition...

  17. Validation Testing of a Peridynamic Impact Damage Model Using NASA's Micro-Particle Gun

    Science.gov (United States)

    Baber, Forrest E.; Zelinski, Brian J.; Guven, Ibrahim; Gray, Perry

    2017-01-01

    Through a collaborative effort between the Virginia Commonwealth University and Raytheon, a peridynamic model for sand impact damage has been developed [1-3]. Model development has focused on simulating impacts of sand particles on ZnS traveling at velocities consistent with aircraft take-off and landing speeds. The model reproduces common features of impact damage including pit and radial cracks, and, under some conditions, lateral cracks. This study focuses on a preliminary validation exercise in which simulation results from the peridynamic model are compared to a limited experimental data set generated by NASA's recently developed micro-particle gun (MPG). The MPG facility measures the dimensions and incoming and rebound velocities of the impact particles. It also links each particle to a specific impact site and its associated damage. In this validation exercise parameters of the peridynamic model are adjusted to fit the experimentally observed pit diameter, average length of radial cracks and rebound velocities for 4 impacts of 300 µm glass beads on ZnS. Results indicate that a reasonable fit of these impact characteristics can be obtained by suitable adjustment of the peridynamic input parameters, demonstrating that the MPG can be used effectively as a validation tool for impact modeling and that the peridynamic sand impact model described herein possesses not only a qualitative but also a quantitative ability to simulate sand impact events.

  18. Development and validation of a new simple Healthy Meal Index for canteen meals

    DEFF Research Database (Denmark)

    Lassen, Anne Dahl; Biltoft-Jensen, Anja Pia; L Hansen, Gitte

    2010-01-01

    OBJECTIVE: Nutrition evaluation tools should be developed both for scientific purposes and to encourage and facilitate healthy nutritional practices. The purpose of the present study was to develop and validate a simple food-based Healthy Meal Index (HMI) reflecting the nutritional profile...... and potatoes. The development was built on the principles embodied by the Plate Model, but providing more specificity in some areas. The simple HMI was validated against weighed and chemically analysed food and nutrient content of a representative sample of canteen meals. The sample was split into four...

  19. Development and validation of a multidimensional measure of lean manufacturing

    Directory of Open Access Journals (Sweden)

    Juan A. Marin-Garcia

    2010-02-01

    Full Text Available Over the last 30 years of lean manufacturing research, many different questionnaires have been proposed to assess the degree to which the concept is used. The set of items changes considerably from one investigation to another, and no movement has yet emerged toward a few instruments whose validity and reliability have been compared in different settings. In fact, most investigations are based on ad-hoc questionnaires, and the few that report questionnaire validation check only unidimensionality and Cronbach's α. Nevertheless, there seems to be a consensus identifying five big constructs that compose lean manufacturing (TQM, JIT, TPM, supply chain management and high involvement). Our research consisted of identifying and summarizing the previously published models that aggregate items into constructs or sub-scales of constructs. We then developed an integrative questionnaire, starting from the items that appeared in previous investigations. Finally, we validated the sub-scales and models through confirmatory factor analysis, using data from a sample of Spanish Sheltered Work Centres (N=128). Of all the proposed models, the best fit was obtained with the first-order model with 20 sub-scales. Our investigation contributes an integrative view of the published models and a verification of the validity and reliability of the lean manufacturing sub-scales proposed by other investigators. Owing to its confirmatory approach, it can also serve to generalize studies conducted in other contexts, with samples different from the one used here.

  20. Validation Assessment of a Glass-to-Metal Seal Finite-Element Model

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, Ryan Dale [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Buchheit, Thomas E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Emery, John M [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stavig, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Newton, Clay S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brown, Arthur [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Sealing glasses are ubiquitous in high-pressure and high-temperature engineering applications, such as hermetic feed-through electrical connectors. A common connector technology is the glass-to-metal seal, in which a metal shell compresses a sealing glass to create a hermetic seal. Though finite-element analysis has been used to understand and design glass-to-metal seals for many years, there has been little validation of these models. An indentation technique was employed to measure the residual stress on the surface of a simple glass-to-metal seal. Recently developed rate-dependent material models of both Schott 8061 and 304L VAR stainless steel have been applied to a finite-element model of the simple glass-to-metal seal. Model predictions of residual stress based on the evolution of the material models are shown and compared to measured data, and the validity of the finite-element predictions is discussed. It is shown that the finite-element model of the glass-to-metal seal accurately predicts the mean residual stress in the glass near the glass-to-metal interface and is valid for this quantity of interest.

  1. How to enhance the future use of energy policy simulation models through ex post validation

    International Nuclear Information System (INIS)

    Qudrat-Ullah, Hassan

    2017-01-01

    Although simulation and modeling in general, and system dynamics models in particular, have long served the energy policy domain, ex post validation of these energy policy models is rarely addressed. In fact, ex post validation is a valuable area of research because it offers modelers a chance to enhance the future use of their simulation models by validating them against field data. This paper contributes by presenting (i) a system dynamics simulation model that was developed and used in 1999 for a three-dimensional (socio-economic and environmental) long-term assessment of Pakistan's energy policy, and (ii) a systematic ex post validation of the 15-year-old predictive scenarios produced by that model. How did the model predictions compare with the actual data? We report that the ongoing crisis of the electricity sector of Pakistan is unfolding as the model-based scenarios had projected. - Highlights: • Argues that increased use of energy policy models depends on validating their credibility. • An ex post validation process is presented as a way to build confidence in models. • A unique system dynamics model, MDESRAP, is presented. • The root mean square percentage error and Theil's inequality statistic are applied. • The dynamic model, MDESRAP, is presented as an ex ante and ex post validated model.
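    The two error statistics named in the highlights are simple to compute; the sketch below shows both on a hypothetical demand series (the data and variable names are illustrative, not taken from MDESRAP):

```python
import math

def rmspe(actual, predicted):
    """Root mean square percentage error between two series."""
    terms = [((p - a) / a) ** 2 for a, p in zip(actual, predicted)]
    return math.sqrt(sum(terms) / len(terms)) * 100.0

def theil_u(actual, predicted):
    """Theil's inequality coefficient U; 0 means a perfect forecast."""
    n = len(actual)
    num = math.sqrt(sum((p - a) ** 2 for a, p in zip(actual, predicted)) / n)
    den = (math.sqrt(sum(a ** 2 for a in actual) / n)
           + math.sqrt(sum(p ** 2 for p in predicted) / n))
    return num / den

# Hypothetical electricity-demand series (GWh), not from the paper.
actual    = [52.0, 55.1, 58.4, 61.0, 64.2]
predicted = [51.2, 55.9, 57.8, 62.1, 65.0]
print(rmspe(actual, predicted))    # small percentage error
print(theil_u(actual, predicted))  # close to 0 for a good fit
```

    This form of Theil's U is bounded between 0 (perfect forecast) and 1, which makes it convenient for comparing ex post accuracy across scenarios with different units.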

  2. Fostering creativity in product and service development: validation in the domain of information technology.

    Science.gov (United States)

    Zeng, Liang; Proctor, Robert W; Salvendy, Gavriel

    2011-06-01

    This research is intended to empirically validate a general model of creative product and service development proposed in the literature. A research gap inspired construction of a conceptual model to capture fundamental phases and pertinent facilitating metacognitive strategies in the creative design process. The model also depicts the mechanism by which design creativity affects consumer behavior. The validity and assets of this model had not previously been investigated. Four laboratory studies were conducted to demonstrate the value of the proposed cognitive phases and associated metacognitive strategies in the conceptual model. Realistic product and service design problems were used in creativity assessment to ensure ecological validity. Design creativity was enhanced by explicit problem analysis, whereby one formulates problems from different perspectives and at different levels of abstraction. Remote association in conceptual combination spawned more design creativity than did near association. Abstraction led to greater creativity in conducting conceptual expansion than did specificity, which induced mental fixation. Domain-specific knowledge and experience enhanced design creativity, indicating that design can be of a domain-specific nature. Design creativity added integrated value to products and services and positively influenced customer behavior. The validity and value of the proposed conceptual model are supported by these empirical findings. The conceptual model of creative design could underpin future theory development, and the propositions advanced in this article should provide insights and approaches to help organizations pursuing product and service creativity gain competitive advantage.

  3. Prediction of early death among patients enrolled in phase I trials: development and validation of a new model based on platelet count and albumin.

    Science.gov (United States)

    Ploquin, A; Olmos, D; Lacombe, D; A'Hern, R; Duhamel, A; Twelves, C; Marsoni, S; Morales-Barrera, R; Soria, J-C; Verweij, J; Voest, E E; Schöffski, P; Schellens, J H; Kramar, A; Kristeleit, R S; Arkenau, H-T; Kaye, S B; Penel, N

    2012-09-25

    Selecting patients with 'sufficient life expectancy' for Phase I oncology trials remains challenging. The Royal Marsden Hospital Score (RMS) previously identified high-risk patients as those with ≥ 2 of the following: albumin <35 g/l; lactate dehydrogenase > upper limit of normal; >2 metastatic sites. This study developed an alternative prognostic model and compared its performance with that of the RMS. The primary end point was the 90-day mortality rate. The new model was developed from the same database as the RMS, but it used Chi-squared Automatic Interaction Detection (CHAID). The ROC characteristics of both methods were then validated in an independent database of 324 patients enrolled in European Organisation for Research and Treatment of Cancer Phase I trials of cytotoxic agents between 2000 and 2009. The CHAID method identified high-risk patients on the basis of low albumin and elevated platelet count. The negative predictive values (NPV) were similar for the CHAID model and the RMS. The CHAID model and the RMS provided a similarly high level of NPV, but the CHAID model gave better accuracy in the validation set. Both the CHAID model and the RMS may improve the screening process in Phase I trials.
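    The negative predictive value reported above has a one-line definition; a minimal sketch with invented counts (not the trial data):

```python
def npv(true_neg, false_neg):
    """Negative predictive value: of those predicted low-risk,
    the fraction who actually survived past 90 days."""
    return true_neg / (true_neg + false_neg)

# Hypothetical counts among patients classified low-risk:
# 270 alive at 90 days, 30 dead (illustrative numbers only).
tn, fn = 270, 30
print(round(npv(tn, fn), 3))  # → 0.9
```

    A high NPV is what matters for screening here: patients ruled "low-risk" by the score should very rarely die within 90 days of enrolment.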

  4. Validation of a two-fluid model used for the simulation of dense fluidized beds; Validation d`un modele a deux fluides applique a la simulation des lits fluidises denses

    Energy Technology Data Exchange (ETDEWEB)

    Boelle, A.

    1997-02-17

    A two-fluid model applied to the simulation of gas-solid dense fluidized beds is validated at the micro scale and at the macro scale. Phase coupling is carried out in the momentum and energy transport equations of both phases. The modeling is built on the kinetic theory of granular media, in which the gas action has been taken into account in order to obtain correct expressions for the transport coefficients. A description of hydrodynamic interactions between particles in high-Stokes-number flow is also incorporated in the model. The micro-scale validation uses Lagrangian numerical simulations viewed as numerical experiments. The first validation case refers to a gas-particle simple shear flow and validates the competition between two dissipation mechanisms: drag and particle collisions. The second validation case concerns sedimenting particles in high-Stokes-number flow and validates our approach to hydrodynamic interactions. This last case led us to develop an original Lagrangian simulation with two-way coupling between the fluid and the particles. The macro-scale validation uses the results of Eulerian simulations of a dense fluidized bed. Bed height, particle circulation and the characteristics of spontaneously created bubbles are studied and compared to experimental measurements, with respect to both physical and numerical parameters. (author) 159 refs.

  5. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    .... When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation...

  6. Development and validation of spray models for investigating diesel engine combustion and emissions

    Science.gov (United States)

    Som, Sibendu

    Diesel engines intrinsically generate NOx and particulate matter, which need to be reduced significantly in order to comply with increasingly stringent regulations worldwide. This motivates diesel engine manufacturers to gain a fundamental understanding of the spray and combustion processes so as to optimize these processes and reduce engine emissions. Strategies being investigated to reduce engines' raw emissions include advancements in fuel injection systems, efficient nozzle orifice design, injection and combustion control strategies, exhaust gas recirculation, and the use of alternative fuels such as biodiesel. This thesis explores several of these approaches (nozzle orifice design, injection control strategy, and biodiesel use) by performing computer modeling of diesel engine processes. Fuel atomization characteristics are known to have a significant effect on the combustion and emission processes in diesel engines. Primary fuel atomization is induced by aerodynamics in the near-nozzle region as well as by cavitation and turbulence from the injector nozzle. The breakup models currently used in diesel engine simulations generally consider aerodynamically induced breakup using the Kelvin-Helmholtz (KH) instability model, but do not account for inner nozzle flow effects. An improved primary breakup model (KH-ACT), incorporating cavitation and turbulence effects along with aerodynamically induced breakup, is developed and incorporated in the computational fluid dynamics code CONVERGE. The spray simulations using the KH-ACT model are "quasi-dynamically" coupled with inner nozzle flow computations (using FLUENT). This presents a novel tool to capture the influence of inner nozzle flow effects such as cavitation and turbulence on spray, combustion, and emission processes. Extensive validation is performed against the non-evaporating spray data from Argonne National Laboratory. Performance of the KH and KH-ACT models is compared against the evaporating and

  7. Transient simulation of an endothermic chemical process facility coupled to a high temperature reactor: Model development and validation

    International Nuclear Information System (INIS)

    Brown, Nicholas R.; Seker, Volkan; Revankar, Shripad T.; Downar, Thomas J.

    2012-01-01

    Highlights: ► Models for PBMR and thermochemical sulfur cycle based hydrogen plant are developed. ► Models are validated against available data in literature. ► Transient in coupled reactor and hydrogen plant system is studied. ► For loss-of-heat sink accident, temperature feedback within the reactor core enables shut down of the reactor. - Abstract: A high temperature reactor (HTR) is a candidate to drive high temperature water-splitting using process heat. While both high temperature nuclear reactors and hydrogen generation plants have high individual degrees of development, study of the coupled plant is lacking. Particularly absent are considerations of the transient behavior of the coupled plant, as well as studies of the safety of the overall plant. The aim of this document is to contribute knowledge to the effort of nuclear hydrogen generation. In particular, this study regards identification of safety issues in the coupled plant and the transient modeling of some leading candidates for implementation in the Nuclear Hydrogen Initiative (NHI). The Sulfur Iodine (SI) and Hybrid Sulfur (HyS) cycles are considered as candidate hydrogen generation schemes. Three thermodynamically derived chemical reaction chamber models are coupled to a well-known reference design of a high temperature nuclear reactor. These chemical reaction chamber models have several dimensions of validation, including detailed steady state flowsheets, integrated loop test data, and bench scale chemical kinetics. The models and coupling scheme are presented here, as well as a transient test case initiated within the chemical plant. The 50% feed flow failure within the chemical plant results in a slow loss-of-heat sink (LOHS) accident in the nuclear reactor. Due to the temperature feedback within the reactor core the nuclear reactor partially shuts down over 1500 s. Two distinct regions are identified within the coupled plant response: (1) immediate LOHS due to the loss of the sulfuric

  8. Updating and prospective validation of a prognostic model for high sickness absence

    NARCIS (Netherlands)

    Roelen, C.A.M.; Heymans, M.W.; Twisk, J.W.R.; van Rhenen, W.; Pallesen, S.; Bjorvatn, B.; Moen, B.E.; Mageroy, N.

    2015-01-01

    Objectives To further develop and validate a Dutch prognostic model for high sickness absence (SA). Methods Three-wave longitudinal cohort study of 2,059 Norwegian nurses. The Dutch prognostic model was used to predict high SA among Norwegian nurses at wave 2. Subsequently, the model was updated by

  9. NAIRAS aircraft radiation model development, dose climatology, and initial validation.

    Science.gov (United States)

    Mertens, Christopher J; Meier, Matthias M; Brown, Steven; Norman, Ryan B; Xu, Xiaojing

    2013-10-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low. Nevertheless, analysis

  11. Selection, calibration, and validation of models of tumor growth.

    Science.gov (United States)

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach is presented that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine whether the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory
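    As a toy illustration of the calibration step, the sketch below fits one candidate model (logistic growth) to synthetic normalized tumor-volume data by a least-squares grid search; all numbers and names are invented for illustration, not drawn from the paper:

```python
import math

def logistic(t, r, K, v0=0.05):
    """Logistic tumor-volume model (normalized volume), growth rate r,
    carrying capacity K, initial volume v0."""
    return K / (1 + (K / v0 - 1) * math.exp(-r * t))

def sse(params, data):
    """Sum of squared errors of the model against (time, volume) data."""
    r, K = params
    return sum((logistic(t, r, K) - v) ** 2 for t, v in data)

# Synthetic normalized tumor-volume measurements (day, volume).
data = [(0, 0.05), (2, 0.12), (4, 0.27), (6, 0.48), (8, 0.68), (10, 0.82)]

# Crude grid-search calibration over growth rate r and carrying capacity K.
best = min(((r / 100.0, k / 20.0)
            for r in range(10, 100) for k in range(10, 21)),
           key=lambda p: sse(p, data))
print(best)  # calibrated (r, K) pair
```

    A real calibration would use a proper optimizer and report parameter uncertainty, and model selection would repeat this fit for each candidate class (e.g. exponential vs. logistic vs. reaction-diffusion) before comparing plausibilities.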

  12. Developing and validating a new precise risk-prediction model for new-onset hypertension: The Jichi Genki hypertension prediction model (JG model).

    Science.gov (United States)

    Kanegae, Hiroshi; Oikawa, Takamitsu; Suzuki, Kenji; Okawara, Yukie; Kario, Kazuomi

    2018-03-31

    No integrated risk assessment tools that include lifestyle factors and uric acid have been developed. In accordance with the Industrial Safety and Health Law in Japan, a follow-up examination of 63 495 normotensive individuals (mean age 42.8 years) who underwent a health checkup in 2010 was conducted every year for 5 years. The primary endpoint was new-onset hypertension (systolic blood pressure [SBP]/diastolic blood pressure [DBP] ≥ 140/90 mm Hg and/or the initiation of antihypertensive medications with self-reported hypertension). During the mean 3.4 years of follow-up, 7402 participants (11.7%) developed hypertension. The prediction model included age, sex, body mass index (BMI), SBP, DBP, low-density lipoprotein (LDL) cholesterol, uric acid, proteinuria, current smoking, alcohol intake, eating rate, DBP by age, and BMI by age at baseline, and was created by using Cox proportional hazards models to calculate 3-year absolute risks. The derivation analysis confirmed that the model performed well with respect to both discrimination and calibration (n = 63 495; C-statistic = 0.885, 95% confidence interval [CI], 0.865-0.903; χ² statistic = 13.6, degrees of freedom [df] = 7). In the external validation analysis, moreover, the model performed well in both its discrimination and calibration characteristics (n = 14 168; C-statistic = 0.846; 95% CI, 0.775-0.905; χ² statistic = 8.7, df = 7). Adding LDL cholesterol, uric acid, proteinuria, alcohol intake, eating rate, and BMI by age to the base model yielded a significantly higher C-statistic, net reclassification improvement (NRI), and integrated discrimination improvement, especially NRI(non-event) (NRI = 0.127, 95% CI = 0.100-0.152; NRI(non-event) = 0.108, 95% CI = 0.102-0.117). In conclusion, a highly precise model with good performance was developed for predicting incident hypertension using the new parameters of eating rate, uric acid, proteinuria, and BMI by age. ©2018 Wiley Periodicals, Inc.
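    The C-statistic quoted above is a concordance probability. A minimal sketch of Harrell's C-index for censored follow-up data (the tiny cohort below is invented, and risk ties simply score 0.5):

```python
from itertools import combinations

def c_statistic(times, events, risks):
    """Harrell's concordance index: among usable pairs, the fraction
    where the subject with the earlier event has the higher predicted
    risk. Sketch only; ties in predicted risk count as 0.5."""
    concordant, usable = 0.0, 0
    for i, j in combinations(range(len(times)), 2):
        if times[j] < times[i]:   # order so that i has the shorter follow-up
            i, j = j, i
        if not events[i]:         # earlier subject censored: pair unusable
            continue
        usable += 1
        if risks[i] > risks[j]:
            concordant += 1.0
        elif risks[i] == risks[j]:
            concordant += 0.5
    return concordant / usable

# Invented cohort: follow-up years, hypertension onset (1) or censored (0),
# and model-predicted risk scores.
times  = [1.0, 2.0, 3.0, 4.0, 5.0]
events = [1,   1,   0,   1,   0]
risks  = [0.9, 0.7, 0.3, 0.6, 0.2]
print(c_statistic(times, events, risks))  # 1.0: risks perfectly ordered
```

    A C-statistic of 0.885, as in the derivation cohort, means that in 88.5% of usable pairs the participant who developed hypertension sooner had the higher predicted risk.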

  13. Development and validation of a logistic regression model to distinguish transition zone cancers from benign prostatic hyperplasia on multi-parametric prostate MRI

    Energy Technology Data Exchange (ETDEWEB)

    Iyama, Yuji [Kumamoto Chuo Hospital, Department of Diagnostic Radiology, Kumamoto, Kumamoto (Japan); Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto, Kumamoto (Japan); Nakaura, Takeshi; Nagayama, Yasunori; Utsunomiya, Daisuke; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto, Kumamoto (Japan); Katahira, Kazuhiro; Oda, Seitaro [Kumamoto Chuo Hospital, Department of Diagnostic Radiology, Kumamoto, Kumamoto (Japan); Iyama, Ayumi [National Hospital Organization Kumamoto Medical Center, Department of Diagnostic Radiology, Kumamoto, Kumamoto (Japan)

    2017-09-15

    To develop a prediction model to distinguish between transition zone (TZ) cancers and benign prostatic hyperplasia (BPH) on multi-parametric prostate magnetic resonance imaging (mp-MRI), this retrospective study enrolled 60 patients with either BPH or TZ cancer who had undergone 3-T MRI. We generated ten parameters for T2-weighted images (T2WI), diffusion-weighted images (DWI) and dynamic MRI. Using a t-test and multivariate logistic regression (LR) analysis to evaluate the parameters' accuracy, we developed LR models. We calculated the area under the receiver operating characteristic (ROC) curve of the LR models by a leave-one-out cross-validation procedure, and the LR model's performance was compared with the performance of radiologists, based both on their subjective opinion and on the Prostate Imaging Reporting and Data System (Pi-RADS v2) score. Multivariate LR analysis showed that only the standardized T2WI signal and the mean apparent diffusion coefficient (ADC) maintained independent value (P < 0.001). The validation analysis showed that the AUC of the final LR model was comparable to that of board-certified radiologists, and superior to that of the Pi-RADS scores. A standardized T2WI signal and mean ADC were independent factors for distinguishing between BPH and TZ cancer. The performance of the LR model was comparable to that of experienced radiologists. (orig.)
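    The AUC compared above can be computed directly from model scores via its Mann-Whitney interpretation: the probability that a randomly chosen positive case outscores a randomly chosen negative one. A minimal sketch with invented scores (not the study's data):

```python
def roc_auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive (TZ cancer) case scores
    higher than a random negative (BPH) case; ties count 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented LR-model probabilities for TZ cancer (1) vs BPH (0).
labels = [1, 1, 1, 0, 0, 0]
scores = [0.85, 0.70, 0.40, 0.55, 0.30, 0.20]
print(roc_auc(labels, scores))
```

    In leave-one-out cross-validation, each patient's score comes from a model trained on the other 59, and the AUC is then computed over the pooled held-out scores exactly as above.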

  14. Sewer solids separation by sedimentation--the problem of modeling, validation and transferability.

    Science.gov (United States)

    Kutzner, R; Brombach, H; Geiger, W F

    2007-01-01

    Sedimentation of sewer solids in tanks, ponds and similar devices is the most relevant process for the treatment of stormwater and combined sewer overflows in urban collecting systems. In the past, much research was devoted to developing deterministic models of this separation process, yet even today these models are not widely accepted in Germany. Water authorities remain sceptical with regard to model validation and transferability. This paper examines whether this scepticism is justified. A framework proposal is presented for the validation of mathematical models with zero- or one-dimensional spatial resolution for particle separation processes in stormwater and combined sewer overflow treatment. The proposal was applied to publications of repute on sewer solids separation by sedimentation. The result was that none of the investigated models described in the literature passed the validation entirely. There is an urgent need for future research in sewer solids sedimentation and remobilization!

  15. A free wake vortex lattice model for vertical axis wind turbines: Modeling, verification and validation

    International Nuclear Information System (INIS)

    Meng, Fanzhong; Schwarze, Holger; Vorpahl, Fabian; Strobel, Michael

    2014-01-01

    Since the 1970s, several research activities have been carried out on developing aerodynamic models for Vertical Axis Wind Turbines (VAWTs). In order to design large VAWTs of MW scale, more accurate aerodynamic calculation is required to predict their aero-elastic behaviour. In this paper, a 3D free wake vortex lattice model for VAWTs is developed, verified and validated. Comparisons to experimental results show that the 3D free wake vortex lattice model is capable of accurately predicting both the general performance and the instantaneous aerodynamic forces on the blades. The comparison between the momentum method and the vortex lattice model shows that free wake vortex models are needed for detailed load calculations and for calculating highly loaded rotors.
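    The basic building block of any vortex lattice model is the Biot-Savart induction of a straight vortex filament; a minimal sketch of that kernel (an illustration of the standard formula, not the authors' implementation):

```python
import math

def biot_savart_segment(p, a, b, gamma):
    """Velocity induced at point p by a straight vortex filament from a
    to b carrying circulation gamma (the classic vortex-lattice kernel)."""
    r1 = [p[i] - a[i] for i in range(3)]
    r2 = [p[i] - b[i] for i in range(3)]
    r0 = [b[i] - a[i] for i in range(3)]
    cross = [r1[1] * r2[2] - r1[2] * r2[1],
             r1[2] * r2[0] - r1[0] * r2[2],
             r1[0] * r2[1] - r1[1] * r2[0]]
    cross_sq = sum(c * c for c in cross)
    if cross_sq < 1e-12:           # point on the filament axis: no induction
        return [0.0, 0.0, 0.0]
    n1 = math.sqrt(sum(x * x for x in r1))
    n2 = math.sqrt(sum(x * x for x in r2))
    dot = sum(r0[i] * (r1[i] / n1 - r2[i] / n2) for i in range(3))
    k = gamma / (4.0 * math.pi * cross_sq) * dot
    return [k * c for c in cross]

# A long segment along x; a point 1 m to the side of its midpoint should
# see approximately the 2D line-vortex result gamma / (2*pi*h).
v = biot_savart_segment([0.0, 1.0, 0.0], [-100.0, 0.0, 0.0], [100.0, 0.0, 0.0], 1.0)
print(v)
```

    A lattice code sums this kernel over every filament of every wake panel at every time step, which is why free wake models are so much more expensive than momentum methods.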

  16. Development and validation of a measure of food choice values.

    Science.gov (United States)

    Lyerly, Jordan E; Reeve, Charlie L

    2015-06-01

    Food choice values (FCVs) are factors that individuals consider when deciding which foods to purchase and/or consume. Given the potentially important implications for health, it is critical for researchers to have access to a validated measure of FCV. Though there is an existing measure of FCV, this measure was developed 20 years ago, and recent research suggests additional FCVs exist that are not included in it. A series of four studies was conducted to develop a new, expanded measure of FCV. An eight-factor model of FCV was supported and confirmed. In aggregate, results from the four studies indicate that the measure is content valid and has internally consistent scales that also demonstrated acceptable temporal stability and convergent validity. In addition, the eight scales of the measure were independent of social desirability, met criteria for measurement invariance across income groups, and predicted dietary intake. This new measure of FCV may be useful for researchers examining FCVs in the future, as well as in intervention and prevention efforts targeting dietary choices. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Development and validation of a local time stepping-based PaSR solver for combustion and radiation modeling

    DEFF Research Database (Denmark)

    Pang, Kar Mun; Ivarsson, Anders; Haider, Sajjad

    2013-01-01

    In the current work, a local time stepping (LTS) solver for the modeling of combustion, radiative heat transfer and soot formation is developed and validated. This is achieved using an open source computational fluid dynamics code, OpenFOAM. Akin to the solver provided in default assembly i...... library in the edcSimpleFoam solver which was introduced during the 6th OpenFOAM workshop is modified and coupled with the current solver. One of the main amendments made is the integration of soot radiation submodel since this is significant in rich flames where soot particles are formed. The new solver...

  18. Development and validation of a novel predictive scoring model for microvascular invasion in patients with hepatocellular carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Hui [Department of Hepatopancreatobiliary Surgery, Nanjing Drum Tower Hospital Clinical College of Nanjing Medical University, Nanjing, Jiangsu (China); Department of Hepatopancreatobiliary Surgery, Nanjing Medical University Affiliated Wuxi Second People's Hospital, Wuxi, Jiangsu (China); Hua, Ye [Department of Neurology, Nanjing Medical University Affiliated Wuxi Second People's Hospital, Wuxi, Jiangsu (China); Dai, Tu [Department of Hepatopancreatobiliary Surgery, Nanjing Medical University Affiliated Wuxi Second People's Hospital, Wuxi, Jiangsu (China); He, Jian; Tang, Min [Department of Radiology, Drum Tower Hospital, Medical School of Nanjing University, Nanjing, Jiangsu (China); Fu, Xu; Mao, Liang [Department of Hepatopancreatobiliary Surgery, Nanjing Drum Tower Hospital Clinical College of Nanjing Medical University, Nanjing, Jiangsu (China); Jin, Huihan, E-mail: 45687061@qq.com [Department of Hepatopancreatobiliary Surgery, Nanjing Medical University Affiliated Wuxi Second People's Hospital, Wuxi, Jiangsu (China); Qiu, Yudong, E-mail: yudongqiu510@163.com [Department of Hepatopancreatobiliary Surgery, Nanjing Drum Tower Hospital Clinical College of Nanjing Medical University, Nanjing, Jiangsu (China)

    2017-03-15

    Highlights: • This study aimed to establish a novel predictive scoring model of MVI in HCC patients. • Preoperative imaging features on CECT, such as intratumoral arteries, non-nodule type and absence of radiological tumor capsule were independent predictors for MVI. • The predictive scoring model is of great value in prediction of MVI regardless of tumor size. - Abstract: Purpose: Microvascular invasion (MVI) in patients with hepatocellular carcinoma (HCC) cannot be accurately predicted preoperatively. This study aimed to establish a predictive scoring model of MVI in solitary HCC patients without macroscopic vascular invasion. Methods: A total of 309 consecutive HCC patients who underwent curative hepatectomy were divided into the derivation (n = 206) and validation cohort (n = 103). A predictive scoring model of MVI was established according to the valuable predictors in the derivation cohort based on multivariate logistic regression analysis. The performance of the predictive model was evaluated in the derivation and validation cohorts. Results: Preoperative imaging features on CECT, such as intratumoral arteries, non-nodular type of HCC and absence of radiological tumor capsule were independent predictors for MVI. The predictive scoring model was established according to the β coefficients of the 3 predictors. Area under receiver operating characteristic (AUROC) of the predictive scoring model was 0.872 (95% CI, 0.817-0.928) and 0.856 (95% CI, 0.771-0.940) in the derivation and validation cohorts. The positive and negative predictive values were 76.5% and 88.0% in the derivation cohort and 74.4% and 88.3% in the validation cohort. The performance of the model was similar between the patients with tumor size ≤5 cm and >5 cm in AUROC (P = 0.910). Conclusions: The predictive scoring model based on intratumoral arteries, non-nodular type of HCC, and absence of the radiological tumor capsule on preoperative CECT is of great value in the prediction of MVI
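The scoring approach described in this abstract (a weighted sum built from the β coefficients of binary imaging predictors, with discrimination judged by AUROC) can be sketched as follows. The coefficients and toy cohort below are hypothetical; only the structure follows the abstract.

```python
# Sketch of a beta-coefficient-based risk score. The three predictors mirror
# the abstract, but the weights below are illustrative, NOT published values.

def mvi_score(intratumoral_arteries, non_nodular_type, no_tumor_capsule,
              betas=(1.2, 0.9, 1.1)):
    """Weighted sum of binary imaging predictors (hypothetical weights)."""
    flags = (intratumoral_arteries, non_nodular_type, no_tumor_capsule)
    return sum(b * int(f) for b, f in zip(betas, flags))

def auroc(scores_pos, scores_neg):
    """Rank-based AUROC: P(score_pos > score_neg), with ties counting half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Toy cohort: patients with and without microvascular invasion.
with_mvi = [mvi_score(1, 1, 1), mvi_score(1, 0, 1), mvi_score(1, 1, 0)]
without_mvi = [mvi_score(0, 0, 0), mvi_score(0, 1, 0), mvi_score(0, 0, 1)]
# Every positive outranks every negative here, so AUROC = 1.0.
print(round(auroc(with_mvi, without_mvi), 3))
```

In the study, scores like these would be thresholded to yield the reported positive and negative predictive values.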

  19. Development and validation of a novel predictive scoring model for microvascular invasion in patients with hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Zhao, Hui; Hua, Ye; Dai, Tu; He, Jian; Tang, Min; Fu, Xu; Mao, Liang; Jin, Huihan; Qiu, Yudong

    2017-01-01

    Highlights: • This study aimed to establish a novel predictive scoring model of MVI in HCC patients. • Preoperative imaging features on CECT, such as intratumoral arteries, non-nodule type and absence of radiological tumor capsule were independent predictors for MVI. • The predictive scoring model is of great value in prediction of MVI regardless of tumor size. - Abstract: Purpose: Microvascular invasion (MVI) in patients with hepatocellular carcinoma (HCC) cannot be accurately predicted preoperatively. This study aimed to establish a predictive scoring model of MVI in solitary HCC patients without macroscopic vascular invasion. Methods: A total of 309 consecutive HCC patients who underwent curative hepatectomy were divided into the derivation (n = 206) and validation cohort (n = 103). A predictive scoring model of MVI was established according to the valuable predictors in the derivation cohort based on multivariate logistic regression analysis. The performance of the predictive model was evaluated in the derivation and validation cohorts. Results: Preoperative imaging features on CECT, such as intratumoral arteries, non-nodular type of HCC and absence of radiological tumor capsule were independent predictors for MVI. The predictive scoring model was established according to the β coefficients of the 3 predictors. Area under receiver operating characteristic (AUROC) of the predictive scoring model was 0.872 (95% CI, 0.817-0.928) and 0.856 (95% CI, 0.771-0.940) in the derivation and validation cohorts. The positive and negative predictive values were 76.5% and 88.0% in the derivation cohort and 74.4% and 88.3% in the validation cohort. The performance of the model was similar between the patients with tumor size ≤5 cm and >5 cm in AUROC (P = 0.910). Conclusions: The predictive scoring model based on intratumoral arteries, non-nodular type of HCC, and absence of the radiological tumor capsule on preoperative CECT is of great value in the prediction of MVI

  20. Development and validation of a human biomechanical model for rib fracture and thorax injuries in blunt impact.

    Science.gov (United States)

    Cai, Zhihua; Lan, Fengchong; Chen, Jiqing

    2015-07-01

    Since 1990, approximately 50,000-120,000 people have died annually in road traffic accidents in China. Traffic accidents are the main cause of death among Chinese adults aged 15-45 years. This study aimed to determine the biomechanical response and injury tolerance of the human body in traffic accidents. The subject was a 35-year-old male with a height of 170 cm and a weight of 70 kg, corresponding to the Chinese 50th percentile. Geometry was generated from computed tomography and magnetic resonance imaging, and a human-body biomechanical model was then developed. The model featured in great detail the main anatomical characteristics of skeletal tissues, soft tissues and internal organs, including the head, neck, shoulder, thoracic cage, abdomen, spine, pelvis, pleurae and lungs, heart, aorta, arms, legs, and other muscle and skeletal tissues. The material properties of all tissues in the human body model were obtained from the literature and implemented in the LS-DYNA code to simulate the mechanical behaviour of the biological tissues. The model was validated against cadaver responses to frontal and side impact. The predicted model response agreed reasonably with the experimental data, and the model can further be used to evaluate thoracic injury in real-world crashes. We believe that the transportation industry can use numerical models in the future to simultaneously reduce physical testing and improve automotive safety.

  1. Development and validation of a modified Hybrid-III six-year-old dummy model for simulating submarining in motor-vehicle crashes.

    Science.gov (United States)

    Hu, Jingwen; Klinich, Kathleen D; Reed, Matthew P; Kokkolaras, Michael; Rupp, Jonathan D

    2012-06-01

    In motor-vehicle crashes, young school-aged children restrained by vehicle seat belt systems often suffer from abdominal injuries due to submarining. However, the current anthropomorphic test device, so-called "crash dummy", is not adequate for proper simulation of submarining. In this study, a modified Hybrid-III six-year-old dummy model capable of simulating and predicting submarining was developed using MADYMO (TNO Automotive Safety Solutions). The model incorporated improved pelvis and abdomen geometry and properties previously tested in a modified physical dummy. The model was calibrated and validated against four sled tests under two test conditions with and without submarining using a multi-objective optimization method. A sensitivity analysis using this validated child dummy model showed that dummy knee excursion, torso rotation angle, and the difference between head and knee excursions were good predictors for submarining status. It was also shown that restraint system design variables, such as lap belt angle, D-ring height, and seat coefficient of friction (COF), may have opposite effects on head and abdomen injury risks; therefore child dummies and dummy models capable of simulating submarining are crucial for future restraint system design optimization for young school-aged children. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.

  2. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  3. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases : a review and suggested reporting framework.

    Science.gov (United States)

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

    Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross-model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55% of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6% of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed

  4. The concept of validation of numerical models for consequence analysis

    International Nuclear Information System (INIS)

    Borg, Audun; Paulsen Husted, Bjarne; Njå, Ove

    2014-01-01

    Numerical models such as computational fluid dynamics (CFD) models are increasingly used in life safety studies and other types of analyses to calculate the effects of fire and explosions. The validity of these models is usually established by benchmark testing. This is done to quantitatively measure the agreement between the predictions provided by the model and the real world represented by observations in experiments. This approach assumes that all variables in the real world relevant for the specific study are adequately measured in the experiments and in the predictions made by the model. In this paper the various definitions of validation for CFD models used for hazard prediction are investigated to assess their implication for consequence analysis in a design phase. In other words, how is uncertainty in the prediction of future events reflected in the validation process? The sources of uncertainty are viewed from the perspective of the safety engineer. An example of the use of a CFD model is included to illustrate the assumptions the analyst must make and how these affect the prediction made by the model. The assessments presented in this paper are based on a review of standards and best practice guides for CFD modeling and the documentation from two existing CFD programs. Our main thrust has been to assess how validation work is performed and communicated in practice. We conclude that the concept of validation adopted for numerical models is adequate in terms of model performance. However, it does not address the main sources of uncertainty from the perspective of the safety engineer. Uncertainty in the input quantities describing future events, which are determined by the model user, outweighs the inaccuracies in the model as reported in validation studies. - Highlights: • Examine the basic concept of validation applied to models for consequence analysis. • Review standards and guides for validation of numerical models. • Comparison of the validation

  5. Development and validation of GFR-estimating equations using diabetes, transplant and weight

    DEFF Research Database (Denmark)

    Stevens, L.A.; Schmid, C.H.; Zhang, Y.L.

    2009-01-01

    BACKGROUND: We have reported a new equation (CKD-EPI equation) that reduces bias and improves accuracy for GFR estimation compared to the MDRD study equation while using the same four basic predictor variables: creatinine, age, sex and race. Here, we describe the development and validation of this equation as well as other equations that incorporate diabetes, transplant and weight as additional predictor variables. METHODS: Linear regression was used to relate log-measured GFR (mGFR) to sex, race, diabetes, transplant, weight, various transformations of creatinine and age with and without interactions. Equations were developed in a pooled database of 10 studies [2/3 (N = 5504) for development and 1/3 (N = 2750) for internal validation], and final model selection occurred in 16 additional studies [external validation (N = 3896)]. RESULTS: The mean mGFR was 68, 67 and 68 ml/min/1.73 m(2)...

  6. Myocardial segmentation based on coronary anatomy using coronary computed tomography angiography: Development and validation in a pig model

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Mi Sun [Chung-Ang University College of Medicine, Department of Radiology, Chung-Ang University Hospital, Seoul (Korea, Republic of); Yang, Dong Hyun; Seo, Joon Beom; Kang, Joon-Won; Lim, Tae-Hwan [Asan Medical Center, University of Ulsan College of Medicine, Department of Radiology and Research Institute of Radiology, Seoul (Korea, Republic of); Kim, Young-Hak; Kang, Soo-Jin; Jung, Joonho [Asan Medical Center, University of Ulsan College of Medicine, Heart Institute, Seoul (Korea, Republic of); Kim, Namkug [Asan Medical Center, University of Ulsan College of Medicine, Department of Convergence Medicine, Seoul (Korea, Republic of); Heo, Seung-Ho [Asan Medical Center, University of Ulsan College of Medicine, Asan institute for Life Science, Seoul (Korea, Republic of); Baek, Seunghee [Asan Medical Center, University of Ulsan College of Medicine, Department of Clinical Epidemiology and Biostatistics, Seoul (Korea, Republic of); Choi, Byoung Wook [Yonsei University, Department of Diagnostic Radiology, College of Medicine, Seoul (Korea, Republic of)

    2017-10-15

    To validate a method for performing myocardial segmentation based on coronary anatomy using coronary CT angiography (CCTA). Coronary artery-based myocardial segmentation (CAMS) was developed for use with CCTA. To validate and compare this method with the conventional American Heart Association (AHA) classification, a single coronary occlusion model was prepared and validated using six pigs. The unstained occluded coronary territories of the specimens and corresponding arterial territories from CAMS and AHA segmentations were compared using slice-by-slice matching and 100 virtual myocardial columns. CAMS predicted the ischaemic area more precisely than the AHA method, as indicated by 95% versus 76% (p < 0.001) in the percentage of matched columns (defined as the percentage of matched columns of the segmentation method divided by the number of unstained columns in the specimen). According to the subgroup analyses, CAMS demonstrated a higher percentage of matched columns than the AHA method in the left anterior descending artery (100% vs. 77%; p < 0.001) and the mid- (99% vs. 83%; p = 0.046) and apical-level territories of the left ventricle (90% vs. 52%; p = 0.011). CAMS is a feasible method for identifying the corresponding myocardial territories of the coronary arteries using CCTA. (orig.)

  7. Radiation Background and Attenuation Model Validation and Development

    Energy Technology Data Exchange (ETDEWEB)

    Peplow, Douglas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Santiago, Claudio P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-05

    This report describes the initial results of a study being conducted as part of the Urban Search Planning Tool project. The study is comparing the Urban Scene Simulator (USS), a one-dimensional (1D) radiation transport model developed at LLNL, with the three-dimensional (3D) radiation transport model from ORNL using the MCNP, SCALE/ORIGEN and SCALE/MAVRIC simulation codes. In this study, we have analyzed the differences between the two approaches at every step, from source term representation, to estimating flux and detector count rates at a fixed distance from a simple surface (slab), and at points throughout more complex 3D scenes.

  8. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of

  9. Development and validation of a free-piston engine generator numerical model

    International Nuclear Information System (INIS)

    Jia, Boru; Zuo, Zhengxing; Tian, Guohong; Feng, Huihua; Roskilly, A.P.

    2015-01-01

    Highlights: • Detailed numerical model of free-piston engine generator is presented. • Sub models for both starting process and steady operation are derived. • Simulation results show good agreement with prototype test data. • Engine performance with different starting motor force and varied loads are simulated. • The efficiency of the prototype is estimated to be 31.5% at a power output of 4 kW under full load. - Abstract: This paper focuses on the numerical modelling of a spark ignited free-piston engine generator and the model validation with test results. Detailed sub-models for both starting process and steady operation were derived. The compression and expansion processes were not regarded as ideal gas isentropic processes; both heat transfer and air leakage were taken into consideration. The simulation results show good agreement with the prototype test data for both the starting process and steady operation. During the starting process, the difference of the in-cylinder gas pressure can be controlled within 1 bar for every running cycle. For the steady operation process, the difference was less than 5% and the areas enclosed on the pressure–volume diagram were similar, indicating that the power produced by the engine and the engine efficiency could be predicted by this model. Based on this model, the starting process with different starting motor forces and the combustion process with various throttle openings were simulated. The engine performance during stable operation at 100% engine load was predicted, and the efficiency of the prototype was estimated to be 31.5% at power output of 4 kW
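The abstract compares model and prototype by the area enclosed on the pressure-volume diagram, which equals the indicated work per cycle. A minimal sketch of that calculation, using a hypothetical closed p-V loop rather than data from the engine:

```python
# Indicated work per cycle as the enclosed p-V area, computed with the
# shoelace/trapezoid formula. The cycle data below are a toy example, not
# measurements from the free-piston engine prototype.

def indicated_work(loop):
    """Signed area of a closed p-V cycle, W = closed-loop integral of p dV.

    `loop` is a list of (volume_m3, pressure_pa) points tracing one cycle.
    """
    n = len(loop)
    area = 0.0
    for i in range(n):
        v1, p1 = loop[i]
        v2, p2 = loop[(i + 1) % n]
        area += (v2 - v1) * (p1 + p2) / 2.0  # trapezoid on each segment
    return area

# Toy rectangular cycle: a 0.5 L volume swing against a 1 bar pressure swing,
# so the enclosed area is 5e-4 m^3 x 1e5 Pa = 50 J.
cycle = [(1e-4, 1e5), (6e-4, 1e5), (6e-4, 2e5), (1e-4, 2e5)]
print(round(abs(indicated_work(cycle)), 6))
```

Comparing such areas between simulated and measured loops is one way to check that the model reproduces the engine's power output.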

  10. Dynamic model development and validation for a nitrifying moving bed biofilter: Effect of temperature and influent load on the performance

    DEFF Research Database (Denmark)

    Sin, Gürkan; Weijma, Jan; Spanjers, Henri

    2008-01-01

    A mathematical model with adequate complexity integrating hydraulics, biofilm and microbial conversion processes is successfully developed for a continuously moving bed biofilter performing tertiary nitrification. The model was calibrated and validated using data from Nether Stowey pilot plant...... on the ammonium removal efficiency, doubling nitrification capacity every 5 degrees C increase. However, at temperatures higher than 20 degrees C, the biofilm thickness starts to decrease due to increased decay rate. The influent nitrogen load was also found to be influential on the filter performance, while...... the hydraulic loading had relatively negligible impact. Overall, the calibrated model can now reliably be used for design and process optimization purposes....

  11. Validation of a functional model for integration of safety into process system design

    DEFF Research Database (Denmark)

    Wu, J.; Lind, M.; Zhang, X.

    2015-01-01

    with the process system functionalities as required for the intended safety applications. To provide the scientific rigor and facilitate the acceptance of qualitative modelling, this contribution focuses on developing a scientifically based validation method for functional models. The Multilevel Flow Modeling (MFM...

  12. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; it estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern: estimating whether an approximation over- or under-fits the original model; invalidating an approximation; and ranking candidate approximations by quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
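The proposed test, measuring an approximation's error against the original model's own stochastic variability, can be illustrated with toy stand-ins for both models; the Bernoulli process below is purely illustrative, not the vector-borne epidemic models of the paper.

```python
import random
import statistics

# Toy "individual-based" stochastic model and its mean-field approximation.

def stochastic_model(n_individuals=1000, p_event=0.3, seed=None):
    """Individual-based model: count of events among n Bernoulli trials."""
    rng = random.Random(seed)
    return sum(rng.random() < p_event for _ in range(n_individuals))

def analytical_approximation(n_individuals=1000, p_event=0.3):
    """Mean-field approximation of the same process."""
    return n_individuals * p_event

# Validation test: compare the approximation's error against the stochastic
# model's intrinsic variability over repeated (seeded) runs.
runs = [stochastic_model(seed=s) for s in range(200)]
mu, sigma = statistics.mean(runs), statistics.stdev(runs)
error = abs(analytical_approximation() - mu)

# Accept the approximation if its error is small relative to the spread of
# the stochastic model itself.
print(error < 2 * sigma)
```

The same comparison, swept over a range of parameter values, gives the general validation test the abstract proposes.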

  13. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    Science.gov (United States)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, needed to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those that could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, using macroseismic observations and maps from past earthquakes in Algeria; and (2) calculation of country-specific vulnerability modifiers, using past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared with average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, based on the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by "as-if" historical scenario simulations of three past earthquakes in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli events. The calculated return periods of the losses for client market portfolio align with the
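The rebuilding cost factors quoted in the abstract (10%, 20%, 35%, 75% and 100% for damage grades 1-5) combine with a damage-grade distribution to give a mean damage ratio. A worked example, with hypothetical grade probabilities; only the cost factors come from the text:

```python
# Rebuilding cost factors by EMS-98 damage grade, as quoted in the abstract.
COST_FACTORS = {1: 0.10, 2: 0.20, 3: 0.35, 4: 0.75, 5: 1.00}

def mean_damage_ratio(grade_probs):
    """Expected repair cost as a fraction of rebuilding value."""
    return sum(p * COST_FACTORS[g] for g, p in grade_probs.items())

# Hypothetical distribution of damage grades for one building class at a
# given shaking intensity (these probabilities are NOT from the model):
probs = {1: 0.30, 2: 0.25, 3: 0.20, 4: 0.15, 5: 0.10}

# 0.30*0.10 + 0.25*0.20 + 0.20*0.35 + 0.15*0.75 + 0.10*1.00 = 0.3625
loss_fraction = mean_damage_ratio(probs)
print(loss_fraction)
```

Multiplying such a ratio by the insured rebuilding value per building class, and summing across the portfolio, is the basic loss calculation a catastrophe model performs per simulated event.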

  14. Towards a realistic approach to validation of reactive transport models for performance assessment

    International Nuclear Information System (INIS)

    Siegel, M.D.

    1993-01-01

    Performance assessment calculations are based on geochemical models that assume that interactions among radionuclides, rocks and groundwaters under natural conditions, can be estimated or bound by data obtained from laboratory-scale studies. The data include radionuclide distribution coefficients, measured in saturated batch systems of powdered rocks, and retardation factors measured in short-term column experiments. Traditional approaches to model validation cannot be applied in a straightforward manner to the simple reactive transport models that use these data. An approach to model validation in support of performance assessment is described in this paper. It is based on a recognition of different levels of model validity and is compatible with the requirements of current regulations for high-level waste disposal. Activities that are being carried out in support of this approach include (1) laboratory and numerical experiments to test the validity of important assumptions inherent in current performance assessment methodologies,(2) integrated transport experiments, and (3) development of a robust coupled reaction/transport code for sensitivity analyses using massively parallel computers

  15. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...

  16. A discussion on validation of hydrogeological models

    International Nuclear Information System (INIS)

    Carrera, J.; Mousavi, S.F.; Usunoff, E.J.; Sanchez-Vila, X.; Galarza, G.

    1993-01-01

    Groundwater flow and solute transport are often driven by heterogeneities that elude easy identification. It is also difficult to select and describe the physico-chemical processes controlling solute behaviour. As a result, definition of a conceptual model involves numerous assumptions both on the selection of processes and on the representation of their spatial variability. Validating a numerical model by comparing its predictions with actual measurements may not be sufficient for evaluating whether or not it provides a good representation of 'reality'. Predictions will be close to measurements, regardless of model validity, if these are taken from experiments that stress well-calibrated model modes. On the other hand, predictions will be far from measurements when model parameters are very uncertain, even if the model is indeed a very good representation of the real system. Hence, we contend that 'classical' validation of hydrogeological models is not possible. Rather, models should be viewed as theories about the real system. We propose to follow a rigorous modeling approach in which different sources of uncertainty are explicitly recognized. The application of one such approach is illustrated by modeling a laboratory uranium tracer test performed on fresh granite, which was used as Test Case 1b in INTRAVAL. (author)

  17. Advanced Reactors-Intermediate Heat Exchanger (IHX) Coupling: Theoretical Modeling and Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Utgikar, Vivek [Univ. of Idaho, Moscow, ID (United States); Sun, Xiaodong [The Ohio State Univ., Columbus, OH (United States); Christensen, Richard [The Ohio State Univ., Columbus, OH (United States); Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-12-29

    The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchanger (IHX) system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the IHX; 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four tasks corresponding to these specific objectives. The first task involved selecting IHX candidates and developing steady-state designs for them. The second task involved modeling the transient and off-normal operation of the reactor-IHX system. The subsequent task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performance of the two prototype heat exchangers designed and fabricated for the project, at steady-state and transient conditions, to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and a newly developed high-temperature molten salt facility.

  18. Experimental validation of TASS/SMR-S critical flow model for the integral reactor SMART

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Si Won; Ra, In Sik; Kim, Kun Yeup [ACT Co., Daejeon (Korea, Republic of); Chung, Young Jong [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-05-15

    An advanced integral PWR, SMART (System-Integrated Modular Advanced ReacTor), is being developed at KAERI. It has a compact size and a relatively small power rating (330 MWt) compared to a conventional reactor. Because new concepts are applied to SMART, experimental and analytical validation is necessary for its safety evaluation. The analytical safety validation is being accomplished with TASS/SMR-S, a safety analysis code for an integral reactor developed by KAERI. TASS/SMR-S uses lumped-parameter, one-dimensional node-and-path modeling for the thermal hydraulic calculation and point kinetics for the reactor power calculation. It has general-purpose models such as a core heat transfer model, a wall heat structure model, a critical flow model, and component models, and it also has many SMART-specific models, such as a once-through helically coiled steam generator model and a condensate heat transfer model. To ensure that the TASS/SMR-S code has the calculation capability for the safety evaluation of SMART, the code should be validated for the specific models against separate-effect test results. In this study, the TASS/SMR-S critical flow model is evaluated by comparison with the SMD (Super Moby Dick) experiment.

  19. The development and validation of a clinical prediction model to determine the probability of MODY in patients with young-onset diabetes.

    Science.gov (United States)

    Shields, B M; McDonald, T J; Ellard, S; Campbell, M J; Hyde, C; Hattersley, A T

    2012-05-01

    Diagnosing MODY is difficult. To date, selection for molecular genetic testing for MODY has used discrete cut-offs of limited clinical characteristics with varying sensitivity and specificity. We aimed to use multiple, weighted, clinical criteria to determine an individual's probability of having MODY, as a crucial tool for rational genetic testing. We developed prediction models using logistic regression on data from 1,191 patients with MODY (n = 594), type 1 diabetes (n = 278) and type 2 diabetes (n = 319). Model performance was assessed by receiver operating characteristic (ROC) curves, cross-validation and validation in a further 350 patients. The models defined an overall probability of MODY using a weighted combination of the most discriminative characteristics. For MODY, compared with type 1 diabetes, these were: lower HbA(1c), parent with diabetes, female sex and older age at diagnosis. MODY was discriminated from type 2 diabetes by: lower BMI, younger age at diagnosis, female sex, lower HbA(1c), parent with diabetes, and not being treated with oral hypoglycaemic agents or insulin. Both models showed excellent discrimination (c-statistic = 0.95 and 0.98, respectively), low rates of cross-validated misclassification (9.2% and 5.3%), and good performance on the external test dataset (c-statistic = 0.95 and 0.94). Using the optimal cut-offs, the probability models improved the sensitivity (91% vs 72%) and specificity (94% vs 91%) for identifying MODY compared with standard diagnostic criteria for MODY. This allows an improved and more rational approach to determining who should have molecular genetic testing.
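The abstract describes a weighted logistic model that converts clinical characteristics into a probability of MODY. A minimal sketch of how such a model turns a weighted sum of features into a probability; the coefficient names and values here are entirely hypothetical placeholders, not the published weights:

```python
import math

def mody_probability(coefficients, intercept, features):
    """Logistic model: map a weighted sum of clinical characteristics
    onto a probability between 0 and 1."""
    z = intercept + sum(coefficients[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for illustration only; the actual weights
# are reported in the paper and are not reproduced here.
coef = {"hba1c": -0.8, "parent_with_diabetes": 1.5,
        "female": 0.4, "age_at_diagnosis": 0.05}
p = mody_probability(coef, intercept=-2.0,
                     features={"hba1c": 7.2, "parent_with_diabetes": 1,
                               "female": 1, "age_at_diagnosis": 21})
assert 0.0 < p < 1.0
```

With a zero weighted sum the model returns exactly 0.5, which is the natural decision midpoint before an optimal cut-off is chosen.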

  20. Development and Validation of Perioperative Risk-Adjustment Models for Hip Fracture Repair, Total Hip Arthroplasty, and Total Knee Arthroplasty.

    Science.gov (United States)

    Schilling, Peter L; Bozic, Kevin J

    2016-01-06

    Comparing outcomes across providers requires risk-adjustment models that account for differences in case mix. The burden of data collection from the clinical record can make risk-adjusted outcomes difficult to measure. The purpose of this study was to develop risk-adjustment models for hip fracture repair (HFR), total hip arthroplasty (THA), and total knee arthroplasty (TKA) that weigh adequacy of risk adjustment against data-collection burden. We used data from the American College of Surgeons National Surgical Quality Improvement Program to create derivation cohorts for HFR (n = 7000), THA (n = 17,336), and TKA (n = 28,661). We developed logistic regression models for each procedure using age, sex, American Society of Anesthesiologists (ASA) physical status classification, comorbidities, laboratory values, and vital signs-based comorbidities as covariates, and validated the models with use of data from 2012. The derivation models' C-statistics for mortality were 80%, 81%, 75%, and 92% and for adverse events were 68%, 68%, 60%, and 70% for HFR, THA, TKA, and combined procedure cohorts. Age, sex, and ASA classification accounted for a large share of the explained variation in mortality (50%, 58%, 70%, and 67%) and adverse events (43%, 45%, 46%, and 68%). For THA and TKA, these three variables were nearly as predictive as models utilizing all covariates. HFR model discrimination improved with the addition of comorbidities and laboratory values; among the important covariates were functional status, low albumin, high creatinine, disseminated cancer, dyspnea, and body mass index. Model performance was similar in validation cohorts. Risk-adjustment models using data from health records demonstrated good discrimination and calibration for HFR, THA, and TKA. It is possible to provide adequate risk adjustment using only the most predictive variables commonly available within the clinical record. This finding helps to inform the trade-off between model performance and data-collection burden.
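The discrimination figures quoted above are C-statistics. For reference, the C-statistic of a binary risk model equals the concordance probability: the chance that a randomly chosen positive case is assigned a higher predicted risk than a randomly chosen negative case, counting ties as half. A minimal pairwise sketch (quadratic in sample size, fine for illustration):

```python
def c_statistic(labels, scores):
    """Concordance (C-statistic, equivalent to ROC AUC): the fraction
    of positive/negative pairs in which the positive case receives the
    higher predicted risk, counting tied scores as half a pair."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Four patients, one discordant pair out of four -> C = 0.75
assert c_statistic([1, 1, 0, 0], [0.9, 0.4, 0.5, 0.1]) == 0.75
```

A value of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which is why the 92% combined-cohort mortality figure indicates strong discrimination.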

  1. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  2. Wave Tank Testing and Model Validation of an Autonomous Wave Energy Converter

    Directory of Open Access Journals (Sweden)

    Bret Bosma

    2015-08-01

    Full Text Available A key component in bringing ocean wave energy converters from concept to commercialization is the building and testing of scaled prototypes to provide model validation. A one quarter scale prototype of an autonomous two body heaving point absorber was modeled, built, and tested for this work. Wave tank testing results are compared with two hydrodynamic and system models—implemented in both ANSYS AQWA and MATLAB/Simulink—and show model validation over certain regions of operation. This work will serve as a guide for future developers of wave energy converter devices, providing insight in taking their design from concept to prototype stage.

  3. Development and validation of a combined phased acoustical radiosity and image source model for predicting sound fields in rooms.

    Science.gov (United States)

    Marbjerg, Gerd; Brunskog, Jonas; Jeong, Cheol-Ho; Nilsson, Erling

    2015-09-01

    A model, combining acoustical radiosity and the image source method, including phase shifts on reflection, has been developed. The model is denoted Phased Acoustical Radiosity and Image Source Method (PARISM), and it has been developed in order to be able to model both specular and diffuse reflections with complex-valued and angle-dependent boundary conditions. This paper mainly describes the combination of the two models and the implementation of the angle-dependent boundary conditions. It furthermore describes how a pressure impulse response is obtained from the energy-based acoustical radiosity by regarding the model as being stochastic. Three methods of implementation are proposed and investigated, and finally, recommendations are made for their use. Validation of the image source method is done by comparison with finite element simulations of a rectangular room with a porous absorber ceiling. Results from the full model are compared with results from other simulation tools and with measurements. The comparisons of the full model are done for real-valued and angle-independent surface properties. The proposed model agrees well with both the measured results and the alternative theories, and furthermore shows a more realistic spatial variation than energy-based methods due to the fact that interference is considered.

  4. Deviant behavior variety scale: development and validation with a sample of Portuguese adolescents

    Directory of Open Access Journals (Sweden)

    Cristina Sanches

    2016-01-01

    Full Text Available Abstract This study presents the development and analysis of the psychometric properties of the Deviant Behavior Variety Scale (DBVS). Participants were 861 Portuguese adolescents (54 % female), aged between 12 and 19 years old. Two alternative models were tested using confirmatory factor analysis. Although both models showed good fit indices, the two-factor model did not present discriminant validity. Further results provided evidence for the factorial and convergent validity of the single-factor structure of the DBVS, which also showed good internal consistency. Criterion validity was evaluated through the association with related variables, such as age and school failure, as well as the scale's ability to capture group differences, namely between genders and school retentions, and finally by comparing a sub-group of convicted adolescents with a group of non-convicted ones regarding their engagement in delinquent activities. Overall, the scale presented good psychometric properties, with results supporting that the DBVS is a valid and reliable self-report measure for evaluating adolescents' involvement in deviance.

  5. Implementation and validation of the condensation model for containment hydrogen distribution studies

    International Nuclear Information System (INIS)

    Ravva, Srinivasa Rao; Iyer, Kannan N.; Gupta, S.K.; Gaikwad, Avinash J.

    2014-01-01

    Highlights: • A condensation model based on diffusion was implemented in FLUENT. • Validation of the condensation model for H2 distribution studies was performed. • Multi-component diffusion is used in the present work. • An appropriate grid and turbulence model were identified. - Abstract: This paper presents the implementation details of a condensation model in the CFD code FLUENT and its validation, so that it can be used in performing containment hydrogen distribution studies. In such studies, computational fluid dynamics simulations are necessary for obtaining accurate predictions. While steam condensation plays an important role, commercial CFD codes such as FLUENT do not have an in-built condensation model. Therefore, a condensation model was developed and implemented in the FLUENT code through user-defined functions (UDFs) for the sink terms in the mass, momentum, energy and species balance equations, together with the associated turbulence quantities, viz., kinetic energy and dissipation rate. The implemented model was validated against the ISP-47 test of the TOSQAN facility using the standard wall function and enhanced wall treatment approaches. The most suitable grid size and turbulence model for low-density gas (He) distribution studies are also brought out in this paper.
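The sink terms in such diffusion-based condensation models are driven by steam transport toward the cold wall. As a purely illustrative sketch (not the authors' UDF, whose exact formulation is given in the paper), a diffusion-driven condensation mass flux is often written as a mass transfer coefficient multiplied by the steam density difference between the bulk mixture and the saturated state at the wall:

```python
def condensation_mass_flux(mass_transfer_coeff, rho_steam_bulk, rho_steam_wall):
    """Illustrative diffusion-driven condensation flux (kg/m^2/s):
    proportional to the steam density excess of the bulk gas mixture
    over the saturated condition at the wall. No condensation occurs
    when the wall-side density is higher (clamped to zero)."""
    return mass_transfer_coeff * max(rho_steam_bulk - rho_steam_wall, 0.0)

# Bulk steam denser than the saturated wall value -> positive sink term
assert condensation_mass_flux(2.0, 3.0, 1.0) == 4.0
```

Multiplying such a flux by the local wall-cell interface area and dividing by the cell volume gives the volumetric mass sink that a UDF would return to the solver.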

  6. The development and psychometric validation of the Ethical Awareness Scale.

    Science.gov (United States)

    Milliken, Aimee; Ludlow, Larry; DeSanto-Madeya, Susan; Grace, Pamela

    2018-04-19

    To develop and psychometrically assess the Ethical Awareness Scale using Rasch measurement principles and a Rasch item response theory model. Critical care nurses must be equipped to provide good (ethical) patient care. This requires ethical awareness, which involves recognizing the ethical implications of all nursing actions. Ethical awareness is imperative in successfully addressing patient needs. Evidence suggests that the ethical import of everyday issues may often go unnoticed by nurses in practice. Assessing nurses' ethical awareness is a necessary first step in preparing nurses to identify and manage ethical issues in the highly dynamic critical care environment. A cross-sectional design was used in two phases of instrument development. Using Rasch principles, an item bank representing nursing actions was developed (33 items). Content validity testing was performed. Eighteen items were selected for face validity testing. Two rounds of operational testing were performed with critical care nurses in Boston between February and April 2017. A Rasch analysis suggests sufficient item invariance across samples and sufficient construct validity. The analysis further demonstrates a progression of items uniformly along a hierarchical continuum; items that match respondent ability levels; response categories that are sufficiently used; and adequate internal consistency. Mean ethical awareness scores were in the low/moderate range. The results suggest the Ethical Awareness Scale is a psychometrically sound, reliable and valid measure of ethical awareness in critical care nurses. © 2018 John Wiley & Sons Ltd.
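The scale above is built on a Rasch item response theory model. For reference, the standard dichotomous Rasch model gives the probability of endorsing an item from the difference between person ability and item difficulty, both expressed in logits; a minimal sketch:

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch model: probability that a person with the
    given ability (logits) endorses an item of the given difficulty
    (logits). When ability equals difficulty the probability is 0.5."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A person exactly at the item's difficulty has a 50% endorsement chance
assert rasch_probability(1.0, 1.0) == 0.5
```

This is what underlies the "items that match respondent ability levels" claim: a well-targeted item bank places item difficulties near the abilities of the sampled nurses, where the model is most informative.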

  7. Validation of Symptom Validity Tests Using a "Child-model" of Adult Cognitive Impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P. E. J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children's cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  8. Modeling and validating HL7 FHIR profiles using semantic web Shape Expressions (ShEx).

    Science.gov (United States)

    Solbrig, Harold R; Prud'hommeaux, Eric; Grieve, Grahame; McKenzie, Lloyd; Mandel, Joshua C; Sharma, Deepak K; Jiang, Guoqian

    2017-03-01

    HL7 Fast Healthcare Interoperability Resources (FHIR) is an emerging open standard for the exchange of electronic healthcare information. FHIR resources are defined in a specialized modeling language. FHIR instances can currently be represented in either XML or JSON. The FHIR and Semantic Web communities are developing a third FHIR instance representation format in Resource Description Framework (RDF). Shape Expressions (ShEx), a formal RDF data constraint language, is a candidate for describing and validating the FHIR RDF representation. Create a FHIR to ShEx model transformation and assess its ability to describe and validate FHIR RDF data. We created the methods and tools that generate the ShEx schemas modeling the FHIR to RDF specification being developed by HL7 ITS/W3C RDF Task Force, and evaluated the applicability of ShEx in the description and validation of FHIR to RDF transformations. The ShEx models contributed significantly to workgroup consensus. Algorithmic transformations from the FHIR model to ShEx schemas and FHIR example data to RDF transformations were incorporated into the FHIR build process. ShEx schemas representing 109 FHIR resources were used to validate 511 FHIR RDF data examples from the Standards for Trial Use (STU 3) Ballot version. We were able to uncover unresolved issues in the FHIR to RDF specification and detect 10 types of errors and root causes in the actual implementation. The FHIR ShEx representations have been included in the official FHIR web pages for the STU 3 Ballot version since September 2016. ShEx can be used to define and validate the syntax of a FHIR resource, which is complementary to the use of RDF Schema (RDFS) and Web Ontology Language (OWL) for semantic validation. ShEx proved useful for describing a standard model of FHIR RDF data. The combination of a formal model and a succinct format enabled comprehensive review and automated validation. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. The Development and Validation of a Mental Toughness Scale for Adolescents

    Science.gov (United States)

    McGeown, Sarah; St. Clair-Thompson, Helen; Putwain, David W.

    2018-01-01

    The present study examined the validity of a newly developed instrument, the Mental Toughness Scale for Adolescents, which examines the attributes of challenge, commitment, confidence (abilities and interpersonal), and control (life and emotion). The six-factor model was supported using exploratory factor analysis (n = 373) and confirmatory factor…

  10. Validation of mentorship model for newly qualified professional ...

    African Journals Online (AJOL)

    Newly qualified professional nurses (NQPNs) allocated to community health care services require the use of validated model to practice independently. Validation was done to adapt and assess if the model is understood and could be implemented by NQPNs and mentors employed in community health care services.

  11. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight in scalability and performance of future deployed networks. Because validated models of key Cisco equipment

  12. Validation of symptom validity tests using a "child-model" of adult cognitive impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P.E.J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children’s cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  13. Development and validation of a laparoscopic hysterectomy cuff closure simulation model for surgical training.

    Science.gov (United States)

    Tunitsky-Bitton, Elena; Propst, Katie; Muffly, Tyler

    2016-03-01

    The number of robotically assisted hysterectomies is increasing, and therefore, the opportunities for trainees to become competent in performing traditional laparoscopic hysterectomy are decreasing. Simulation-based training is ideal for filling this gap in training. The objective of the study was to design a surgical model for training in laparoscopic vaginal cuff closure and to present evidence of its validity and reliability as an assessment and training tool. Participants included gynecology staff and trainees at 2 tertiary care centers. Experienced surgeons were also recruited at the combined International Urogynecologic Association and American Urogynecologic Society scientific meeting. Participants included 19 experts and 21 trainees. All participants were recorded using the laparoscopic hysterectomy cuff closure simulation model. The model was constructed using an advanced uterine manipulation system with a sacrocolpopexy tip/vaginal stent, and a vaginal cuff constructed from neoprene material and lined with a swimsuit material (nylon and spandex), secured to the vaginal stent with a plastic cable tie. The uterine manipulation system was attached to the Fundamentals of Laparoscopic Surgery laparoscopic box trainer using a metal bracket. Performance was evaluated using the Global Operative Assessment of Laparoscopic Skills scale. In addition, needle handling, knot tying, and incorporation of the epithelial edge were also evaluated. The Student t test was used to compare the scores and the operating times between the groups. Interrater reliability between the scores of the 2 masked experts was measured using the intraclass correlation coefficient. Total and annual experience with laparoscopic suturing and specifically vaginal cuff closure varied greatly among the participants. 
For the construct validity, the participants in the expert group received significantly higher scores in each of the domains of the Global Operative Assessment of Laparoscopic Skills

  14. Natural analogues and radionuclide transport model validation

    International Nuclear Information System (INIS)

    Lever, D.A.

    1987-08-01

    In this paper, some possible roles for natural analogues are discussed from the point of view of those involved with the development of mathematical models for radionuclide transport and with the use of these models in repository safety assessments. The characteristic features of a safety assessment are outlined in order to address the questions of where natural analogues can be used to improve our understanding of the processes involved and where they can assist in validating the models that are used. Natural analogues have the potential to provide useful information about some critical processes, especially long-term chemical processes and migration rates. There is likely to be considerable uncertainty and ambiguity associated with the interpretation of natural analogues, and thus it is their general features which should be emphasized, and models with appropriate levels of sophistication should be used. Experience gained in modelling the Koongarra uranium deposit in northern Australia is drawn upon. (author)

  15. Predicting risk behaviors: development and validation of a diagnostic scale.

    Science.gov (United States)

    Witte, K; Cameron, K A; McKeon, J K; Berkowitz, J M

    1996-01-01

    The goal of this study was to develop and validate the Risk Behavior Diagnosis (RBD) Scale for use by health care providers and practitioners interested in promoting healthy behaviors. Theoretically guided by the Extended Parallel Process Model (EPPM; a fear appeal theory), the RBD scale was designed to work in conjunction with an easy-to-use formula to determine which types of health risk messages would be most appropriate for a given individual or audience. Because some health risk messages promote behavior change and others backfire, this type of scale offers guidance to practitioners on how to develop the best persuasive message possible to motivate healthy behaviors. The results of the study demonstrate the RBD scale to have a high degree of content, construct, and predictive validity. Specific examples and practical suggestions are offered to facilitate use of the scale for health practitioners.
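The "easy-to-use formula" paired with the RBD scale is commonly described, in EPPM terms, as a discriminating value obtained by subtracting summed perceived-threat item scores from summed perceived-efficacy item scores. A hedged sketch under that assumption (the exact scoring rule and item counts are specified in the paper, not here):

```python
def rbd_discriminating_value(efficacy_items, threat_items):
    """EPPM-based discriminating value as commonly described for the
    RBD scale: summed efficacy scores minus summed threat scores.
    A positive value suggests danger control (risk messages are likely
    to motivate protective behavior); a negative value suggests fear
    control (strong fear appeals may backfire and should emphasize
    efficacy instead)."""
    return sum(efficacy_items) - sum(threat_items)

# Efficacy outweighs threat -> positive value, danger-control profile
assert rbd_discriminating_value([4, 5, 4], [3, 3, 3]) == 4
```

The sign of the value, rather than its magnitude, is what drives the practitioner's message-design decision.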

  16. Development and Cross-Validation of Equation for Estimating Percent Body Fat of Korean Adults According to Body Mass Index

    Directory of Open Access Journals (Sweden)

    Hoyong Sung

    2017-06-01

    Full Text Available Background: Using BMI as an independent variable is the easiest way to estimate percent body fat. Thus far, few studies have investigated the development and cross-validation of an equation for estimating the percent body fat of Korean adults according to the BMI. The goals of this study were the development and cross-validation of an equation for estimating the percent fat of representative Korean adults using the BMI. Methods: Samples were obtained from the Korea National Health and Nutrition Examination Survey between 2008 and 2011. The samples from 2008-2009 and 2010-2011 were labeled as the validation group (n=10,624) and the cross-validation group (n=8,291), respectively. The percent fat was measured using dual-energy X-ray absorptiometry, and body mass index, gender, and age were included as independent variables to estimate the measured percent fat. The coefficient of determination (R²), standard error of estimation (SEE), and total error (TE) were calculated to examine the accuracy of the developed equation. Results: The cross-validated R² was 0.731 for Model 1 and 0.735 for Model 2. The SEE was 3.978 for Model 1 and 3.951 for Model 2. The equations developed in this study are more accurate for estimating the percent fat of the cross-validation group than those previously published by other researchers. Conclusion: The newly developed equations are comparatively accurate for the estimation of the percent fat of Korean adults.
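The accuracy measures named above (R², SEE, TE) can all be computed from paired measured and predicted percent-fat values. A minimal sketch; note the SEE denominator of n − 2 assumes a simple regression with two estimated parameters, and conventions vary across studies:

```python
import math

def validation_stats(measured, predicted):
    """Cross-validation accuracy statistics for a prediction equation:
    R-squared (share of variance explained), standard error of
    estimate (SEE, residual scatter about the regression), and total
    error (TE, root mean squared difference)."""
    n = len(measured)
    mean_y = sum(measured) / n
    ss_tot = sum((y - mean_y) ** 2 for y in measured)
    ss_res = sum((y - p) ** 2 for y, p in zip(measured, predicted))
    r2 = 1.0 - ss_res / ss_tot
    see = math.sqrt(ss_res / (n - 2))  # n - 2: simple-regression convention
    te = math.sqrt(ss_res / n)         # RMS of prediction errors
    return r2, see, te

# Perfect predictions give R^2 = 1 and zero error
assert validation_stats([10.0, 20.0, 30.0, 40.0],
                        [10.0, 20.0, 30.0, 40.0]) == (1.0, 0.0, 0.0)
```

Because TE penalizes systematic bias as well as scatter, it is typically reported alongside SEE when an equation is applied to a new (cross-validation) sample.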

  17. Testing Software Development Project Productivity Model

    Science.gov (United States)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis based on Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow for practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control

  18. Development and validation of advanced theoretical modeling for churn-turbulent flows and subsequent transitions

    Energy Technology Data Exchange (ETDEWEB)

    Montoya Zabala, Gustavo Adolfo

    2015-07-01

    The applicability of CFD codes for two-phase flows has always been limited to special cases due to the very complex nature of the interface. Due to their tremendous computational cost, methods based on direct resolution of the interface are not applicable to most problems of practical relevance. Instead, averaging procedures are commonly used for these applications, such as the Eulerian-Eulerian approach, which necessarily means losing detailed information on the interfacial structure. In order to allow widespread application of the two-fluid approach, closure models are required to reintroduce in the simulations the correct interfacial mass, momentum, and heat transfer. It is evident that such closure models will strongly depend on the specific flow pattern. When considering vertical pipe flow with low gas volume flow rates, bubbly flow occurs. With increasing gas volume flow rates larger bubbles are generated by bubble coalescence, which further leads to transition to slug, churn-turbulent, and annular flow. Considering, as an example, a heated tube producing steam by evaporation, as in the case of a vertical steam generator, all these flow patterns including transitions are expected to occur in the system. Despite extensive attempts, robust and accurate simulation approaches for such conditions are still lacking. The purpose of this dissertation is the development, testing, and validation of a multifield model for adiabatic gas-liquid flows at high gas volume fractions, for which a multiple-size bubble approach has been implemented by separating the gas structures into a specified number of groups, each of which represents a prescribed range of sizes. A fully-resolved continuous gas phase is also computed, and represents all the gas structures which are large enough to be resolved within the computational mesh. The concept, known as GENeralized TwO Phase flow or GENTOP, is formulated as an extension to the bubble population balance approach known as the

  19. Development and validation of advanced theoretical modeling for churn-turbulent flows and subsequent transitions

    International Nuclear Information System (INIS)

    Montoya Zabala, Gustavo Adolfo

    2015-01-01

    The applicability of CFD codes for two-phase flows has always been limited to special cases due to the very complex nature of the interface. Due to their tremendous computational cost, methods based on direct resolution of the interface are not applicable to most problems of practical relevance. Instead, averaging procedures are commonly used for these applications, such as the Eulerian-Eulerian approach, which necessarily means losing detailed information on the interfacial structure. In order to allow widespread application of the two-fluid approach, closure models are required to reintroduce in the simulations the correct interfacial mass, momentum, and heat transfer. It is evident that such closure models will strongly depend on the specific flow pattern. When considering vertical pipe flow with low gas volume flow rates, bubbly flow occurs. With increasing gas volume flow rates larger bubbles are generated by bubble coalescence, which further leads to transition to slug, churn-turbulent, and annular flow. Considering, as an example, a heated tube producing steam by evaporation, as in the case of a vertical steam generator, all these flow patterns including transitions are expected to occur in the system. Despite extensive attempts, robust and accurate simulation approaches for such conditions are still lacking. The purpose of this dissertation is the development, testing, and validation of a multifield model for adiabatic gas-liquid flows at high gas volume fractions, for which a multiple-size bubble approach has been implemented by separating the gas structures into a specified number of groups, each of which represents a prescribed range of sizes. A fully-resolved continuous gas phase is also computed, and represents all the gas structures which are large enough to be resolved within the computational mesh. The concept, known as GENeralized TwO Phase flow or GENTOP, is formulated as an extension to the bubble population balance approach known as the

  20. Laboratory research program to aid in developing and testing the validity of conceptual models for flow and transport through unsaturated porous media

    International Nuclear Information System (INIS)

    Glass, R.J.

    1991-01-01

    As part of the Yucca Mountain Project, a laboratory research program is being developed at Sandia National Laboratories that will integrate fundamental physical experimentation with conceptual model formulation and mathematical modeling and aid in subsequent model validation for unsaturated zone water and contaminant transport. Experimental systems are being developed to explore flow and transport processes and assumptions of fundamental importance to various conceptual models. Experimentation will run concurrently in two types of systems: fractured and nonfractured tuffaceous systems; and analogue systems having specific characteristics of the tuff systems but designed to maximize experimental control and resolution of data measurement. Areas in which experimentation currently is directed include infiltration flow instability, water and solute movement in unsaturated fractures, fracture-matrix interaction, and scaling laws to define effective large-scale properties for heterogeneous, fractured media. 16 refs

  1. Validation of designing tools as part of nuclear pump development process

    International Nuclear Information System (INIS)

    Klemm, T.; Sehr, F.; Spenner, P.; Fritz, J.

    2010-01-01

    Nuclear pumps are characterized by high safety standards, operational reliability, and long life cycles. In the design process it is common practice to use a down-scaled model pump to qualify operating data and to simulate exceptional operating conditions. Where the pump design is modified with respect to existing reactor coolant pumps, a model pump is required to develop the methods and tools used to design the full-scale pump. In the presented case it has a geometry scale of 1:2 relative to the full-scale pump. The experimental data from the model pump form the basis for validating the methods and tools applied in the design process of the full-scale pump. In this paper the selection of qualified tools and the validation process are demonstrated using a cooling circuit as an example. The aim is to predict the resulting flow rate. Tools are chosen for the different components depending on their benefit-to-effort ratio. For elementary flow phenomena, such as fluid flow in straight pipes or gaps, analytic or empirical laws can be used; for more complex flow situations, numerical methods are employed. The main focus is on the validation of the applied numerical flow simulation. In this case it is not sufficient to compare integral data alone; the local flow structure of the numerical simulation must also be validated to avoid systematic errors in CFD model generation. Because of the complex design, internal flow measurements are not possible, so simple comparisons with similar flow test cases are used instead. The results of this study show that the flow simulation data closely match the measured integral pump and test-case data. With this validation it is now possible to qualify CFD simulation as a design tool for the full-scale pump in similar cooling circuits. (authors)

  2. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the application of security policy models, we are concerned with the implementation of a certain policy, taking into consideration the required standards, through the attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy in an information technology environment is a condition sine qua non for the strategic management process of any organization. On this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer, who strives to keep technologies up to date in order to meet all-inclusive business continuity planning policies. For such a policy to be effective, therefore, it has to be fully embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo. The sampling was based on the representativeness of these industries, particularly with regard to each one's implementation of information technology in the region. The study concludes by presenting evidence of the discriminant validity of four key dimensions of security policy, namely: Physical Security, Logical Access Security, Administrative Security, and Legal and Environmental Security. Analysis of the Cronbach's alpha structure of these security items attests not only that the capacity of these industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.

  3. Deployable and Conformal Planar Micro-Devices: Design and Model Validation

    Directory of Open Access Journals (Sweden)

    Jinda Zhuang

    2014-08-01

    Full Text Available We report a design concept for a deployable planar microdevice and the modeling and experimental validation of its mechanical behavior. The device consists of foldable membranes that are suspended between flexible stems and actuated by push-pull wires. Such a deployable device can be introduced into a region of interest in its compact “collapsed” state and then deployed to conformally cover a large two-dimensional surface area for minimally invasive biomedical operations and other engineering applications. We develop and experimentally validate theoretical models based on the energy minimization approach to examine the conformality and figures of merit of the device. The experimental results obtained using model contact surfaces agree well with the prediction and quantitatively highlight the importance of the membrane bending modulus in controlling surface conformality. The present study establishes an early foundation for the mechanical design of this and related deployable planar microdevice concepts.

  4. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    Science.gov (United States)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long-range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculating interface loads as well as the natural frequencies and mode shapes used for guidance, navigation, and control (GNC). Because of program and schedule constraints, a single modal test of the SLS will be performed while it is bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be used to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from the modal test. However, the question remains whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique for developing a quantitative set of validation metrics based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ, UP, and validation in the literature, but very little on propagating uncertainties from requirements, so most validation metrics are "rules of thumb"; this research seeks to produce more reasoned metrics. One of the main assumptions used to achieve this is that the uncertainty in the modeling of the fixed boundary condition is accurate, so that same uncertainty can be used in propagating from the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the use of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
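    The Monte Carlo dispersion and limit-state ideas described above can be sketched on a toy single-degree-of-freedom model. All numbers here (stiffness, mass, dispersion levels, the 5 Hz requirement) are illustrative assumptions, not SLS values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Nominal single-DOF surrogate of a vehicle mode (hypothetical values)
k_nom = 2.0e6   # N/m, nominal stiffness
m_nom = 1.5e3   # kg, nominal mass
f_req = 5.0     # Hz, assumed requirement: the mode must stay above this

n = 100_000
# Disperse properties per assumed uncertainties (5% stiffness, 2% mass)
k = k_nom * (1 + 0.05 * rng.standard_normal(n))
m = m_nom * (1 + 0.02 * rng.standard_normal(n))

f = np.sqrt(k / m) / (2 * np.pi)   # natural frequency of each sampled model, Hz

# Limit-state formulation: g < 0 means the requirement is violated
g = f - f_req
p_fail = np.mean(g < 0)
print(f"mean frequency: {f.mean():.2f} Hz, P(violation): {p_fail:.6f}")
```

    The violation probability estimated this way can then be compared directly against the reliability level implied by the requirement, rather than against a rule-of-thumb frequency margin.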

  5. Crack Detection in Fibre Reinforced Plastic Structures Using Embedded Fibre Bragg Grating Sensors: Theory, Model Development and Experimental Validation

    DEFF Research Database (Denmark)

    Pereira, Gilmar Ferreira; Mikkelsen, Lars Pilgaard; McGugan, Malcolm

    2015-01-01

    properties. When applying this concept to different structures, sensor systems and damage types, a combination of damage mechanics, monitoring technology, and modelling is required. The primary objective of this article is to demonstrate such a combination. This article is divided into three main topics......: the damage mechanism (delamination of FRP), the structural health monitoring technology (fibre Bragg gratings to detect delamination), and the finite element method model of the structure that incorporates these concepts into a final, integrated damage-monitoring concept. A novel method for assessing...... by the end-user. Conjointly, a novel model for sensor output prediction (a virtual sensor) was developed using this FBG sensor crack-monitoring concept and implemented in a finite element method code. The monitoring method was demonstrated and validated using glass fibre double cantilever beam specimens...

  6. A multibody motorcycle model with rigid-ring tyres: formulation and validation

    Science.gov (United States)

    Leonelli, Luca; Mancinelli, Nicolò

    2015-06-01

    The aim of this paper is the development and validation of a three-dimensional multibody motorcycle model including a rigid-ring tyre model, taking into account both the slopes and elevation of the road surface. In order to achieve accurate assessment of ride and handling performances of a road racing motorcycle, a tyre model capable of reproducing the dynamic response to actual road excitation is required. While a number of vehicle models with such feature are available for car application, the extension to the motorcycle modelling has not been addressed yet. To do so, a novel parametrisation for the general motorcycle kinematics is proposed, using a mixed reference point and relative coordinates approach. The resulting description, developed in terms of dependent coordinates, makes it possible to include the rigid-ring kinematics as well as road elevation and slopes, without affecting computational efficiency. The equations of motion for the whole multibody system are derived symbolically and the constraint equations arising from the dependent coordinate formulation are handled using the position and velocity vector projection technique. The resulting system of equations is integrated in time domain using a standard ordinary differential equation (ODE) algorithm. Finally, the model is validated with respect to experimentally measured data in both time and frequency domains.

  7. Development and validation of double and single Wiebe function for multi-injection mode Diesel engine combustion modelling for hardware-in-the-loop applications

    International Nuclear Information System (INIS)

    Maroteaux, Fadila; Saad, Charbel; Aubertin, Fabrice

    2015-01-01

    Highlights: • Modelling of Diesel engine combustion in multi-injection mode was conducted. • Double and single Wiebe correlations for the pilot, main, and post-combustion processes were calibrated. • Ignition delay time correlations were developed and calibrated using experimental data for each injection. • The complete in-cylinder model has been applied successfully to real-time simulation on a HiL test bed. - Abstract: The improvement of Diesel engine performance in terms of fuel consumption and pollutant emissions has a large impact on the management system and diagnostic procedures. Validation and testing of engine performance can benefit from the use of theoretical models, which reduce development time and costs, and a Hardware-in-the-Loop (HiL) test bench is a suitable way to achieve these objectives. However, the increasing complexity of management systems raises challenges for the development of highly reduced physical models able to run in real-time applications. This paper presents an extension of a previously developed phenomenological Diesel combustion model suitable for real-time applications on a HiL test bench. In the earlier study, the modelling effort was targeted at high engine speeds, with a very short computational time window, where the engine operates with a single injection. In the present work, in-cylinder processes at low and medium engine speeds with multi-injection are modelled. To reach an adequate computational time, the combustion progress during the pilot and main injection periods has been treated with a double Wiebe function, while the post-combustion period has required only a single Wiebe function. This paper describes the basic system models and their calibration and validation against experimental data. The use of the developed correlations for the Wiebe coefficients and ignition delay times of each combustion phase, included in the in-cylinder crank-angle global model, is applied for the prediction
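    For illustration, the standard Wiebe mass-fraction-burned correlation referred to above has the form x_b = 1 − exp[−a((θ−θ0)/Δθ)^(m+1)], and a double Wiebe is a weighted sum of two such curves (e.g. premixed plus diffusion phases). The sketch below uses hypothetical coefficients, not the calibrated values from the paper:

```python
import numpy as np

def wiebe(theta, theta0, dtheta, a=6.9, m=2.0):
    """Single Wiebe mass-fraction-burned curve; a=6.9 gives ~99.9% burn at theta0+dtheta."""
    x = np.clip((theta - theta0) / dtheta, 0.0, None)   # zero before start of combustion
    return 1.0 - np.exp(-a * x ** (m + 1))

def double_wiebe(theta, beta, theta0, dth1, m1, dth2, m2, a=6.9):
    """Weighted sum of two Wiebe curves, e.g. a fast premixed and a slow diffusion phase."""
    return beta * wiebe(theta, theta0, dth1, a, m1) + (1 - beta) * wiebe(theta, theta0, dth2, a, m2)

theta = np.linspace(-20, 60, 500)   # crank angle, deg (illustrative range)
xb = double_wiebe(theta, beta=0.35, theta0=-5, dth1=8, m1=1.5, dth2=45, m2=0.8)
print(f"burned fraction at 60 deg: {xb[-1]:.3f}")
```

    In a HiL-style model, curves like these replace a detailed chemistry solver, which is what makes real-time execution feasible.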

  8. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through thickness stress distributions by validation to experimental measurements. Three austenitic stainless steel butt-welds cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available then a mixed hardening model should be used


  10. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret; Yu, Yi-Hsiang

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation are necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  11. Validation study of safety assessment model for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Munakata, Masahiro; Takeda, Seiji; Kimura, Hideo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-12-01

    The JAERI-AECL collaborative research program has been conducted to validate groundwater flow and radionuclide transport models for safety assessment. JAERI has developed a geostatistical model for radionuclide transport through heterogeneous geological media and verified it using the results of field tracer tests. The simulated tracer plumes explain the experimental tracer plumes favorably. A regional groundwater flow and transport model, using site-scale parameters obtained from the tracer tests, has been verified by comparing simulation results with observations of natural environmental tracers. (author)

  12. Development and validation of effective models for simulation of stratification and mixing phenomena in a pool of water

    Energy Technology Data Exchange (ETDEWEB)

    Li, H.; Kudinov, P.; Villanueva, W. (Royal Institute of Technology (KTH). Div. of Nuclear Power Safety (Sweden))

    2011-06-15

    's physical models and numerical schemes, (b) propose necessary improvements in GOTHIC sub-grid-scale modeling, and (c) validate the proposed models. Results obtained with the EHS model show that GOTHIC can predict the development of thermal stratification in the pool if adequate grid resolution is provided. An equation for the effective momentum is proposed based on feasibility studies of the EMS model and analysis of the data measured in a test with a chugging regime of steam injection. An experiment with higher resolution in space and time of the oscillatory flow inside the blowdown pipe is highly desirable to uniquely determine the model coefficients. Implementation of the EHS/EMS models in GOTHIC and their validation against a new PPOOLEX experiment is underway. (Author)


  14. Validation of the Activities of Community Transportation model for individuals with cognitive impairments.

    Science.gov (United States)

    Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Hung, Pei-Fang

    2009-01-01

    To develop a theoretical, functional model of community navigation for individuals with cognitive impairments: the Activities of Community Transportation (ACTs). Iterative design using qualitative methods (i.e. document review, focus groups and observations). Four agencies providing travel training to adults with cognitive impairments in the USA participated in the validation study. A thorough document review and series of focus groups led to the development of a comprehensive model (ACTs Wheels) delineating the requisite steps and skills for community navigation. The model was validated and updated based on observations of 395 actual trips by travellers with navigational challenges from the four participating agencies. Results revealed that the 'ACTs Wheel' models were complete and comprehensive. The 'ACTs Wheels' represent a comprehensive model of the steps needed to navigate to destinations using paratransit and fixed-route public transportation systems for travellers with cognitive impairments. Suggestions are made for future investigations of community transportation for this population.

  15. Validation of a FAST Model of the SWAY Prototype Floating Wind Turbine

    Energy Technology Data Exchange (ETDEWEB)

    Koh, J. H. [Nanyang Technological Univ. (Singapore); Ng, E. Y. K. [Nanyang Technological Univ. (Singapore); Robertson, Amy [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jonkman, Jason [National Renewable Energy Lab. (NREL), Golden, CO (United States); Driscoll, Frederick [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-06-01

    As part of a collaboration of the National Renewable Energy Laboratory (NREL) and SWAY AS, NREL installed scientific wind, wave, and motion measurement equipment on the spar-type 1/6.5th-scale prototype SWAY floating offshore wind system. The equipment enhanced SWAY's data collection and allowed SWAY to verify the concept and NREL to validate a FAST model of the SWAY design in an open-water condition. Nanyang Technological University (NTU), in collaboration with NREL, assisted with the validation. This final report gives an overview of the SWAY prototype and NREL and NTU's efforts to validate a model of the system. The report provides a summary of the different software tools used in the study, the modeling strategies, and the development of a FAST model of the SWAY prototype wind turbine, including justification of the modeling assumptions. Because of uncertainty in system parameters and modeling assumptions due to the complexity of the design, several system properties were tuned to better represent the system and improve the accuracy of the simulations. Calibration was performed using data from a static equilibrium test and free-decay tests.

  16. Modeling and validation of existing VAV system components

    Energy Technology Data Exchange (ETDEWEB)

    Nassif, N.; Kajl, S.; Sabourin, R. [Ecole de Technologie Superieure, Montreal, PQ (Canada)

    2004-07-01

    The optimization of supervisory control strategies and local-loop controllers can improve the performance of HVAC (heating, ventilating, air-conditioning) systems. In this study, the component model of the fan, the damper and the cooling coil were developed and validated against monitored data of an existing variable air volume (VAV) system installed at Montreal's Ecole de Technologie Superieure. The measured variables that influence energy use in individual HVAC models included: (1) outdoor and return air temperature and relative humidity, (2) supply air and water temperatures, (3) zone airflow rates, (4) supply duct, outlet fan, mixing plenum static pressures, (5) fan speed, and (6) minimum and principal damper and cooling and heating coil valve positions. The additional variables that were considered, but not measured were: (1) fan and outdoor airflow rate, (2) inlet and outlet cooling coil relative humidity, and (3) liquid flow rate through the heating or cooling coils. The paper demonstrates the challenges of the validation process when monitored data of existing VAV systems are used. 7 refs., 11 figs.

  17. Internal validation of risk models in clustered data: a comparison of bootstrap schemes

    NARCIS (Netherlands)

    Bouwmeester, W.; Moons, K.G.M.; Kappen, T.H.; van Klei, W.A.; Twisk, J.W.R.; Eijkemans, M.J.C.; Vergouwe, Y.

    2013-01-01

    Internal validity of a risk model can be studied efficiently with bootstrapping to assess possible optimism in model performance. Assumptions of the regular bootstrap are violated when the development data are clustered. We compared alternative resampling schemes in clustered data for the estimation
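    A minimal sketch of the cluster bootstrap idea compared in studies like this: whole clusters are resampled with replacement rather than individual records, so within-cluster correlation is preserved in each bootstrap sample. The data, cluster sizes, and effects below are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic clustered data: 20 centres ("clusters"), 30 observations each,
# with a random centre-level effect creating within-cluster correlation
clusters = {c: rng.normal(loc=rng.normal(0, 1), scale=1, size=30) for c in range(20)}

def cluster_bootstrap_mean(clusters, n_boot=2000, rng=rng):
    """Resample whole clusters with replacement and recompute the pooled mean."""
    ids = list(clusters)
    stats = []
    for _ in range(n_boot):
        sample_ids = rng.choice(ids, size=len(ids), replace=True)
        pooled = np.concatenate([clusters[c] for c in sample_ids])
        stats.append(pooled.mean())
    return np.asarray(stats)

boot = cluster_bootstrap_mean(clusters)
print(f"bootstrap SE of the mean (cluster scheme): {boot.std(ddof=1):.3f}")
```

    Resampling individual observations instead would treat correlated records as independent and typically understate the standard error, which is the optimism problem the paper's comparison addresses.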

  18. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    Many statistical models in ecology follow the state space paradigm. For such models, the important step of model validation rarely receives as much attention as estimation or hypothesis testing, perhaps due to lack of available algorithms and software. Model validation is often based on a naive...... for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and whether observations...... useful directions in which the model could be improved....
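    One-step prediction residuals of a correctly specified state-space model should be independent and standard normal, which is the basis of this style of validation. Below is a minimal sketch for a linear-Gaussian random-walk model, where a Kalman filter gives the one-step predictions exactly (non-linear models would need the Laplace approximation the paper discusses); all parameter values are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a linear-Gaussian state-space model: random walk observed with noise
T, q, r = 500, 0.1, 0.5          # steps, state variance, observation variance
x = np.cumsum(rng.normal(0, np.sqrt(q), T))
y = x + rng.normal(0, np.sqrt(r), T)

# Kalman filter producing standardized one-step-ahead prediction residuals
mu, P = 0.0, 10.0                # vague initial state
resid = []
for t in range(T):
    P_pred = P + q               # predict (random-walk state transition)
    S = P_pred + r               # innovation variance
    resid.append((y[t] - mu) / np.sqrt(S))   # standardized one-step residual
    K = P_pred / S               # Kalman gain and measurement update
    mu = mu + K * (y[t] - mu)
    P = (1 - K) * P_pred

resid = np.asarray(resid[10:])   # drop burn-in affected by the initial state
print(f"residual mean {resid.mean():.2f}, std {resid.std(ddof=1):.2f}")
```

    Departures of these residuals from i.i.d. N(0, 1), e.g. in a quantile-quantile plot or autocorrelation check, point to specific model deficiencies, which is what makes this diagnostic more informative than naive residuals.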

  19. Approaches to Validation of Models for Low Gravity Fluid Behavior

    Science.gov (United States)

    Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

    2005-01-01

    This paper details the authors' experiences with the validation of computer models that predict low-gravity fluid behavior. It reviews the literature on low-gravity fluid behavior as a starting point for developing a baseline set of test cases. It then examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues are that: most of the data are described by empirical correlations rather than fundamental relations; detailed measurements of the flow field have not been made; free-surface shapes are observed through thick plastic cylinders and are therefore subject to a great deal of optical distortion; and heat-transfer process time constants are on the order of minutes to days, while the zero-gravity time available has been only seconds.

  20. Are Model Transferability And Complexity Antithetical? Insights From Validation of a Variable-Complexity Empirical Snow Model in Space and Time

    Science.gov (United States)

    Lute, A. C.; Luce, Charles H.

    2017-11-01

    The related challenges of prediction in ungauged basins and prediction in ungauged climates point to the need for environmental models that are transferable across both space and time. Hydrologic modeling has historically focused on modeling one or only a few basins using highly parameterized conceptual or physically based models. However, model parameters and structures have been shown to change significantly when calibrated to new basins or time periods, suggesting that model complexity and model transferability may be antithetical. Empirical space-for-time models provide a framework within which to assess model transferability and any tradeoff with model complexity. Using 497 SNOTEL sites in the western U.S., we develop space-for-time models of April 1 SWE and snow residence time based on mean winter temperature and cumulative winter precipitation. The transferability of the models to new conditions (in both space and time) is assessed using non-random cross-validation tests, with consideration of the influence of model complexity on transferability. As others have noted, the algorithmic empirical models transfer best when minimal extrapolation in the input variables is required. Temporal split-sample validations use pseudoreplicated samples, resulting in the selection of overly complex models, which has implications for the design of hydrologic model validation tests. Finally, we show that low- to moderate-complexity models transfer most successfully to new conditions in space and time, providing empirical confirmation of the parsimony principle.
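    The non-random (extrapolating) versus random split-sample validation design can be sketched on synthetic station data. The linear SWE relation and every coefficient below are assumptions for illustration, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "stations": winter temperature (degC), precipitation (mm) -> April 1 SWE (mm)
n = 400
temp = rng.uniform(-10, 2, n)
precip = rng.uniform(200, 1500, n)
swe = 0.6 * precip - 25.0 * temp + rng.normal(0, 60, n)   # assumed linear relation

X = np.column_stack([np.ones(n), temp, precip])           # design matrix with intercept

def fit_predict(train, test):
    """Ordinary least squares fit on the training stations, prediction on the test stations."""
    beta, *_ = np.linalg.lstsq(X[train], swe[train], rcond=None)
    return X[test] @ beta

# Non-random split: hold out the warmest 20% of sites, forcing extrapolation in temperature
order = np.argsort(temp)
train, test = order[: int(0.8 * n)], order[int(0.8 * n):]
rmse_warm = np.sqrt(np.mean((swe[test] - fit_predict(train, test)) ** 2))

# Random split for comparison (interpolation within the sampled climate space)
perm = rng.permutation(n)
train_r, test_r = perm[: int(0.8 * n)], perm[int(0.8 * n):]
rmse_rand = np.sqrt(np.mean((swe[test_r] - fit_predict(train_r, test_r)) ** 2))

print(f"RMSE random split: {rmse_rand:.1f} mm, warm-edge split: {rmse_warm:.1f} mm")
```

    Holding out an entire climatic margin, rather than a random subset, is what distinguishes a transferability test from an ordinary goodness-of-fit check.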

  1. Development and Validation of Web-based Courseware for Junior Secondary School Basic Technology Students in Nigeria

    OpenAIRE

    Anunobi, Anunobi; Njedeka, Vivian; Gambari, Gambari; Isiaka, Amosa; Abdullahi, Abdullahi; Bashiru, Mohammed; Alabi, Alabi; Omotayo, Thomas

    2018-01-01

    This research aimed to develop and validate a web-based courseware for junior secondary school basic technology students in Nigeria. In this study, a mixed method quantitative pilot study design with qualitative components was used to test and ascertain the ease of development and validation of the web-based courseware. Dick and Carey instructional system design model was adopted for developing the courseware. Convenience sampling technique was used in selecting the three content, computer an...


  3. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required so that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real time using the coupled code system THOR/S3R. This code system models all fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced three-dimensional nodal method together with cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation basis, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code; such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  4. Development and Validation of Computational Fluid Dynamics Models for Prediction of Heat Transfer and Thermal Microenvironments of Corals

    Science.gov (United States)

    Ong, Robert H.; King, Andrew J. C.; Mullins, Benjamin J.; Cooper, Timothy F.; Caley, M. Julian

    2012-01-01

    We present Computational Fluid Dynamics (CFD) models of the coupled dynamics of water flow, heat transfer and irradiance in and around corals to predict the temperatures experienced by corals. These models were validated against controlled laboratory experiments, under constant and transient irradiance, for hemispherical and branching corals. Our CFD models agree very well with the experimental studies. A linear relationship between irradiance and coral surface warming was evident in both the simulation and the experimental results, agreeing with heat transfer theory. However, the CFD models for the steady-state simulation produced a better fit to the linear relationship than the experimental data, likely due to error in the empirical measurements. The consistency of our modelling results with experimental observations demonstrates the applicability of CFD simulations, such as the models developed here, to coral bleaching studies. A study of the influence of coral skeletal porosity and skeletal bulk density on surface warming was also undertaken, demonstrating boundary layer behaviour, and interstitial flow magnitude and temperature profiles in coral cross-sections. Our models complement recent studies showing systematic changes in these parameters in some coral colonies and have utility in the prediction of coral bleaching. PMID:22701582

  5. Reactor core modeling practice: Operational requirements, model characteristics, and model validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1997-01-01

    The physical models implemented in power plant simulators have greatly increased in performance and complexity in recent years. This process has been enabled by the ever increasing computing power available at affordable prices. This paper describes this process from several angles: first, the operational requirements that are most critical from the point of view of model performance, both for normal and off-normal operating conditions; second, core model characteristics in the light of the solutions implemented by Thomson Training and Simulation (TT and S) in several full-scope simulators recently built and delivered for Dutch, German, and French nuclear power plants; finally, the model validation procedures, which are of course an integral part of model development and which are becoming more and more severe as performance expectations increase. In conclusion, it may be asserted that in the core modeling field, as in other areas, the general improvement in the quality of simulation codes has resulted in fairly rapid convergence towards mainstream engineering-grade calculations. This is a remarkable performance in view of the stringent real-time requirements which the simulation codes must satisfy, as well as the extremely wide range of operating conditions that they are called upon to cover with good accuracy. (author)

  6. Sample size calculation to externally validate scoring systems based on logistic regression models.

    Directory of Open Access Journals (Sweden)

    Antonio Palazón-Bru

    Full Text Available A sample size containing at least 100 events and 100 non-events has been suggested for validating a predictive model, regardless of the model being validated, even though certain factors (discrimination, parameterization and incidence) can influence calibration of the predictive model. Scoring systems based on binary logistic regression models are a specific type of predictive model. The aim of this study was to develop an algorithm to determine the sample size for validating a scoring system based on a binary logistic regression model and to apply it to a case study. The algorithm was based on bootstrap samples in which the area under the ROC curve, the observed event probabilities through smooth curves, and a measure of the lack of calibration (the estimated calibration index) were calculated. To illustrate its use for interested researchers, the algorithm was applied to a scoring system, based on a binary logistic regression model, for determining mortality in intensive care units. In the case study provided, the algorithm obtained a sample size with 69 events, which is lower than the value suggested in the literature. An algorithm is provided for finding the appropriate sample size to validate scoring systems based on binary logistic regression models. This could be applied to determine the sample size in other similar cases.
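
    The bootstrap procedure described above can be sketched as follows. This is a minimal illustration on synthetic data: the spread of the bootstrap AUC is used as a stand-in for the paper's estimated calibration index, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

def auc(scores, labels):
    # Mann-Whitney formulation of the area under the ROC curve
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    return (pos[:, None] > neg[None, :]).mean() + 0.5 * (pos[:, None] == neg[None, :]).mean()

def bootstrap_validation(probs, labels, n_events, n_boot=200):
    """Draw bootstrap samples containing `n_events` events (and as many
    non-events) and report the mean and spread of the AUC across samples."""
    idx_pos = np.flatnonzero(labels == 1)
    idx_neg = np.flatnonzero(labels == 0)
    aucs = []
    for _ in range(n_boot):
        i = np.concatenate([rng.choice(idx_pos, n_events),
                            rng.choice(idx_neg, n_events)])
        aucs.append(auc(probs[i], labels[i]))
    return float(np.mean(aucs)), float(np.std(aucs))

# Hypothetical scoring-system output: higher score means higher event risk
n = 2000
x = rng.normal(size=n)
probs = 1 / (1 + np.exp(-x))
labels = (rng.random(n) < probs).astype(int)

mean_auc, sd_auc = bootstrap_validation(probs, labels, n_events=69)
```

    A sample size is then judged adequate when this spread (or the calibration index in the paper's version) stabilizes below a chosen tolerance.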

  7. A comprehensive model for piezoceramic actuators: modelling, validation and application

    International Nuclear Information System (INIS)

    Quant, Mario; Elizalde, Hugo; Flores, Abiud; Ramírez, Ricardo; Orta, Pedro; Song, Gangbing

    2009-01-01

    This paper presents a comprehensive model for piezoceramic actuators (PAs), which accounts for hysteresis, non-linear electric field and dynamic effects. The hysteresis model is based on the widely used general Maxwell slip model, while an enhanced electro-mechanical non-linear model replaces the linear constitutive equations commonly used. Furthermore, a linear second-order model compensates for the frequency response of the actuator. Each individual model is fully characterized from experimental data yielded by a specific PA, then incorporated into a comprehensive 'direct' model able to determine the output strain based on the applied input voltage, fully compensating for the aforementioned effects, where the term 'direct' represents an electrical-to-mechanical operating path. The 'direct' model was implemented in a Matlab/Simulink environment and successfully validated against experimental results, exhibiting higher accuracy and simplicity than many published models. This simplicity would allow a straightforward inclusion of other behaviour such as creep, ageing, material non-linearity, etc, if such parameters are important for a particular application. Based on the same formulation, two other models are also presented: the first is an 'alternate' model intended to operate within a force-controlled scheme (instead of displacement/position control), thus able to capture the complex mechanical interactions occurring between a PA and its host structure. The second development is an 'inverse' model, able to operate within an open-loop control scheme, that is, yielding a 'linearized' PA behaviour. The performance of the developed models is demonstrated via a numerical sample case simulated in Matlab/Simulink, consisting of a PA coupled to a simple mechanical system, aimed at shifting the natural frequency of the latter.
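
    The general Maxwell slip hysteresis model mentioned above lends itself to a compact sketch: a parallel bank of elasto-slide elements, each a spring that slides once its breakaway force is reached. The stiffness and breakaway values below are illustrative, not parameters identified from the paper's actuator.

```python
import numpy as np

class MaxwellSlip:
    """Minimal Maxwell slip hysteresis operator: a parallel bank of
    elasto-slide elements (spring stiffness k_i, breakaway force f_i).
    Parameter values here are illustrative only."""
    def __init__(self, stiffness, breakaway):
        self.k = np.asarray(stiffness, dtype=float)
        self.f = np.asarray(breakaway, dtype=float)
        self.z = np.zeros_like(self.k)   # spring deflections (internal state)

    def step(self, dx):
        self.z += dx                     # elastic update with the input increment
        limit = self.f / self.k          # each element slides once breakaway is reached
        self.z = np.clip(self.z, -limit, limit)
        return float(np.sum(self.k * self.z))

model = MaxwellSlip(stiffness=[1.0, 0.5, 0.25], breakaway=[0.2, 0.3, 0.5])

# Drive with a triangular input and record the hysteretic output force
xs = np.concatenate([np.linspace(0.0, 1.0, 50), np.linspace(1.0, -1.0, 100)])
forces = [model.step(dx) for dx in np.diff(xs, prepend=0.0)]
```

    Plotting `forces` against `xs` would show the characteristic rate-independent hysteresis loop; in the paper this operator sits upstream of the non-linear constitutive and second-order dynamic stages.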

  8. Development and validation of a premature ejaculation diagnostic tool.

    Science.gov (United States)

    Symonds, Tara; Perelman, Michael A; Althof, Stanley; Giuliano, François; Martin, Mona; May, Kathryn; Abraham, Lucy; Crossland, Anna; Morris, Mark

    2007-08-01

    Diagnosis of premature ejaculation (PE) for clinical trial purposes has typically relied on intravaginal ejaculation latency time (IELT) for entry, but this parameter does not capture the multidimensional nature of PE. Therefore, the aim was to develop a brief, multidimensional, psychometrically validated instrument for diagnosing PE status. The questionnaire development involved three stages: (1) Five focus groups and six individual interviews were conducted to develop the content; (2) psychometric validation using three different groups of men; and (3) generation of a scoring system. For psychometric validation/scoring system development, data was collected from (1) men with PE based on clinician diagnosis, using DSM-IV-TR, who also had IELTs or =11 PE. The development and validation of this new PE diagnostic tool has resulted in a new, user-friendly, and brief self-report questionnaire for use in clinical trials to diagnose PE.

  9. Development and validation of a new turbocharger simulation methodology for marine two stroke diesel engine modelling and diagnostic applications

    International Nuclear Information System (INIS)

    Sakellaridis, Nikolaos F.; Raptotasios, Spyridon I.; Antonopoulos, Antonis K.; Mavropoulos, Georgios C.; Hountalas, Dimitrios T.

    2015-01-01

    Engine cycle simulation models are increasingly used in diesel engine simulation and diagnostic applications, reducing experimental effort. Turbocharger simulation plays an important role in the model's ability to accurately predict engine performance and emissions. The present work describes the development of a complete engine simulation model for marine diesel engines based on a new methodology for turbocharger modelling utilizing physically based meanline models for the compressor and turbine. Simulation accuracy is evaluated against engine bench measurements. The methodology was developed to overcome the limited availability of experimental maps for the compressor and turbine, often encountered in large marine diesel engine simulation and diagnostic studies. Data from the engine bench are used to calibrate the models, as well as to estimate turbocharger shaft mechanical efficiency. The closed cycle and gas exchange are modelled using an existing multizone thermodynamic model. The proposed methodology is applied to a 2-stroke marine diesel engine and its evaluation is based on the comparison of predictions against measured engine data. The model's ability to predict engine response to load variation is demonstrated for both turbocharger performance and closed cycle parameters, as well as NOx emission trends, making it an effective tool for both engine diagnostic and optimization studies. - Highlights: • Marine two-stroke diesel engine simulation model. • Turbine and compressor simulation using physical meanline models. • Methodology to derive T/C component efficiency and T/C shaft mechanical efficiency. • Extensive validation of predictions against experimental data.

  10. Five-Factor Model personality disorder prototypes: a review of their development, validity, and comparison to alternative approaches.

    Science.gov (United States)

    Miller, Joshua D

    2012-12-01

    In this article, the development of Five-Factor Model (FFM) personality disorder (PD) prototypes for the assessment of DSM-IV PDs is reviewed, as well as subsequent procedures for scoring individuals' FFM data against these PD prototypes, including similarity scores and simple additive counts based on a quantitative prototype-matching methodology. Both techniques, which yield very strongly correlated scores, demonstrate convergent and discriminant validity and provide clinically useful information on various forms of functioning. The techniques described here for use with FFM data are quite different from the prototype-matching methods used elsewhere. © 2012 The Author. Journal of Personality © 2012, Wiley Periodicals, Inc.

  11. Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery

    Science.gov (United States)

    Kwoh, L. K.; Huang, X.; Tan, W. J.

    2012-07-01

    XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands at 12 m resolution: 0.52-0.60 µm (Green), 0.63-0.69 µm (Red) and 0.76-0.89 µm (NIR). In the design of the IRIS camera, the three bands are acquired by three lines of CCDs (NIR, Red and Green). These CCDs are physically separated in the focal plane and their first pixels are not absolutely aligned. The micro-satellite platform is also not stable enough to allow co-registration of the 3 bands with a simple linear transformation. In the camera model developed, this platform instability was compensated with 3rd- to 4th-order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, and the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs Red and Green vs Red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixel for the NIR vs Red and Green vs Red CCDs, respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
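
    The attitude-compensation idea above, fitting low-order polynomials to the platform's roll, pitch and yaw over an acquisition, can be illustrated with a simple least-squares fit. The jitter profile below is synthetic; XSAT's actual attitude data are not given in this record.

```python
import numpy as np

# Synthetic roll-angle jitter over one image acquisition, expressed as a
# cubic in normalized line time (illustrative coefficients, in degrees)
t = np.linspace(0.0, 1.0, 200)
roll = 0.01 + 0.05 * t - 0.02 * t**2 + 0.004 * t**3

# 3rd-order polynomial fit, as used in the camera model for each attitude angle
coeffs = np.polyfit(t, roll, deg=3)
residual = roll - np.polyval(coeffs, t)
```

    In the real camera model one such polynomial per attitude angle (up to 4th order) is estimated together with the band-to-band geometry during calibration.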

  12. Development and validation of a predictive model for radiofrequency radiation emission in the vicinity of FM stations in Ghana

    International Nuclear Information System (INIS)

    Ahenkora-Duodu, Kingsley

    2016-07-01

    The rapidly growing number of FM stations, with their corresponding antennas, has led to increasing concern about the potential health risks that may arise from exposure to RF radiation. The main objective of this research was to develop and validate a predictive model against real-time measured data for FM antennas in Ghana. Theoretical and experimental assessments of radiofrequency emission due to FM antennas were carried out. The maximum and minimum electric field spatial averages recorded were 7.17E-01 ± 6.97E-01 V/m at Kasapa FM and 6.39E-02 ± 5.39E-02 V/m at Asempa FM, respectively. Over the transmission frequency range of 88-108 MHz, the average power density of the real-time measured data ranged between 3.92E-05 W/m² and 1.37E-03 W/m², while that of the FM model varied from 9.72E-03 W/m² to 5.35E-01 W/m². The results showed a variation between measured power density levels and the FM model: the FM model overestimates the power density levels compared with the measured data. The impact predictions were based on the maximum values estimated by the FM model; hence these results validate the credibility of the impact analysis for the FM stations. The general public exposure quotient ranged between 9.00E-03 and 2.68E-01, whilst the occupational exposure quotient varied from 9.72E-04 to 5.35E-02. The results were found to be in compliance with the International Commission on Non-Ionizing Radiation Protection (ICNIRP) RF exposure limits. (au)
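
    A minimal sketch of the kind of far-field prediction such an FM model relies on, using the free-space relation S = PG/(4πd²) and an exposure quotient against the ICNIRP public reference level. The transmitter power, gain and distance below are hypothetical values, not measurements from the study.

```python
import math

def power_density(p_watts, gain_linear, distance_m):
    """Far-field power density S = P*G / (4*pi*d^2), the usual free-space
    prediction for a broadcast antenna at distance d."""
    return p_watts * gain_linear / (4.0 * math.pi * distance_m**2)

def exposure_quotient(s, s_limit=2.0):
    """Ratio against the ICNIRP (1998) general-public reference level,
    which is 2 W/m^2 for the 10-400 MHz range covering the FM band."""
    return s / s_limit

# Hypothetical FM transmitter: 2.5 kW into an antenna with linear gain 10,
# evaluated 120 m from the mast
s = power_density(p_watts=2500.0, gain_linear=10.0, distance_m=120.0)
q = exposure_quotient(s)
```

    A quotient below 1 indicates compliance; summing quotients over co-located transmitters gives the multi-source check regulators typically apply.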

  13. Development and validation of a new LBM-MRT hybrid model with enthalpy formulation for melting with natural convection

    Energy Technology Data Exchange (ETDEWEB)

    Miranda Fuentes, Johann [Université de Lyon, CNRS, UMR5008, F-69622 Villeurbanne (France); INSA-Lyon, CETHIL, F-69621 Villeurbanne (France); Kuznik, Frédéric, E-mail: frederic.kuznik@insa-lyon.fr [Université de Lyon, CNRS, UMR5008, F-69622 Villeurbanne (France); INSA-Lyon, CETHIL, F-69621 Villeurbanne (France); Johannes, Kévyn; Virgone, Joseph [Université de Lyon, CNRS, UMR5008, F-69622 Villeurbanne (France); Université Lyon 1, CETHIL, F-69622 Villeurbanne (France)

    2014-01-17

    This article presents a new model to simulate melting with natural convection of a phase change material. For the phase change problem, the enthalpy formulation is used. The energy equation is solved by a finite difference method, whereas the fluid flow is solved by the multiple-relaxation-time (MRT) lattice Boltzmann method. The model is first verified and validated using data from the literature. Then the model is applied to a tall brick filled with a fatty acid eutectic mixture and the results are presented. The main results are that (1) the spatial convergence rate is of second order, (2) the new model is validated against data from the literature and (3) natural convection plays an important role in the melting process of the fatty acid mixture considered in our work.

  14. DTU PMU Laboratory Development - Testing and Validation

    OpenAIRE

    Garcia-Valle, Rodrigo; Yang, Guang-Ya; Martin, Kenneth E.; Nielsen, Arne Hejde; Østergaard, Jacob

    2010-01-01

    This is a report of the results of phasor measurement unit (PMU) laboratory development and testing done at the Centre for Electric Technology (CET), Technical University of Denmark (DTU). Analysis of the PMU performance first required the development of tools to convert the DTU PMU data into IEEE standard, and the validation is done for the DTU-PMU via a validated commercial PMU. The commercial PMU has been tested from the authors' previous efforts, where the response can be expected to foll...

  15. Innovation, Product Development, and New Business Models in Networks: How to come from case studies to a valid and operational theory

    DEFF Research Database (Denmark)

    Rasmussen, Erik Stavnsager; Jørgensen, Jacob Høj; Goduscheit, René Chester

    2007-01-01

    We have in the research project NEWGIBM (New Global ICT based Business Models) during 2005 and 2006 closely cooperated with a group of firms. The focus in the project has been the development of new business models (and innovation) in close cooperation with multiple partners. These partners have been...... customers, suppliers, R&D partners, and others. The methodological problem is thus how to come from, e.g., one in-depth case study to a more formalized theory or model of how firms can develop new projects and be innovative in a network. The paper is structured so that it starts with a short presentation...... of the two key concepts in our research setting and theoretical models: innovation and networks. It is not our intention in this paper to present a lengthy discussion of the two concepts, but a short presentation is necessary to understand the validity and interpretation discussion later in the paper. Next...

  16. An inverse radiation model for optical determination of temperature and species concentration: Development and validation

    DEFF Research Database (Denmark)

    Ren, Tao; Modest, Michael F.; Fateev, Alexander

    2015-01-01

    2010 (Rothman et al. (2010) [1]), which contains line-by-line (LBL) information for several combustion gas species, such as CO2 and H2O, was used to predict gas spectral transmissivities. The model was validated by retrieving temperatures and species concentrations from experimental CO2 and H2O...

  17. Validation of the DeLone and McLean Information Systems Success Model.

    Science.gov (United States)

    Ojo, Adebowale I

    2017-01-01

    This study is an adaptation of the widely used DeLone and McLean information system success model in the context of hospital information systems in a developing country. A survey research design was adopted in the study. A structured questionnaire was used to collect data from 442 health information management personnel in five Nigerian teaching hospitals. A structural equation modeling technique was used to validate the model's constructs. It was revealed that system quality significantly influenced use (β = 0.53, p Information quality significantly influenced use (β = 0.24, p 0.05), but it significantly influenced perceived net benefits (β = 0.21, p 0.05). The study validates the DeLone and McLean information system success model in the context of a hospital information system in a developing country. Importantly, system quality and use were found to be important measures of hospital information system success. It is, therefore, imperative that hospital information systems are designed in such ways that are easy to use, flexible, and functional to serve their purpose.

  18. Preventing patient absenteeism: validation of a predictive overbooking model.

    Science.gov (United States)

    Reid, Mark W; Cohen, Samuel; Wang, Hank; Kaung, Aung; Patel, Anish; Tashjian, Vartan; Williams, Demetrius L; Martinez, Bibiana; Spiegel, Brennan M R

    2015-12-01

    To develop a model that identifies patients at high risk for missing scheduled appointments ("no-shows" and cancellations) and to project the impact of predictive overbooking in a gastrointestinal endoscopy clinic-an exemplar resource-intensive environment with a high no-show rate. We retrospectively developed an algorithm that uses electronic health record (EHR) data to identify patients who do not show up to their appointments. Next, we prospectively validated the algorithm at a Veterans Administration healthcare network clinic. We constructed a multivariable logistic regression model that assigned a no-show risk score optimized by receiver operating characteristic curve analysis. Based on these scores, we created a calendar of projected open slots to offer to patients and compared the daily performance of predictive overbooking with fixed overbooking and typical "1 patient, 1 slot" scheduling. Data from 1392 patients identified several predictors of no-show, including previous absenteeism, comorbid disease burden, and current diagnoses of mood and substance use disorders. The model correctly classified most patients during the development (area under the curve [AUC] = 0.80) and validation phases (AUC = 0.75). Prospective testing in 1197 patients found that predictive overbooking averaged 0.51 unused appointments per day versus 6.18 for typical booking (difference = -5.67; 95% CI, -6.48 to -4.87; P < .0001). Predictive overbooking could have increased service utilization from 62% to 97% of capacity, with only rare clinic overflows. Information from EHRs can accurately predict whether patients will no-show. This method can be used to overbook appointments, thereby maximizing service utilization while staying within clinic capacity.
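
    The overall approach, a logistic no-show risk score evaluated by AUC and then used to decide how many extra appointments to book, can be sketched on synthetic data. The predictors and coefficients below are illustrative stand-ins, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical EHR-derived predictors: prior no-show count, comorbidity
# burden, and a mood/substance-use diagnosis flag (coefficients invented)
n = 5000
prior_noshows = rng.poisson(1.0, n)
comorbidity = rng.normal(0.0, 1.0, n)
mood_dx = rng.random(n) < 0.3

logit = -2.0 + 0.6 * prior_noshows + 0.3 * comorbidity + 0.8 * mood_dx
p_noshow = 1 / (1 + np.exp(-logit))
missed = rng.random(n) < p_noshow          # simulated attendance outcomes

def auc(scores, labels):
    # Mann-Whitney formulation of the area under the ROC curve
    pos, neg = scores[labels], scores[~labels]
    return (pos[:, None] > neg[None, :]).mean() + 0.5 * (pos[:, None] == neg[None, :]).mean()

model_auc = auc(p_noshow, missed)

# Predictive overbooking: open as many extra slots as no-shows expected
# among a day's 30 booked patients
overbook_slots = int(round(p_noshow[:30].sum()))
```

    The paper's development/validation AUCs (0.80 and 0.75) correspond to `model_auc` here; the expected-no-show sum is the simplest version of converting risk scores into a calendar of projected open slots.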

  19. Development and validation of a stochastic model for potential growth of Listeria monocytogenes in naturally contaminated lightly preserved seafood.

    Science.gov (United States)

    Mejlholm, Ole; Bøknæs, Niels; Dalgaard, Paw

    2015-02-01

    A new stochastic model for the simultaneous growth of Listeria monocytogenes and lactic acid bacteria (LAB) was developed and validated on data from naturally contaminated samples of cold-smoked Greenland halibut (CSGH) and cold-smoked salmon (CSS). During industrial processing these samples were added acetic and/or lactic acids. The stochastic model was developed from an existing deterministic model including the effect of 12 environmental parameters and microbial interaction (O. Mejlholm and P. Dalgaard, Food Microbiology, submitted for publication). Observed maximum population density (MPD) values of L. monocytogenes in naturally contaminated samples of CSGH and CSS were accurately predicted by the stochastic model based on measured variability in product characteristics and storage conditions. Results comparable to those from the stochastic model were obtained, when product characteristics of the least and most preserved sample of CSGH and CSS were used as input for the existing deterministic model. For both modelling approaches, it was shown that lag time and the effect of microbial interaction needs to be included to accurately predict MPD values of L. monocytogenes. Addition of organic acids to CSGH and CSS was confirmed as a suitable mitigation strategy against the risk of growth by L. monocytogenes as both types of products were in compliance with the EU regulation on ready-to-eat foods. Copyright © 2014 Elsevier Ltd. All rights reserved.
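
    The simultaneous-growth idea, L. monocytogenes growth capped by the lactic acid bacteria through a Jameson-effect interaction term, can be sketched deterministically (the stochastic layer over product characteristics and the 12 environmental parameters are omitted). Rates and initial counts below are illustrative, not the fitted values of the cited model.

```python
def simulate_growth(mu_lm, mu_lab, n0_lm, n0_lab,
                    nmax_lm=8.0, nmax_lab=8.5, t_end=500.0, dt=0.5):
    """Logistic growth of L. monocytogenes (lm) and lactic acid bacteria
    (lab) in log10 CFU/g, with the Jameson-effect interaction: lm growth
    also stops once lab reach their own maximum population density."""
    t, lm, lab = 0.0, n0_lm, n0_lab
    while t < t_end:
        lab_brake = 1.0 - 10.0 ** (lab - nmax_lab)
        lab += mu_lab * lab_brake * dt
        # lm is limited by its own capacity AND by the dominant LAB flora
        lm += mu_lm * (1.0 - 10.0 ** (lm - nmax_lm)) * lab_brake * dt
        t += dt
    return lm, lab

# Illustrative rates (log10 units per hour) and initial counts
lm_final, lab_final = simulate_growth(mu_lm=0.02, mu_lab=0.05,
                                      n0_lm=1.0, n0_lab=2.0)
```

    Because the faster-growing LAB reach their maximum first, the interaction term freezes L. monocytogenes well below its own maximum, which is the mechanism behind the observed MPD values in the cold-smoked products.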

  20. Development and validation of a 3D-printed model of the ostiomeatal complex and frontal sinus for endoscopic sinus surgery training.

    Science.gov (United States)

    Alrasheed, Abdulaziz S; Nguyen, Lily H P; Mongeau, Luc; Funnell, W Robert J; Tewfik, Marc A

    2017-08-01

    Endoscopic sinus surgery poses unique training challenges due to complex and variable anatomy, and the risk of major complications. We sought to create and provide validity evidence for a novel 3D-printed simulator of the nose and paranasal sinuses. Sinonasal computed tomography (CT) images of a patient were imported into 3D visualization software. Segmentation of bony and soft tissue structures was then performed. The model was printed using simulated bone and soft tissue materials. Rhinologists and otolaryngology residents completed 6 prespecified tasks including maxillary antrostomy and frontal recess dissection on the simulator. Participants evaluated the model using survey ratings based on a 5-point Likert scale. The average time to complete each task was calculated. Descriptive analysis was used to evaluate ratings, and thematic analysis was done for qualitative questions. A total of 20 participants (10 rhinologists and 10 otolaryngology residents) tested the model and answered the survey. Overall the participants felt that the simulator would be useful as a training/educational tool (4.6/5), and that it should be integrated as part of the rhinology training curriculum (4.5/5). The following responses were obtained: visual appearance 4.25/5; realism of materials 3.8/5; and surgical experience 3.9/5. The average time to complete each task was lower for the rhinologist group than for the residents. We describe the development and validation of a novel 3D-printed model for the training of endoscopic sinus surgery skills. Although participants found the simulator to be a useful training and educational tool, further model development could improve the outcome. © 2017 ARS-AAOA, LLC.

  1. Development and Validation of 3D-CFD Injection and Combustion Models for Dual Fuel Combustion in Diesel Ignited Large Gas Engines

    Directory of Open Access Journals (Sweden)

    Lucas Eder

    2018-03-01

    Full Text Available This paper focuses on improving the 3D-Computational Fluid Dynamics (CFD modeling of diesel ignited gas engines, with an emphasis on injection and combustion modeling. The challenges of modeling are stated and possible solutions are provided. A specific approach for modeling injection is proposed that improves the modeling of the ballistic region of the needle lift. Experimental results from an inert spray chamber are used for model validation. Two-stage ignition methods are described along with improvements in ignition delay modeling of the diesel ignited gas engine. The improved models are used in the Extended Coherent Flame Model with the 3 Zones approach (ECFM-3Z. The predictive capability of the models is investigated using data from single cylinder engine (SCE tests conducted at the Large Engines Competence Center (LEC. The results are discussed and further steps for development are identified.

  2. Towards practical application of sensors for monitoring animal health; design and validation of a model to detect ketosis.

    Science.gov (United States)

    Steensels, Machteld; Maltz, Ephraim; Bahr, Claudia; Berckmans, Daniel; Antler, Aharon; Halachmi, Ilan

    2017-05-01

    The objective of this study was to design and validate a mathematical model to detect post-calving ketosis. The validation was conducted in four commercial dairy farms in Israel, on a total of 706 multiparous Holstein dairy cows: 203 cows clinically diagnosed with ketosis and 503 healthy cows. A logistic binary regression model was developed, where the dependent variable is categorical (healthy/diseased) and a set of explanatory variables were measured with existing commercial sensors: rumination duration, activity and milk yield of each individual cow. In a first validation step (within-farm), the model was calibrated on the database of each farm separately. Two thirds of the sick cows and an equal number of healthy cows were randomly selected for model validation. The remaining one third of the cows, which did not participate in the model validation, were used for model calibration. In order to overcome the random selection effect, this procedure was repeated 100 times. In a second (between-farms) validation step, the model was calibrated on one farm and validated on another farm. Within-farm accuracy, ranging from 74 to 79%, was higher than between-farm accuracy, ranging from 49 to 72%, in all farms. The within-farm sensitivities ranged from 78 to 90%, and specificities ranged from 71 to 74%. The between-farms sensitivities ranged from 65 to 95%. The developed model can be improved in future research, by employing other variables that can be added; or by exploring other models to achieve greater sensitivity and specificity.
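
    The within-farm versus between-farms validation contrast can be illustrated with a toy classifier on synthetic sensor data. The feature values and the one-feature threshold rule below are stand-ins for the paper's logistic regression model.

```python
import numpy as np

rng = np.random.default_rng(7)

def make_farm(n, shift):
    """Hypothetical per-cow sensor features: rumination minutes/day,
    activity index, milk yield (kg/day). `shift` mimics between-farm
    differences in baseline levels."""
    healthy = rng.normal([480, 100, 35], [40, 15, 5], size=(n, 3)) + shift
    ketotic = rng.normal([400, 80, 28], [40, 15, 5], size=(n // 3, 3)) + shift
    X = np.vstack([healthy, ketotic])
    y = np.array([0] * n + [1] * (n // 3))
    return X, y

def fit_threshold(X, y):
    # One-feature stand-in for the logistic model: flag ketosis when
    # rumination falls below the midpoint of the class means
    return (X[y == 0, 0].mean() + X[y == 1, 0].mean()) / 2

def evaluate(X, y, thr):
    pred = X[:, 0] < thr
    sensitivity = pred[y == 1].mean()
    specificity = (~pred)[y == 0].mean()
    return sensitivity, specificity

Xa, ya = make_farm(300, shift=0.0)
Xb, yb = make_farm(300, shift=np.array([40.0, 0.0, 0.0]))  # farm B baseline differs

thr_a = fit_threshold(Xa, ya)
within = evaluate(Xa, ya, thr_a)    # calibrate and validate on farm A
between = evaluate(Xb, yb, thr_a)   # calibrate on A, validate on B
```

    The shifted baseline on farm B degrades sensitivity for a model calibrated on farm A, mirroring the paper's finding that between-farm accuracy (49-72%) trails within-farm accuracy (74-79%).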

  3. Supplementary investigations on the validation of the atmospheric radionuclide transport model (ARTM)

    International Nuclear Information System (INIS)

    Richter, Cornelia; Thielen, Harald; Sogalla, Martin

    2015-09-01

    In the medium term, the Gaussian plume model used so far for atmospheric dispersion calculations in the General Administrative Provision (AVV) relating to Section 47 of the Radiation Protection Ordinance (StrlSchV), as well as in the Incident Calculation Bases (SBG) relating to Section 49 StrlSchV, is to be replaced by a Lagrangian particle model. Meanwhile the Atmospheric Radionuclide Transport Model (ARTM) is available, which allows the simulation of the atmospheric dispersion of operational releases from nuclear installations. ARTM is based on the program package AUSTAL2000, which is designed for the simulation of atmospheric dispersion of non-radioactive operational releases from industrial plants and was adapted to airborne radioactive releases. The research project 3612S50007 serves, on the one hand, to validate ARTM systematically. On the other hand, developments in science and technology were investigated and, where reasonable and possible, implemented in the program system. The dispersion model and the user interface were advanced and optimized. The program package was provided to users as a free download. Notably, the work program comprises the validation of the approach used in ARTM to model short emission periods, which are of interest in view of the SBG. The simulation results of the diagnostic wind and turbulence model TALdia, which is part of the GO-ARTM program package, were evaluated with a focus on the influence of buildings on the flow field. The user interface was upgraded with a wind field viewer. To simplify comparison with the model still in use, a Gaussian plume model was implemented in the graphical user interface. The ARTM web page was maintained, and user questions and feedback were answered and analysed for possible improvements and further developments of the program package. Numerous improvements were implemented. An ARTM user workshop was hosted by the Federal Office for Radiation Protection.

  4. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden); Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)

    2001-12-01

    HYDRASTAR and referenced here. In addition, the system of tools to present and inspect input and output data is shortly described in this document. HYDRASTAR is developed to produce probabilistic input to other models in the SKB model chain PROPER, which has been used in performance assessment studies conducted by SKB. As for most model concepts there are advantages and limitations for HYDRASTAR. However, one conclusion is that HYDRASTAR is valid for its purpose, i.e. local-scale stochastic groundwater modelling and advective transport for performance assessments under freshwater conditions.

  5. Development and validation of the Patriarchal Beliefs Scale.

    Science.gov (United States)

    Yoon, Eunju; Adams, Kristen; Hogge, Ingrid; Bruner, John P; Surya, Shruti; Bryant, Fred B

    2015-04-01

    The purpose of this research was to develop and validate a conceptually and psychometrically solid measure for patriarchal beliefs in samples of U.S. American adults from diverse demographic and geographic backgrounds. In Study 1, we identified 3 correlated factors of the Patriarchal Beliefs Scale (PBS) in data collected from the Internet (N = 279): Institutional Power of Men, Inferiority of Women, and Gendered Domestic Roles. In Study 2, data collected from the Internet (N = 284) supported both an oblique 3-factor structure and a bifactor structure of the PBS, through confirmatory factor analyses. Construct validity of the PBS was supported in relation to other gender-related measures. The PBS was correlated in expected directions with modern sexism, antifeminist attitudes, and egalitarian attitudes toward women. In Study 3, we examined measurement invariance across gender by using combined data from Study 1 and Study 2. All 3 factors of the oblique 3-factor model indicated measurement invariance, whereas the general factor represented in the bifactor model indicated nonequivalence. Mean differences in patriarchal beliefs were found for such demographic variables as gender, sexual orientation, education, and social class. Recommendations for using the PBS, as well as implications for research and practice, are discussed. (c) 2015 APA, all rights reserved.

  6. Laboratory research program to aid in developing and testing the validity of conceptual models for flow and transport through unsaturated porous media

    International Nuclear Information System (INIS)

    Glass, R.J.

    1990-01-01

    As part of the Yucca Mountain Project, a laboratory research program is being developed at Sandia National Laboratories that will integrate fundamental physical experimentation with conceptual formulation and mathematical modeling and aid in subsequent model validation for unsaturated zone water and contaminant transport. Experimental systems are being developed to explore flow and transport processes and assumptions of fundamental importance to various conceptual models. Experimentation will run concurrently in two types of systems: fractured and nonfractured tuffaceous systems; and analogue systems having specific characteristics of the tuff systems but designed to maximize experimental control and resolution of data measurement. Questions to which experimentation currently is directed include infiltration flow instability, water and solute movement in unsaturated fractures, fracture-matrix interaction, and the definition of effective large-scale properties for heterogeneous, fractured media. 16 refs

  7. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is under development. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analysis of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, a verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  8. Development and validation of the Child Oral Health Impact Profile - Preschool version.

    Science.gov (United States)

    Ruff, R R; Sischo, L; Chinn, C H; Broder, H L

    2017-09-01

    The Child Oral Health Impact Profile (COHIP) is a validated instrument created to measure the oral health-related quality of life of school-aged children. The purpose of this study was to develop and validate a preschool version of the COHIP (COHIP-PS) for children aged 2-5. The COHIP-PS was developed and validated using a multi-stage process consisting of item selection, face validity testing, item impact testing, reliability and validity testing, and factor analysis. A cross-sectional convenience sample of caregivers having children 2-5 years old from four groups completed item clarity and impact forms. Groups were recruited from pediatric health clinics or preschools/daycare centers, speech clinics, dental clinics, or cleft/craniofacial centers. Participants had a variety of oral health-related conditions, including caries, congenital orofacial anomalies, and speech/language deficiencies such as articulation and language disorders. The COHIP-PS was found to have acceptable internal consistency (α = 0.71) and high test-retest reliability (0.87), though internal consistency was below the accepted threshold for the community sample. While discriminant validity results indicated significant differences across study groups, the overall magnitude of differences was modest. Results from confirmatory factor analyses support the use of a four-factor model consisting of 11 items across oral health, functional well-being, social-emotional well-being, and self-image domains. Quality of life is an integral factor in understanding and assessing children's well-being. The COHIP-PS is a validated oral health-related quality of life measure for preschool children with cleft or other oral conditions. Copyright © 2017 Dennis Barber Ltd.
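The internal-consistency statistic reported above (α = 0.71) is Cronbach's alpha. As a minimal illustration of how such a coefficient is computed, the following Python sketch implements it from scratch; the item scores are hypothetical, not the study's data.

```python
# Hedged sketch: Cronbach's alpha from per-item scores.
# The three "items" and five "respondents" below are invented for illustration.

def cronbach_alpha(items):
    """items: list of per-item score lists, all of equal length
    (one entry per respondent). Returns Cronbach's alpha."""
    k = len(items)                       # number of items
    n = len(items[0])                    # number of respondents

    def variance(xs):                    # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(variance(item) for item in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

if __name__ == "__main__":
    # Three hypothetical items scored by five respondents.
    items = [
        [2, 3, 4, 4, 5],
        [2, 2, 4, 5, 5],
        [1, 3, 3, 4, 5],
    ]
    print(round(cronbach_alpha(items), 3))
```

Values near 1 indicate that the items vary together; a value such as the 0.71 reported above is conventionally read as acceptable for a new instrument.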

  9. Validity of Basic Electronic 1 Module Integrated Character Value Based on Conceptual Change Teaching Model to Increase Students Physics Competency in STKIP PGRI West Sumatera

    Science.gov (United States)

    Hidayati, A.; Rahmi, A.; Yohandri; Ratnawulan

    2018-04-01

    The importance of teaching materials in accordance with the characteristics of students became the main reason for the development of basic electronics I module integrated character values based on conceptual change teaching model. The module development in this research follows the development procedure of Plomp which includes preliminary research, prototyping phase and assessment phase. In the first year of this research, the module is validated. Content validity is seen from the conformity of the module with the development theory in accordance with the demands of learning model characteristics. The validity of the construct is seen from the linkage and consistency of each module component developed with the characteristic of the integrated learning model of character values obtained through validator assessment. The average validation value assessed by the validator belongs to a very valid category. Based on the validator assessment then revised the basic electronics I module integrated character values based on conceptual change teaching model.

  10. Maturity Models Development in IS Research

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan; Vatrapu, Ravi; Andersen, Kim Normann

    2015-01-01

    Maturity models are widespread in IS research and in particular, IT practitioner communities. However, theoretically sound, methodologically rigorous and empirically validated maturity models are quite rare. This literature review paper focuses on the challenges faced during the development...... literature reveals that researchers have primarily focused on developing new maturity models pertaining to domain-specific problems and/or new enterprise technologies. We find rampant re-use of the design structure of widely adopted models such as Nolan’s Stage of Growth Model, Crosby’s Grid, and Capability...... Maturity Model (CMM). Only recently have there been some research efforts to standardize maturity model development. We also identify three dominant views of maturity models and provide guidelines for various approaches of constructing maturity models with a standard vocabulary. We finally propose using...

  11. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs., 30 refs
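The 'success-oriented' transformation the abstract describes can be illustrated on a toy fault tree. This Python sketch dualizes a failure-oriented tree via De Morgan's laws (AND and OR gates swap, basic events are complemented); the tree encoding and event names are illustrative assumptions, not taken from the Darlington study.

```python
# Hedged sketch: dualizing a failure-oriented fault tree into its
# success-oriented counterpart using De Morgan's laws.

def to_success_tree(node):
    """node is either a basic-event string or a tuple ('AND'|'OR', child, ...).
    Returns the dual tree whose top event is 'system succeeds'."""
    if isinstance(node, str):
        return "NOT " + node             # complement of a basic failure event
    gate, children = node[0], node[1:]
    dual_gate = "OR" if gate == "AND" else "AND"
    return (dual_gate,) + tuple(to_success_tree(c) for c in children)

if __name__ == "__main__":
    # Hypothetical top event: system fails if pump A fails,
    # OR if both valve B and valve C fail.
    failure_tree = ("OR", "pumpA_fails", ("AND", "valveB_fails", "valveC_fails"))
    print(to_success_tree(failure_tree))
    # Success requires pump A working AND (valve B or valve C working).
```

In the success-oriented form, each gate reads directly as a design requirement, which is the closer correspondence to plant specifications that the methodology aims for.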

  12. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and its adoption is still in its infancy. Among the reasons is that there are few case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecology studies, and thereby further validates AOM in a qualitative manner.

  13. SU-E-J-244: Development and Validation of a Knowledge Based Planning Model for External Beam Radiation Therapy of Locally Advanced Non-Small Cell Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Z; Kennedy, A [Sarah Cannon, Nashville, TN (United States); Larsen, E; Hayes, C; Grow, A [North Florida Cancer Center, Gainesville, FL (United States); Bahamondes, S.; Zheng, Y; Wu, X [JFK Comprehensive Cancer Institute, Lake Worth, FL (United States); Choi, M; Pai, S [Good Samaritan Hospital, Los Gatos, CA (United States); Li, J [Doctors Hospital of Augusta, Augusta, GA (United States); Cranford, K [Trident Medical Center, Charleston, SC (United States)

    2015-06-15

    Purpose: The study aims to develop and validate a knowledge based planning (KBP) model for external beam radiation therapy of locally advanced non-small cell lung cancer (LA-NSCLC). Methods: RapidPlan™ technology was used to develop a lung KBP model. Plans from 65 patients with LA-NSCLC were used to train the model: 25 patients were treated with VMAT and the other 40 with IMRT. Organs-at-risk (OARs) included the right lung, left lung, heart, esophagus, and spinal cord. DVHs and geometric distribution DVHs were extracted from the treated plans. The model was trained using principal component analysis and step-wise multiple regression. Box plot and regression plot tools were used to identify geometric and dosimetric outliers and help fine-tune the model. The validation was performed by (a) comparing predicted DVH boundaries to the actual DVHs of 63 patients and (b) using an independent set of treatment planning data. Results: 63 of the 65 plans were included in the final KBP model, with PTV volumes ranging from 102.5cc to 1450.2cc. Total treatment dose prescription varied from 50Gy to 70Gy based on institutional guidelines. One patient was excluded as a geometric outlier, because 2.18cc of spinal cord was included in the PTV; the other was excluded as a dosimetric outlier, because dose sparing to the spinal cord was heavily enforced in the clinical plan. Target volume, OAR volume, OAR overlap volume percentage to target, and OAR out-of-field volume were included in the trained model. The lungs and heart had two principal component scores of GEDVH, whereas the spinal cord and esophagus had three in the final model. The predicted DVH band (mean ±1 standard deviation) covered 66.2±3.6% of all DVHs. Conclusion: A KBP model was developed and validated for radiotherapy of LA-NSCLC in a commercial treatment planning system. Its clinical implementation may improve the consistency of IMRT/VMAT planning.
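The principal-component step of such DVH-based training can be sketched in miniature. The following stdlib-only Python code extracts the leading principal component of a few toy "DVH" vectors via power iteration on their covariance matrix; the data, dimensions, and method details are assumptions for illustration, not clinical DVHs or the RapidPlan™ algorithm.

```python
# Hedged sketch: first principal component of DVH-like curves
# via power iteration on the sample covariance matrix.

def mean_center(rows):
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    return [[r[j] - means[j] for j in range(d)] for r in rows]

def first_pc(rows, iters=200):
    """Leading eigenvector (unit norm) of the sample covariance of `rows`."""
    x = mean_center(rows)
    d = len(x[0])
    # sample covariance matrix c[j][k]
    c = [[sum(r[j] * r[k] for r in x) / (len(x) - 1) for k in range(d)]
         for j in range(d)]
    v = [1.0] * d
    for _ in range(iters):                      # power iteration
        w = [sum(c[j][k] * v[k] for k in range(d)) for j in range(d)]
        norm = sum(t * t for t in w) ** 0.5
        v = [t / norm for t in w]
    return v

if __name__ == "__main__":
    # Four toy "DVH" curves sampled at three dose points; they vary mostly
    # along one direction, which the first principal component captures.
    dvhs = [[1.0, 0.8, 0.2], [1.0, 0.6, 0.1], [1.0, 0.9, 0.3], [1.0, 0.5, 0.1]]
    print([round(t, 2) for t in first_pc(dvhs)])
```

In a KBP workflow, a handful of such component scores per OAR (two or three in the abstract above) summarize each DVH for the regression step.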

  14. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    Science.gov (United States)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  15. Development and validation of an instrument to assess future orientation and resilience in adolescence.

    Science.gov (United States)

    Di Maggio, Ilaria; Ginevra, Maria Cristina; Nota, Laura; Soresi, Salvatore

    2016-08-01

    The study is aimed at the development and initial validation of the Design My Future (DMF) instrument, which may be administered in career counseling and research activities to assess adolescents' future orientation and resilience. Two studies with two independent samples of Italian adolescents were conducted to examine the psychometric properties of the DMF. Specifically, in the first study, after developing the items and examining content validity, the factorial structure, reliability and discriminant validity of the DMF were tested. In the second study, measurement invariance across gender was evaluated by conducting a sequence of nested CFA models. Results showed good psychometric support for the instrument with Italian adolescents. Copyright © 2016 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  16. Refining and validating a conceptual model of Clinical Nurse Leader integrated care delivery.

    Science.gov (United States)

    Bender, Miriam; Williams, Marjory; Su, Wei; Hites, Lisle

    2017-02-01

    To empirically validate a conceptual model of Clinical Nurse Leader integrated care delivery. There is limited evidence of frontline care delivery models that consistently achieve quality patient outcomes. Clinical Nurse Leader integrated care delivery is a promising nursing model with a growing record of success. However, theoretical clarity is necessary to generate causal evidence of effectiveness. Sequential mixed methods. A preliminary Clinical Nurse Leader practice model was refined and survey items developed to correspond with model domains, using focus groups and a Delphi process with a multi-professional expert panel. The survey was administered in 2015 to clinicians and administrators involved in Clinical Nurse Leader initiatives. Confirmatory factor analysis and structural equation modelling were used to validate the measurement and model structure. Final sample n = 518. The model incorporates 13 components organized into five conceptual domains: 'Readiness for Clinical Nurse Leader integrated care delivery'; 'Structuring Clinical Nurse Leader integrated care delivery'; 'Clinical Nurse Leader Practice: Continuous Clinical Leadership'; 'Outcomes of Clinical Nurse Leader integrated care delivery'; and 'Value'. Sample data had good fit with specified model and two-level measurement structure. All hypothesized pathways were significant, with strong coefficients suggesting good fit between theorized and observed path relationships. The validated model articulates an explanatory pathway of Clinical Nurse Leader integrated care delivery, including Clinical Nurse Leader practices that result in improved care dynamics and patient outcomes. The validated model provides a basis for testing in practice to generate evidence that can be deployed across the healthcare spectrum. © 2016 John Wiley & Sons Ltd.

  17. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
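A minimal sketch of the repeated V-fold cross-validation the authors advocate, using only the standard library and a toy 1-nearest-neighbour model (an assumption for illustration): the key point is that each repeat re-randomizes the fold split, so the split-induced variance in the score becomes visible.

```python
# Hedged sketch: repeated V-fold cross-validation with a fresh random
# split on every repeat. The 1-NN "model" and toy data are illustrative.
import random

def repeated_vfold_cv(xs, ys, fit_predict, v=5, repeats=10, seed=0):
    """Returns per-repeat accuracies of `fit_predict` under V-fold CV.
    fit_predict(train, test_x) -> predicted labels for test_x."""
    rng = random.Random(seed)
    idx = list(range(len(xs)))
    scores = []
    for _ in range(repeats):
        rng.shuffle(idx)                          # fresh split each repeat
        folds = [idx[i::v] for i in range(v)]
        correct = 0
        for fold in folds:
            train = [(xs[i], ys[i]) for i in idx if i not in fold]
            preds = fit_predict(train, [xs[i] for i in fold])
            correct += sum(p == ys[i] for p, i in zip(preds, fold))
        scores.append(correct / len(xs))
    return scores

def one_nn(train, test_x):
    """Toy 1-nearest-neighbour classifier on scalar features."""
    return [min(train, key=lambda t: abs(t[0] - x))[1] for x in test_x]

if __name__ == "__main__":
    xs = [0.1, 0.2, 0.3, 0.4, 2.1, 2.2, 2.3, 2.4]
    ys = [0, 0, 0, 0, 1, 1, 1, 1]
    # On this cleanly separable toy data every repeat scores 1.0; on real
    # data the per-repeat spread is exactly the variance the paper warns about.
    print(repeated_vfold_cv(xs, ys, one_nn, v=4, repeats=5))
```

Averaging over repeats (and nesting a second CV loop for assessment, as the paper describes) reduces the chance that a single lucky split drives model selection.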

  18. Lagrangian Stochastic Dispersion Model IMS Model Suite and its Validation against Experimental Data

    International Nuclear Information System (INIS)

    Bartok, J.

    2010-01-01

    The dissertation presents the IMS Lagrangian Dispersion Model, a 'new generation' Slovak dispersion model of long-range transport developed by MicroStep-MIS. It solves the trajectory equation for a vast number of Lagrangian 'particles' and a stochastic equation that simulates the effects of turbulence. The model includes simulation of radioactive decay (full decay chains of more than 300 nuclides) and dry and wet deposition. It was integrated into IMS Model Suite, a system in which several models and modules can run and cooperate, e.g. the LAM model WRF preparing fine-resolution meteorological data for dispersion. The main theme of the work is validation of the dispersion model against the large-scale international campaigns CAPTEX and ETEX, two of the largest tracer experiments. Validation addressed the treatment of missing data and the interpolation of data into comparable temporal and spatial representations. The best model results were observed for ETEX I, standard results for the CAPTEX releases, and the worst results for ETEX II, known in the modelling community for meteorological conditions that models can hardly resolve. The IMS Lagrangian Dispersion Model was found to be a capable long-range dispersion model for slowly reacting or non-reacting chemicals and radioactive matter. The influence of input data on simulation quality is discussed within the work. Additional modules were prepared according to practical requirements: (a) recalculation of concentrations of radioactive pollutants into effective doses from inhalation, immersion in the plume and deposition; (b) dispersion of mineral dust, added and tested in a desert locality, where wind and soil moisture were first analysed and forecast by WRF; the result was qualitatively verified in a case study against satellite observations. (author)

  19. Experimental validation of models for Plasma Focus devices

    International Nuclear Information System (INIS)

    Rodriguez Palomino, Luis; Gonzalez, Jose; Clausse, Alejandro

    2003-01-01

    Plasma Focus (PF) devices are thermonuclear pulsators that produce short pulses of radiation (X-rays, charged particles and neutrons). Since the work of Filippov and Mather, they have been used to study plasma properties. Nowadays, interest in PF devices is focused on technological applications related to their use as pulsed neutron sources. For the numerical calculations, the inter-institutional PLADEMA (PLAsmas DEnsos MAgnetizados) network is developing three models, each useful at a different engineering stage of Plasma Focus design. One of the main objectives of this work is a comparative study of the influence of the different parameters involved in each model. To validate the results, several experimental measurements under different geometries and initial conditions were performed. (author)

  20. Analytical thermal model validation for Cassini radioisotope thermoelectric generator

    International Nuclear Information System (INIS)

    Lin, E.I.

    1997-01-01

    The Saturn-bound Cassini spacecraft is designed to rely, without precedent, on the waste heat from its three radioisotope thermoelectric generators (RTGs) to warm the propulsion module subsystem, and the RTG end dome temperature is a key determining factor of the amount of waste heat delivered. A previously validated SINDA thermal model of the RTG was the sole guide to understanding its complex thermal behavior, but displayed large discrepancies against some initial thermal development test data. A careful revalidation effort led to significant modifications and adjustments of the model, which resulted in a doubling of the radiative heat transfer from the heat source support assemblies to the end domes and brought the end dome and flange temperature predictions to within 2 C of the pertinent test data. The increased inboard end dome temperature has a considerable impact on thermal control of the spacecraft central body. The validation process offers an example of physically driven analytical model calibration with test data from not only an electrical simulator but also a nuclear-fueled flight unit, and has established the end dome temperatures of a flight RTG where no in-flight or ground-test data existed before

  1. Gap conductance model validation in the TASS/SMR-S code

    International Nuclear Information System (INIS)

    Ahn, Sang-Jun; Yang, Soo-Hyung; Chung, Young-Jong; Bae, Kyoo-Hwan; Lee, Won-Jae

    2011-01-01

    An advanced integral pressurized water reactor, SMART (System-Integrated Modular Advanced ReacTor), has been developed by KAERI (Korea Atomic Energy Research Institute). The purposes of the SMART are seawater desalination and electricity generation. For the safety evaluation and performance analysis of the SMART, the TASS/SMR-S (Transient And Setpoint Simulation/System-integrated Modular Reactor) code has been developed. In this paper, the gap conductance model of the code has been validated against another system code, MARS, and against experimental results. In the validation, the behaviors of fuel temperature and gap width are selected as the major parameters. According to the evaluation results, the TASS/SMR-S code predicts the behaviors of fuel temperature and gap width variation well, compared to the MARS calculation results and experimental data. (author)

  2. Development of an Auto-Validation Program for MARS Code Assessments

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2006-01-01

    MARS (Multi-dimensional Analysis of Reactor Safety) is a best-estimate thermal hydraulic system analysis code developed at KAERI. It is important for a thermal hydraulic computer code to be assessed against theoretical and experimental data to verify and validate the performance and integrity of its structure, models and correlations. Code assessment efforts for a complex thermal hydraulics code such as MARS can be tedious and time-consuming, and require a large amount of human intervention to transfer data into graphic form. Code developers produce many versions of a code during development, and each version needs to be verified for integrity. Thus, for the MARS code developers, it is desirable to have an automatic way of carrying out the code assessment calculations. In the present work, an Auto-Validation program that carries out these code assessment efforts has been developed. The program uses a user-supplied configuration file (with a '.vv' extension) which contains commands to read the input file, execute the user-selected MARS program, and generate result graphs. The program is useful when the same set of code assessments is repeated with different versions of the code. The program is written in the Delphi programming language and runs under the Microsoft Windows environment
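A configuration-driven assessment runner of this kind can be sketched briefly. The original program is written in Delphi; this Python parser for a '.vv'-style file is a re-imagining for illustration, and the command names and file contents are assumptions, not the actual MARS tool syntax.

```python
# Hedged sketch: parsing a hypothetical '.vv'-style assessment
# configuration into (command, argument) pairs that a driver loop
# could then dispatch (read input, run code version, plot results).

def parse_vv(text):
    """Parse lines like 'input edwards.inp' / 'run mars' / 'plot p_outlet'
    into (command, argument) pairs, skipping blanks and '#' comments."""
    cmds = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        cmd, _, arg = line.partition(" ")
        cmds.append((cmd, arg.strip()))
    return cmds

if __name__ == "__main__":
    sample = """
    # hypothetical assessment case: pipe blowdown benchmark
    input edwards.inp
    run mars
    plot p_outlet
    """
    print(parse_vv(sample))
    # [('input', 'edwards.inp'), ('run', 'mars'), ('plot', 'p_outlet')]
```

A driver would loop over the parsed pairs, invoking the selected code version for each `run` and a plotting routine for each `plot`, which is what removes the manual intervention the abstract describes.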

  3. The Development and Empirical Validation of an E-based Supply Chain Strategy Optimization Model

    DEFF Research Database (Denmark)

    Kotzab, Herbert; Skjoldager, Niels; Vinum, Thorkil

    2003-01-01

    Examines the formulation of supply chain strategies in complex environments. Argues that current state‐of‐the‐art e‐business and supply chain management, combined into the concept of e‐SCM, as well as the use of transaction cost theory, network theory and resource‐based theory, altogether can...... be used to form a model for analyzing supply chains with the purpose of reducing the uncertainty of formulating supply chain strategies. Presents e‐supply chain strategy optimization model (e‐SOM) as a way to analyze supply chains in a structured manner as regards strategic preferences for supply chain...... design, relations and resources in the chains with the ultimate purpose of enabling the formulation of optimal, executable strategies for specific supply chains. Uses research results for a specific supply chain to validate the usefulness of the model....

  4. Understanding and Measuring Evaluation Capacity: A Model and Instrument Validation Study

    Science.gov (United States)

    Taylor-Ritzler, Tina; Suarez-Balcazar, Yolanda; Garcia-Iriarte, Edurne; Henry, David B.; Balcazar, Fabricio E.

    2013-01-01

    This study describes the development and validation of the Evaluation Capacity Assessment Instrument (ECAI), a measure designed to assess evaluation capacity among staff of nonprofit organizations that is based on a synthesis model of evaluation capacity. One hundred and sixty-nine staff of nonprofit organizations completed the ECAI. The 68-item…

  5. On the development of a coupled regional climate-vegetation model RCM-CLM-CN-DV and its validation in Tropical Africa

    Science.gov (United States)

    Wang, Guiling; Yu, Miao; Pal, Jeremy S.; Mei, Rui; Bonan, Gordon B.; Levis, Samuel; Thornton, Peter E.

    2016-01-01

    This paper presents a regional climate system model RCM-CLM-CN-DV and its validation over Tropical Africa. The model development involves the initial coupling between the ICTP regional climate model RegCM4.3.4 (RCM) and the Community Land Model version 4 (CLM4), including models of carbon-nitrogen dynamics (CN) and vegetation dynamics (DV), and further improvements of the models. Model improvements derive from the new parameterization from CLM4.5 that addresses the well documented overestimation of gross primary production (GPP), a refinement of the stress deciduous phenology scheme in CN that addresses a spurious LAI fluctuation for drought-deciduous plants, and the incorporation of a survival rule into the DV model to prevent tropical broadleaf evergreen trees from growing in areas with a prolonged drought season. The impact of the modifications on model results is documented based on numerical experiments using various subcomponents of the model. The performance of the coupled model is then validated against observational data based on three configurations with increasing capacity: RCM-CLM with prescribed leaf area index and fractional coverage of different plant functional types (PFTs); RCM-CLM-CN with prescribed PFT coverage but prognostic plant phenology; and RCM-CLM-CN-DV, in which both the plant phenology and PFT coverage are simulated by the model. Results from these three models are compared against the FLUXNET up-scaled GPP and ET data, LAI and PFT coverage from remote sensing data including MODIS and GIMMS, University of Delaware precipitation and temperature data, and surface radiation data from MVIRI and SRB. Our results indicate that the models perform well in reproducing the physical climate and surface radiative budgets in the domain of interest. However, PFT coverage is significantly underestimated by the model over arid and semi-arid regions of Tropical Africa, caused by an underestimation of LAI in these regions by the CN model that gets exacerbated

  6. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    Science.gov (United States)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant-enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of the newly developed mathematical formulation, which satisfies the conservation laws of mass and energy and allows a sequential solution approach in which the governing equations are solved separately and implicitly. Through its application to a numerical experiment using a wettability alteration model, and through comparison with the numerical results of an existing chemical compositional model, the new model has proven to be practical, reliable and stable.

  7. Validation of battery-alternator model against experimental data - a first step towards developing a future power supply system

    Energy Technology Data Exchange (ETDEWEB)

    Boulos, A.M.; Burnham, K.J.; Mahtani, J.L. [Coventry University (United Kingdom). Control Theory and Applications Centre; Pacaud, C. [Jaguar Cars Ltd., Coventry (United Kingdom). Engineering Centre

    2004-01-01

    The electric power system of a modern vehicle has to supply enough electrical energy to drive numerous electrical and electronic systems and components. The electric power system of a vehicle consists of two major components: an alternator and a battery. A detailed understanding of the characteristics of the electric power system, electrical load demands and the operating environment, such as road conditions and vehicle laden weight, is required when the capacities of the generator and the battery are to be determined for a vehicle. In this study, a battery-alternator system has been developed and simulated in MATLAB/Simulink, and data obtained from vehicle tests have been used as a basis for validating the models. This is considered to be a necessary first step in the design and development of a new 42 V power supply system. (author)
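
    Validating a battery-alternator model of this kind reduces, at its core, to running a plant model over a measured load profile and scoring the fit against vehicle test data. The sketch below is illustrative only: the linear open-circuit-voltage model, the parameter values, and the function names are invented here, not taken from the study above.

```python
import math

def simulate_terminal_voltage(load_current, dt, capacity_ah,
                              r_internal=0.01, v_full=12.8, v_empty=11.8):
    """Crude battery model: open-circuit voltage falls linearly with
    state of charge; terminal voltage sags across the internal resistance."""
    soc = 1.0
    voltages = []
    for i_amps in load_current:
        soc -= i_amps * dt / (capacity_ah * 3600.0)  # coulomb counting
        v_ocv = v_empty + (v_full - v_empty) * soc
        voltages.append(v_ocv - i_amps * r_internal)
    return voltages

def rmse(modelled, measured):
    """Root-mean-square error between model output and test-rig data."""
    return math.sqrt(sum((m, x) is None or (m - x) ** 2
                         for m, x in zip(modelled, measured)) / len(modelled))
```

    A model would then be judged acceptable if the RMSE over a recorded drive cycle stays within an agreed tolerance.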

  8. Initial Development and Validation of the BullyHARM: The Bullying, Harassment, and Aggression Receipt Measure

    Science.gov (United States)

    Hall, William J.

    2017-01-01

    This article describes the development and preliminary validation of the Bullying, Harassment, and Aggression Receipt Measure (BullyHARM). The development of the BullyHARM involved a number of steps and methods, including a literature review, expert review, cognitive testing, readability testing, data collection from a large sample, reliability testing, and confirmatory factor analysis. A sample of 275 middle school students was used to examine the psychometric properties and factor structure of the BullyHARM, which consists of 22 items and 6 subscales: physical bullying, verbal bullying, social/relational bullying, cyber-bullying, property bullying, and sexual bullying. First-order and second-order factor models were evaluated. Results demonstrate that the first-order factor model had superior fit. Results of reliability testing indicate that the BullyHARM scale and subscales have very good internal consistency reliability. Findings indicate that the BullyHARM has good properties regarding content validation and respondent-related validation and is a promising instrument for measuring bullying victimization in school. PMID:28194041

  9. Experimental Validation of Flow Force Models for Fast Switching Valves

    DEFF Research Database (Denmark)

    Bender, Niels Christian; Pedersen, Henrik Clemmensen; Nørgård, Christian

    2017-01-01

    This paper comprises a detailed study of the forces acting on a Fast Switching Valve (FSV) plunger. The objective is to investigate to what extent different models are valid for design purposes. These models depend on the geometry of the moving plunger and the properties of the surrounding... to compare and validate different models, where an effort is directed towards capturing the fluid squeeze effect just before material-on-material contact. The test data is compared with simulation data relying solely on analytic formulations. The general dynamics of the plunger is validated...

  10. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage
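
    The permutation-test idea used here for model validation can be sketched in a few lines of code. The function name and the toy data below are invented for illustration; a real NTCP study would embed the permutation inside the full cross-validation loop rather than testing a single fitted model.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def permutation_p_value(predictions, outcomes, n_perm=2000, seed=0):
    """One-sided p-value for the observed prediction-outcome correlation,
    estimated by repeatedly shuffling the outcome labels."""
    rng = random.Random(seed)
    observed = pearson(predictions, outcomes)
    shuffled = list(outcomes)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if pearson(predictions, shuffled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # +1 avoids a p-value of exactly zero
```

    A small p-value indicates the model's apparent predictive power is unlikely to be a chance artifact of the sample.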

  11. Multiphysics modelling and experimental validation of high concentration photovoltaic modules

    International Nuclear Information System (INIS)

    Theristis, Marios; Fernández, Eduardo F.; Sumner, Mike; O'Donovan, Tadhg S.

    2017-01-01

    Highlights: • A multiphysics modelling approach for concentrating photovoltaics was developed. • An experimental campaign was conducted to validate the models. • The experimental results were in good agreement with the models. • The multiphysics modelling allows the concentrator’s optimisation. - Abstract: High concentration photovoltaics, equipped with high efficiency multijunction solar cells, have great potential in achieving cost-effective and clean electricity generation at utility scale. Such systems are more complex compared to conventional photovoltaics because of the multiphysics effect that is present. Modelling the power output of such systems is therefore crucial for their further market penetration. Following this line, a multiphysics modelling procedure for high concentration photovoltaics is presented in this work. It combines an open source spectral model, a single diode electrical model and a three-dimensional finite element thermal model. In order to validate the models and the multiphysics modelling procedure against actual data, an outdoor experimental campaign was conducted in Albuquerque, New Mexico using a high concentration photovoltaic monomodule that is thoroughly described in terms of its geometry and materials. The experimental results were in good agreement (within 2.7%) with the predicted maximum power point. This multiphysics approach is relatively more complex when compared to empirical models, but besides the overall performance prediction it can also provide better understanding of the physics involved in the conversion of solar irradiance into electricity. It can therefore be used for the design and optimisation of high concentration photovoltaic modules.

  12. Development, Validation and Summative Evaluation of Card Pairing Games for Selected Math 8 Topics

    Directory of Open Access Journals (Sweden)

    Ronald O. Ocampo

    2015-12-01

    Full Text Available The traditional classroom, where students are taught predominantly through the lecture-discussion method, places learning in a mathophobic atmosphere. Students exposed to this atmosphere often develop math anxiety and eventually come to hate the subject and the teacher. To address this, varied interactive strategies that create an atmosphere of discourse have been developed and promoted. The use of instructional games is one strategy that promotes active learning inside the classroom; instructional games support constructivist and social learning. This study aimed at developing, validating and evaluating card pairing games for specific topics in Math 8. The Research and Development (R&D) model was used. The card pairing games were validated by subject experts and experts in developing games. In evaluating the card pairing games, a quasi-experimental pretest-posttest design was used. Six card pairing games were developed for specific topics in Math 8; the card pairing games are highly valid based on the results of the validation; students exposed to the card pairing games became more homogeneous in performance; and students exposed to the card games showed enhanced academic performance. It is recommended to test the effectiveness of the card pairing games with other groups of students, to encourage math teachers to use the developed math card pairing games for classroom instruction, and to develop other card pairing games for specific topics in math.
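
    A quasi-experimental pretest-posttest evaluation of this kind is commonly summarised by a paired t statistic on the gain scores. A minimal sketch follows; the scores are invented for illustration, not the study's data.

```python
import math
import statistics

def paired_t_statistic(pretest, posttest):
    """Paired t statistic on pretest-to-posttest gain scores."""
    gains = [post - pre for pre, post in zip(pretest, posttest)]
    n = len(gains)
    # t = mean gain / standard error of the mean gain
    return statistics.mean(gains) / (statistics.stdev(gains) / math.sqrt(n))
```

    The resulting t value is compared against the t distribution with n - 1 degrees of freedom to judge whether the posttest improvement is significant.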

  13. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI’s ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest among users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge with ABMS, however, is the difficulty of validation and verification: because of frequent emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional methods. Finding appropriate validation techniques for ABM therefore seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  14. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest among users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge with ABMS, however, is the difficulty of validation and verification: because of frequent emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional methods. Finding appropriate validation techniques for ABM therefore seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  15. Validation of a probabilistic model for hurricane insurance loss projections in Florida

    International Nuclear Information System (INIS)

    Pinelli, J.-P.; Gurley, K.R.; Subramanian, C.S.; Hamid, S.S.; Pita, G.L.

    2008-01-01

    The Florida Public Hurricane Loss Model is one of the first public models accessible for scrutiny by the scientific community, incorporating state-of-the-art techniques in hurricane and vulnerability modeling. The model was developed for Florida and is applicable to other hurricane-prone regions where construction practice is similar. The 2004 hurricane season produced substantial losses in Florida and provided the means to validate and calibrate this model against actual claim data. This paper presents the predicted losses for several insurance portfolios corresponding to hurricanes Andrew, Charley, and Frances. The predictions are validated against the actual claim data. Physical damage predictions for external building components are also compared to observed damage. The analyses show that the predictive capabilities of the model were substantially improved after the calibration against the 2004 data. The methodology also shows that the predictive capabilities of the model could be enhanced if insurance companies reported more detailed information about the structures they insure and the types of damage they suffer. This model can be a powerful tool for the study of risk reduction strategies.
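
    Validating predicted losses against claim data means, at minimum, tracking bias and a normalized error per portfolio. The sketch below is illustrative; the function name and figures are invented, not the Florida model's actual numbers or metrics.

```python
def loss_validation_stats(predicted_losses, claimed_losses):
    """Mean bias and normalized mean absolute error of modeled
    portfolio losses against actual claim data."""
    n = len(predicted_losses)
    errors = [p - a for p, a in zip(predicted_losses, claimed_losses)]
    bias = sum(errors) / n            # systematic over/under-prediction
    nmae = sum(abs(e) for e in errors) / sum(claimed_losses)
    return bias, nmae
```

    A calibration step would then adjust vulnerability parameters until both statistics fall within acceptable bounds across the hurricane events used for validation.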

  16. Development and Validation of a Model to Determine Risk of Progression of Barrett's Esophagus to Neoplasia.

    Science.gov (United States)

    Parasa, Sravanthi; Vennalaganti, Sreekar; Gaddam, Srinivas; Vennalaganti, Prashanth; Young, Patrick; Gupta, Neil; Thota, Prashanthi; Cash, Brooks; Mathur, Sharad; Sampliner, Richard; Moawad, Fouad; Lieberman, David; Bansal, Ajay; Kennedy, Kevin F; Vargo, John; Falk, Gary; Spaander, Manon; Bruno, Marco; Sharma, Prateek

    2018-04-01

    A system is needed to determine the risk of patients with Barrett's esophagus for progression to high-grade dysplasia (HGD) and esophageal adenocarcinoma (EAC). We developed and validated a model to determine risk of progression to HGD or EAC in patients with BE, based on demographic data and endoscopic and histologic findings at the time of index endoscopy. We performed a longitudinal study of patients with BE at 5 centers in the United States and 1 center in the Netherlands enrolled in the Barrett's Esophagus Study database from 1985 through 2014. Patients were excluded from the analysis if they had less than 1 year of follow-up, were diagnosed with HGD or EAC within the past year, were missing baseline histologic data, or had no intestinal metaplasia. Seventy percent of the patients were used to derive the model and 30% were used for the validation study. The primary outcome was development of HGD or EAC during the follow-up period (median, 5.9 years). Survival analysis was performed using the Kaplan-Meier method. We assigned a specific number of points to each BE risk factor, and point totals (scores) were used to create categories of low, intermediate, and high risk. We used Cox regression to compute hazard ratios and 95% confidence intervals to determine associations between risk of progression and scores. Of 4584 patients in the database, 2697 were included in our analysis (84.1% men; 87.6% Caucasian; mean age, 55.4 ± 20.1 years; mean body mass index, 27.9 ± 5.5 kg/m2; mean length of BE, 3.7 ± 3.2 cm). During the follow-up period, 154 patients (5.7%) developed HGD or EAC, with an annual rate of progression of 0.95%. Male sex, smoking, length of BE, and baseline-confirmed low-grade dysplasia were significantly associated with progression. Scores assigned identified patients with BE that progressed to HGD or EAC with a c-statistic of 0.76 (95% confidence interval, 0.72-0.80; P Esophagus score) based on male sex, smoking, length of BE, and baseline low-grade dysplasia
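
    The survival analysis described in this record rests on the Kaplan-Meier product-limit estimator. A minimal sketch follows; the follow-up data are invented (times in years, event = 1 for observed progression, 0 for censoring), and the function name is ours.

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate.

    Returns a list of (event time, survival probability) pairs."""
    progressions = Counter(t for t, e in zip(times, events) if e)
    leaving = Counter(times)  # each subject leaves the risk set at its time
    at_risk = len(times)
    survival = 1.0
    curve = []
    for t in sorted(leaving):
        if progressions[t]:
            survival *= 1.0 - progressions[t] / at_risk
            curve.append((t, survival))
        at_risk -= leaving[t]
    return curve
```

    In a risk-score study such as this one, a separate curve would be drawn per score category (low, intermediate, high) and the separation between curves assessed.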

  17. Development and Validation of Web-Based Courseware for Junior Secondary School Basic Technology Students in Nigeria

    Directory of Open Access Journals (Sweden)

    Amosa Isiaka Gambari

    2018-02-01

    Full Text Available This research aimed to develop and validate a web-based courseware for junior secondary school basic technology students in Nigeria. In this study, a mixed-method quantitative pilot study design with qualitative components was used to test and ascertain the ease of development and validation of the web-based courseware. The Dick and Carey instructional system design model was adopted for developing the courseware. A convenience sampling technique was used in selecting the three content, computer and educational technology experts to validate the web-based courseware. Non-randomized and non-equivalent junior secondary school students from two schools were used for field trial validation. Four validating instruments were employed in conducting this study: (i) Content Validation Assessment Report (CVAR); (ii) Computer Expert Validation Assessment Report (CEAR); (iii) Educational Technology Experts Validation Assessment Report (ETEVAR); and (iv) Students Validation Questionnaire (SVQ). All the instruments were face and content validated. The SVQ was pilot tested and a reliability coefficient of 0.85 was obtained using Cronbach's alpha. CVAR, CEAR, and ETEVAR were administered to content specialists, computer experts, and educational technology experts, while the SVQ was administered to 83 JSS students from two selected secondary schools in Minna. The findings revealed that the process of developing web-based courseware using the Dick and Carey instructional system design was successful. In addition, the report from the validating team revealed that the web-based courseware is valuable for learning basic technology. It is therefore recommended that web-based courseware be produced to teach basic technology concepts on a large scale.

  18. Validation, Proof-of-Concept, and Postaudit of the Groundwater Flow and Transport Model of the Project Shoal Area

    International Nuclear Information System (INIS)

    Ahmed Hassan

    2004-01-01

    The groundwater flow and radionuclide transport model characterizing the Shoal underground nuclear test has been accepted by the State of Nevada Division of Environmental Protection. According to the Federal Facility Agreement and Consent Order (FFACO) between DOE and the State of Nevada, the next steps in the closure process for the site are then model validation (or postaudit), the proof-of-concept, and the long-term monitoring stage. This report addresses the development of the validation strategy for the Shoal model, needed for preparing the subsurface Corrective Action Decision Document-Corrective Action Plan and the development of the proof-of-concept tools needed during the five-year monitoring/validation period. The approach builds on a previous model, but is adapted and modified to the site-specific conditions and challenges of the Shoal site

  19. Validation, Proof-of-Concept, and Postaudit of the Groundwater Flow and Transport Model of the Project Shoal Area

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan

    2004-09-01

    The groundwater flow and radionuclide transport model characterizing the Shoal underground nuclear test has been accepted by the State of Nevada Division of Environmental Protection. According to the Federal Facility Agreement and Consent Order (FFACO) between DOE and the State of Nevada, the next steps in the closure process for the site are then model validation (or postaudit), the proof-of-concept, and the long-term monitoring stage. This report addresses the development of the validation strategy for the Shoal model, needed for preparing the subsurface Corrective Action Decision Document-Corrective Action Plan and the development of the proof-of-concept tools needed during the five-year monitoring/validation period. The approach builds on a previous model, but is adapted and modified to the site-specific conditions and challenges of the Shoal site.

  20. Validating Models of Clinical Word Recognition Tests for Spanish/English Bilinguals

    Science.gov (United States)

    Shi, Lu-Feng

    2014-01-01

    Purpose: Shi and Sánchez (2010) developed models to predict the optimal test language for evaluating Spanish/English (S/E) bilinguals' word recognition. The current study intended to validate their conclusions in a separate bilingual listener sample. Method: Seventy normal-hearing S/E bilinguals varying in language profile were included.…

  1. Development and validation of the pro-environmental behaviour scale for women's health.

    Science.gov (United States)

    Kim, HyunKyoung

    2017-05-01

    This study aimed to develop and test the Pro-environmental Behaviour Scale for Women's Health. Women adopt sustainable behaviours and alter their lifestyles to protect the environment and their health from environmental pollution. The conceptual framework of pro-environmental behaviours was based on Rogers' protection motivation theory and Weinstein's precaution adoption process model. A cross-sectional design was used for instrument development. The instrument development process consisted of a literature review, personal in-depth interviews and focus group interviews. The sample comprised 356 adult women recruited in April-May 2012 in South Korea using quota sampling. For construct validity, exploratory factor analysis was conducted to examine the factor structure, after which convergent and discriminant validity and known-group comparisons were tested. Principal component analysis yielded 17 items with four factors, including 'women's health protection,' 'chemical exposure prevention,' 'alternative consumption,' and 'community-oriented behaviour'. The Cronbach's α was 0.81. Convergent and discriminant validity were supported by performing correlations with other environmental-health and health-behaviour measures. Nursing professionals can reliably use the instrument to assess women's behaviours, which protect their health and the environment. © 2016 John Wiley & Sons Ltd.
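
    The internal consistency figure reported above is Cronbach's α, computed from item variances and the variance of total scores. A minimal sketch, with invented toy data (one list per item, holding that item's scores across respondents):

```python
import statistics

def cronbach_alpha(item_columns):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances /
    variance of respondents' total scores)."""
    k = len(item_columns)
    sum_item_var = sum(statistics.variance(col) for col in item_columns)
    totals = [sum(scores) for scores in zip(*item_columns)]
    return k / (k - 1) * (1.0 - sum_item_var / statistics.variance(totals))
```

    Values around 0.8, as in this scale, are conventionally read as good internal consistency.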

  2. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    -fractures with flowing water and rock with porosity accessible only by diffusion. The approach furthermore assumes that the properties within the two porosity domains are averaged and also the transfer between the two domains is averaged. It is an important validation issue to verify that effective averaging of parameters can be performed and that suitable values can be derived. It can be shown that matrix interaction properties along a flow path can be integrated to an effective value and if the matrix depth can be considered as infinite, effective values may be derived also for the diffusion and sorption parameters. Thus, it is possible to derive effective parameters for sorbing radionuclides incorporating the total matrix effects along a flow path. This is strictly valid only for cases with no dispersion, but gives a good approximation as long as dispersion does not dominate the transport. FARF31 has been tested and compared with analytical solutions and other models and was found to correspond well within a wide range of input parameters. Support and documentation on how to use FARF31 are two important components to avoid calculation mistakes and obtain trustworthy results. The documentation describes handling and updates of the code. Test cases have been constructed which can be used to check updates and be used as templates. The development of the code is kept under source code control to fulfil quality assurance. The model is deemed to be well suited for performance assessments within the SKB framework

  3. Achievement Emotions in Technology Enhanced Learning: Development and Validation of Self-Report Instruments in the Italian Context

    Directory of Open Access Journals (Sweden)

    Daniela Raccanello

    2015-02-01

    Full Text Available The increased use of technology within the educational field gives rise to the need for developing valid instruments to measure key constructs associated with performance. We present some self-report instruments developed and/or validated in the Italian context that could be used to assess achievement emotions and their correlates, within the theoretical framework of Pekrun's control-value model. First, we present data related to the construction of two instruments developed to assess ten achievement emotions: the Brief Achievement Emotions Questionnaire, BR-AEQ, used with college students, and the Graduated Achievement Emotions Set, GR-AES, used with primary school students. Second, we describe data concerning the validation within the Italian context of two instruments assessing achievement goals as antecedents of achievement emotions: the Achievement Goal Questionnaire-Revised, AGQ-R, and its more recent version based on the 3 × 2 achievement goal model.

  4. Development and validation of a numerical model for cross-section optimization of a multi-part probe for soft tissue intervention.

    Science.gov (United States)

    Frasson, L; Neubert, J; Reina, S; Oldfield, M; Davies, B L; Rodriguez Y Baena, F

    2010-01-01

    The popularity of minimally invasive surgical procedures is driving the development of novel, safer and more accurate surgical tools. In this context a multi-part probe for soft tissue surgery is being developed in the Mechatronics in Medicine Laboratory at Imperial College, London. This study reports an optimization procedure using finite element methods, for the identification of an interlock geometry able to limit the separation of the segments composing the multi-part probe. An optimal geometry was obtained and the corresponding three-dimensional finite element model validated experimentally. Simulation results are shown to be consistent with the physical experiments. The outcome of this study is an important step in the provision of a novel miniature steerable probe for surgery.

  5. Modeling and Validation of Environmental Suitability for Schistosomiasis Transmission Using Remote Sensing.

    Science.gov (United States)

    Walz, Yvonne; Wegmann, Martin; Dech, Stefan; Vounatsou, Penelope; Poda, Jean-Noël; N'Goran, Eliézer K; Utzinger, Jürg; Raso, Giovanna

    2015-11-01

    Schistosomiasis is the most widespread water-based disease in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and human water contact patterns. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. We investigated the potential of remote sensing to characterize habitat conditions of parasite and intermediate host snails and discuss the relevance for public health. We employed high-resolution remote sensing data, environmental field measurements, and ecological data to model environmental suitability for schistosomiasis-related parasite and snail species. The model was developed for Burkina Faso using a habitat suitability index (HSI). The plausibility of remote sensing habitat variables was validated using field measurements. The established model was transferred to different ecological settings in Côte d'Ivoire and validated against readily available survey data from school-aged children. Environmental suitability for schistosomiasis transmission was spatially delineated and quantified by seven habitat variables derived from remote sensing data. The strengths and weaknesses highlighted by the plausibility analysis showed that temporal dynamic water and vegetation measures were particularly useful to model parasite and snail habitat suitability, whereas the measurement of water surface temperature and topographic variables did not perform appropriately. The transferability of the model showed significant relations between the HSI and infection prevalence in study sites of Côte d'Ivoire. A predictive map of environmental suitability for schistosomiasis transmission can support measures to gain and sustain control. This is particularly relevant as emphasis is shifting from morbidity control to interrupting transmission. Further validation of our mechanistic model needs to be complemented by field data of parasite- and snail
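
    A habitat suitability index (HSI) of this kind is, in essence, a weighted combination of normalized habitat variables. The sketch below is illustrative only: the variable names and weights are invented, not the seven remote-sensing variables of the Burkina Faso model.

```python
def habitat_suitability_index(scores, weights):
    """Weighted habitat suitability index in [0, 1].

    scores:  habitat variable -> suitability score in [0, 1]
             (e.g. normalized from remote sensing layers)
    weights: habitat variable -> relative importance"""
    total = sum(weights.values())
    return sum(scores[name] * w for name, w in weights.items()) / total
```

    Validation then amounts to checking whether the HSI correlates with observed infection prevalence, as done here against survey data from school-aged children.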

  6. Development and validation of the Chinese Quality of Life Instrument.

    Science.gov (United States)

    Leung, Kwok-fai; Liu, Feng-bin; Zhao, Li; Fang, Ji-qian; Chan, Kelvin; Lin, Li-zhu

    2005-04-16

    This paper describes the development of the Chinese Quality of Life Instrument (ChQOL), which is a self-report health status instrument. Chinese Medicine relies very much on asking about the subjective feelings of patients in the process of diagnosis and the monitoring of treatment. For thousands of years, Chinese Medicine practitioners have accumulated a wealth of experience in asking questions about the health of their patients based on the concept of health in Chinese Medicine. These experiences were then transformed into questions for the ChQOL. It is believed that the ChQOL can contribute to existing Patient Reported Outcome measures. This paper outlines the concept of health and disease in Traditional Chinese Medicine, the building of the conceptual framework of the ChQOL, the steps of drafting, selecting and validating the items, and the psychometric properties of the ChQOL. The development of the ChQOL was based on the concept of health in Traditional Chinese Medicine with a theory-driven approach. Based on the results of a literature review, the research team developed an initial model of health which encompassed the concept of health in TCM. An expert panel was then invited to comment and give suggestions for improvement of the initial model. According to their suggestions, the model was refined and a set of initial items for the ChQOL was drafted. The refined model, together with the key domains, facets and initial items of the ChQOL, were then mailed to a sample of about 100 Chinese medicine practitioners throughout Mainland China for their comments and advice. A revised set of items was developed for linguistic testing by a convenience sample consisting of both healthy people and people who attended Chinese Medicine treatment. After that, an item pool was developed for field-testing. A field test was conducted on a convenience sample of healthy and patient subjects to determine the construct validity and psychometric properties of the ChQOL. Construct validity was

  7. Development and validation of the Chinese Quality of Life Instrument

    Directory of Open Access Journals (Sweden)

    Chan Kelvin

    2005-04-01

    Full Text Available Abstract Background This paper describes the development of the Chinese Quality of Life Instrument (ChQOL), which is a self-report health status instrument. Chinese Medicine relies very much on asking about the subjective feelings of patients in the process of diagnosis and the monitoring of treatment. For thousands of years, Chinese Medicine practitioners have accumulated a wealth of experience in asking questions about the health of their patients based on the concept of health in Chinese Medicine. These experiences were then transformed into questions for the ChQOL. It is believed that the ChQOL can contribute to existing Patient Reported Outcome measures. This paper outlines the concept of health and disease in Traditional Chinese Medicine, the building of the conceptual framework of the ChQOL, the steps of drafting, selecting and validating the items, and the psychometric properties of the ChQOL. Methods The development of the ChQOL was based on the concept of health in Traditional Chinese Medicine with a theory-driven approach. Based on the results of a literature review, the research team developed an initial model of health which encompassed the concept of health in TCM. An expert panel was then invited to comment and give suggestions for improvement of the initial model. According to their suggestions, the model was refined and a set of initial items for the ChQOL was drafted. The refined model, together with the key domains, facets and initial items of the ChQOL, were then mailed to a sample of about 100 Chinese medicine practitioners throughout Mainland China for their comments and advice. A revised set of items was developed for linguistic testing by a convenience sample consisting of both healthy people and people who attended Chinese Medicine treatment. After that, an item pool was developed for field-testing. A field test was conducted on a convenience sample of healthy and patient subjects to determine the construct validity and psychometric

  8. When is the Anelastic Approximation a Valid Model for Compressible Convection?

    Science.gov (United States)

    Alboussiere, T.; Curbelo, J.; Labrosse, S.; Ricard, Y. R.; Dubuffet, F.

    2017-12-01

    Compressible convection is ubiquitous in large natural systems such as planetary atmospheres and stellar and planetary interiors. Its modelling is notoriously more difficult than in the case where the Boussinesq approximation applies. One reason for this difficulty was put forward by Ogura and Phillips (1961): the compressible equations generate sound waves with very short time scales which need to be resolved. This is why they introduced an anelastic model, based on an expansion of the solution around an isentropic hydrostatic profile. How accurate is that anelastic model? What are the conditions for its validity? To answer these questions, we have developed a numerical model for the full set of compressible equations and compared its solutions with those of the corresponding anelastic model. We considered a simple rectangular 2D Rayleigh-Bénard configuration and restricted the analysis to infinite Prandtl numbers. This choice is valid for convection in the mantles of rocky planets and, more importantly, leads to a zero Mach number, removing the question of the interference of acoustic waves with convection. In that simplified context, we used the entropy balances (that of the full set of equations and that of the anelastic model) to investigate the differences between exact and anelastic solutions. We found that the validity of the anelastic model is dictated by two conditions: first, the superadiabatic temperature difference must be small compared with the adiabatic temperature difference (as expected), ε = ΔT_SA / ΔT_a << 1; and second, the product of ε and the Nusselt number must also be small, ε Nu << 1.
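    The two validity conditions stated in this abstract are simple to check numerically. Below is an illustrative sketch (not the authors' code; the temperature differences and Nusselt number are invented for illustration):

```python
def anelastic_validity(dT_superadiabatic, dT_adiabatic, nusselt, tol=0.1):
    """Check the two anelastic-validity criteria reported in the abstract:
    epsilon = dT_SA / dT_a << 1  and  epsilon * Nu << 1.
    Returns (epsilon, epsilon * Nu, both_small)."""
    eps = dT_superadiabatic / dT_adiabatic
    return eps, eps * nusselt, (eps < tol and eps * nusselt < tol)

# Hypothetical example values: a 1 K superadiabatic difference on top of a
# 100 K adiabatic difference, with Nu = 5.
eps, eps_nu, ok = anelastic_validity(1.0, 100.0, 5.0)
print(eps, eps_nu, ok)  # 0.01 0.05 True
```

Both numbers must be small simultaneously; a vigorous flow (large Nu) can invalidate the anelastic approximation even when ε itself is small.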

  9. Prediction and validation of pool fire development in enclosures by means of CFD Models for risk assessment of nuclear power plants (Poolfire) - Report year 2

    International Nuclear Information System (INIS)

    Van Hees, P.; Wahlqvist, J.; Kong, D.; Hostikka, S.; Sikanen, T.; Husted, B.; Magnusson, T.; Joerud, F.

    2013-05-01

    Fires in nuclear power plants can be an important hazard for the overall safety of the facility. One of the typical fire sources is a pool fire. It is therefore important to have good knowledge of the fire behaviour of pool fires and to be able to predict the heat release rate by predicting the mass loss rate. This project envisages developing a pyrolysis model to be used in CFD models. This report covers the activities of the second year: an overview of the experiments conducted, further development and validation of the models, and the case studies to be selected in year 3. (Author)

  10. Prediction and validation of pool fire development in enclosures by means of CFD Models for risk assessment of nuclear power plants (Poolfire) - Report year 2

    Energy Technology Data Exchange (ETDEWEB)

    van Hees, P.; Wahlqvist, J.; Kong, D. [Lund Univ., Lund (Sweden); Hostikka, S.; Sikanen, T. [VTT Technical Research Centre of Finland (Finland); Husted, B. [Haugesund Univ. College, Stord (Norway); Magnusson, T. [Ringhals AB, Vaeroebacka (Sweden); Joerud, F. [European Spallation Source (ESS), Lund (Sweden)

    2013-05-15

    Fires in nuclear power plants can be an important hazard for the overall safety of the facility. One of the typical fire sources is a pool fire. It is therefore important to have good knowledge of the fire behaviour of pool fires and to be able to predict the heat release rate by predicting the mass loss rate. This project envisages developing a pyrolysis model to be used in CFD models. This report covers the activities of the second year: an overview of the experiments conducted, further development and validation of the models, and the case studies to be selected in year 3. (Author)

  11. Development and Validation of a Dissolution Test Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a dissolution test method for dissolution release of artemether and lumefantrine from tablets. Methods: A single dissolution method for evaluating the in vitro release of artemether and lumefantrine from tablets was developed and validated. The method comprised of a dissolution medium of ...

  12. Development and validation of a postpartum depression risk score in delivered women, Iran

    Directory of Open Access Journals (Sweden)

    Mohammad R Maracy

    2012-01-01

    Full Text Available Background: Investigators describe a dramatic increase in the incidence of mood disorders after childbirth, with the largest risk in the 90 days after delivery. This study was designed to develop a relatively simple screening tool from the significant variables associated with postpartum depression (PPD), and to validate it, in order to detect delivered women at high risk of PPD. Materials and Methods: In this cross-sectional study, 6,627 of a total of 7,300 delivered women, 2-12 months after delivery, were recruited and screened for PPD. Split-half validation was used to develop the risk score: the training data set was used to develop the model and the validation data set was used to validate it. The risk factors of PPD were modeled using multiple logistic regression analysis to compute the β coefficients and odds ratios (OR) for the variables associated with possible PPD. Calibration was checked using the Hosmer and Lemeshow test. A score for the independent variables contributing to PPD was calculated. Cutoff points were chosen as a trade-off between the sensitivity and specificity of risk scores derived from the PPD model, using the receiver operating characteristic (ROC) curve. Results: The predicted and observed PPD were not different (P value = 0.885). The area under the ROC curve (SE) was 0.611 (0.008) for predicting PPD. Using the suggested cut-off point of -0.702, the proportion of participants screening positive for PPD was 70.9% (sensitivity; 95% CI: 69.5, 72.3), while the proportion screening negative was 60.1% (specificity; 95% CI: 58.2, 62.1). Conclusion: Despite the relatively low sensitivity and specificity found in this study, the risk score could be a simple, practical and useful screening tool to identify individuals at high risk of PPD in the target population.
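    The cut-off selection described in this record (trading off sensitivity against specificity along the ROC curve) is commonly implemented by maximizing Youden's J statistic. A minimal sketch, with invented scores and labels rather than the study's data:

```python
def best_cutoff(scores, labels):
    """Pick the cut-off maximising sensitivity + specificity - 1 (Youden's J).
    scores: risk scores; labels: 1 = case, 0 = non-case."""
    best = (None, -1.0)
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < c and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < c and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best[1]:
            best = (c, j)
    return best

# Toy data: logistic-regression-style scores for 6 women (1 = screened PPD)
scores = [-1.2, -0.9, -0.7, -0.4, 0.1, 0.6]
labels = [0, 0, 0, 1, 1, 1]
cut, j = best_cutoff(scores, labels)
print(cut, j)  # -0.4 1.0
```

On real data the maximum J is well below 1; the study's cut-off of -0.702 reflects exactly this kind of compromise between the two error rates.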

  13. DTU PMU Laboratory Development - Testing and Validation

    DEFF Research Database (Denmark)

    Garcia-Valle, Rodrigo; Yang, Guang-Ya; Martin, Kenneth E.

    2010-01-01

    This is a report of the results of phasor measurement unit (PMU) laboratory development and testing done at the Centre for Electric Technology (CET), Technical University of Denmark (DTU). Analysis of the PMU performance first required the development of tools to convert the DTU PMU data into IEEE ... standard, and the validation is done for the DTU-PMU via a validated commercial PMU. The commercial PMU has been tested in the authors' previous efforts, where the response can be expected to follow known patterns and provide confirmation about the test system to confirm the design and settings. ... In a nutshell, having 2 PMUs that observe the same signals provides validation of the operation and flags questionable results with more certainty. Moreover, the performance and accuracy of the DTU-PMU is tested, acquiring good and precise results when compared with a commercial phasor measurement device, PMU-1. ...

  14. Validation of a Numerical Model for Dynamic Three-Dimensional Railway Bridge Analysis by Comparison with a Small-Scale Laboratory Model

    DEFF Research Database (Denmark)

    Bucinskas, Paulius; Sneideris, Jonas; Agapii, Liuba

    2018-01-01

    The aim of the paper is to analyse to what extent a small-scale experimental model can be applied in order to develop and validate a numerical model for dynamic analysis of a multi-span railway bridge interacting with the underlying soil. For this purpose a small-scale model of a bridge structure is...

  15. BIOMOVS: an international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1988-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (author)

  16. BIOMOVS: An international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1987-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (orig.)

  17. Development and Initial Validation of the Italian Mood Scale (ITAMS) for Use in Sport and Exercise Contexts

    Directory of Open Access Journals (Sweden)

    Alessandro Quartiroli

    2017-09-01

    Full Text Available The current study presents initial validation statistics for the Italian Mood Scale (ITAMS), a culturally and linguistically validated Italian version of the Brunel Mood Scale (BRUMS; Terry and Lane, 2010). The ITAMS was administered to 950 sport participants (659 females), who ranged in age from 16 to 63 years (M = 25.03, SD = 7.62). In the first stage of the validation process, statistical procedures in Mplus were used to evaluate the measurement model. Multigroup exploratory structural equation modeling supported the hypothesized 6-factor measurement model for males and females separately and for the combined sample. Analysis of the scale scores using SPSS provided further support for the construct validity of the ITAMS, with hypothesized relationships observed between ITAMS scores and measures of depression and affect. The development and validation of the ITAMS opens the way for mood-related research and for sport or exercise interventions requiring mood assessment in an Italian-language context.

  18. Construct Validation--Community College Instructional Development Inventory

    Science.gov (United States)

    Xiong, Soua; Delgado, Nexi; Wood, J. Luke; Harris, Frank, III

    2017-01-01

    This white paper describes the construct validation of the Community College Instructional Development Inventory (CC-IDI). The CC-IDI is an institutional assessment tool designed to inform professional development programming for instructional faculty. The instrument was developed to serve as a standardized assessment tool to determine the…

  19. On various metrics used for validation of predictive QSAR models with applications in virtual screening and focused library design.

    Science.gov (United States)

    Roy, Kunal; Mitra, Indrani

    2011-07-01

    Quantitative structure-activity relationships (QSARs) have important applications in drug discovery research, environmental fate modeling, property prediction, etc. Validation has been recognized as a very important step in QSAR model development. Because one of the main objectives of QSAR modeling is to predict the activity/property/toxicity of new chemicals falling within the domain of applicability of the developed models, and because QSARs are being used for regulatory decisions, checking the reliability of the models and the confidence of their predictions is essential; both can be judged during the validation process. One prime application of a statistically significant QSAR model is virtual screening for molecules with improved potency based on the pharmacophoric features and the descriptors appearing in the QSAR model. Validated QSAR models may also be utilized for the design of focused libraries which may subsequently be screened for the selection of hits. The present review focuses on various metrics used for validation of predictive QSAR models, together with an overview of the application of QSAR models to virtual screening and focused library design for diverse series of compounds, citing some recent examples.
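    One widely reported external-validation metric of the kind this review surveys is the predictive squared correlation coefficient Q²_F1 = 1 - PRESS / TSS, where PRESS is the sum of squared prediction errors on the test set and TSS is computed against the training-set mean. A minimal sketch with made-up activity values (not taken from the review):

```python
def q2_f1(y_test_obs, y_test_pred, y_train_mean):
    """External-validation Q^2_F1: 1 - PRESS / TSS, with TSS referenced
    to the mean of the training-set responses."""
    press = sum((o - p) ** 2 for o, p in zip(y_test_obs, y_test_pred))
    tss = sum((o - y_train_mean) ** 2 for o in y_test_obs)
    return 1.0 - press / tss

# Hypothetical observed vs predicted test-set activities; training mean = 2.0
print(q2_f1([1.0, 2.0, 3.0], [1.1, 1.9, 3.2], 2.0))  # ≈ 0.97
```

Values close to 1 indicate good external predictivity; a common rule of thumb in the QSAR literature requires Q² above roughly 0.5-0.6 before a model is considered predictive.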

  20. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  1. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  2. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  3. Improvement and Validation of an Aerosol Deposition Model in the GAMMA-FP, a Fission Product Analysis Module for VHTRs

    International Nuclear Information System (INIS)

    Yoon, Churl; Lim, Hong Sik

    2013-01-01

    GAMMA-FP (GAs Multicomponent Mixture Analysis-Fission Products module) consists of gaseous and aerosol fission product analysis modules. The aerosol FP module adopts a multi-component, multi-sectional aerosol analysis model developed on the basis of the MAEROS model. As the first step of FP module development, the MAEROS model was implemented and examined against analytic solutions and experimental data by Yoo et al. An aerosol transport model was then developed, implemented in the GAMMA-FP code, and verified. In this study, the aerosol deposition model in the GAMMA-FP code was improved by adopting recent achievements and was validated against available experimental data. The improved deposition model was successfully validated against the STORM SR-11 deposition test: the simulation matched the experimental data well. In future studies, the model for aerosol deposition caused by flow irregularities will be implemented and validated against the TRANSAT bend effect test.

  4. Improvement and Validation of an Aerosol Deposition Model in the GAMMA-FP, a Fission Product Analysis Module for VHTRs

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Churl; Lim, Hong Sik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    GAMMA-FP (GAs Multicomponent Mixture Analysis-Fission Products module) consists of gaseous and aerosol fission product analysis modules. The aerosol FP module adopts a multi-component, multi-sectional aerosol analysis model developed on the basis of the MAEROS model. As the first step of FP module development, the MAEROS model was implemented and examined against analytic solutions and experimental data by Yoo et al. An aerosol transport model was then developed, implemented in the GAMMA-FP code, and verified. In this study, the aerosol deposition model in the GAMMA-FP code was improved by adopting recent achievements and was validated against available experimental data. The improved deposition model was successfully validated against the STORM SR-11 deposition test: the simulation matched the experimental data well. In future studies, the model for aerosol deposition caused by flow irregularities will be implemented and validated against the TRANSAT bend effect test.

  5. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    Science.gov (United States)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model-based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged, it is often not among the most popular tasks in ocean modelling. In order to ease the validation work, a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time series at stations, vertical profiles, surface fields or along-track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, covers all parts of the validation process, ranging from read-in procedures for datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. the read-in procedures, forms a module in which all available functions of this particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks; new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned to each validation task in user-specific settings, which are stored externally in so-called namelists and gather all information about the datasets used, as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre. In this way the performance of any new product
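    The modular design this abstract describes (one fixed validation schedule, interchangeable functions per module, configuration in external namelists) can be sketched in a few lines. This is a hypothetical Python analogue, not the MATLAB toolbox itself; all names and the namelist keys are invented:

```python
# Registries play the role of the toolbox "modules": each holds the
# interchangeable functions for one part of the validation process.
READERS, METRICS = {}, {}

def register(registry, name):
    """Decorator adding a function to a module registry under a name."""
    def deco(fn):
        registry[name] = fn
        return fn
    return deco

@register(READERS, "station_timeseries")
def read_station(namelist):
    # Stand-in for reading model output and observations from the
    # paths/metadata a real namelist would carry.
    return namelist["model"], namelist["obs"]

@register(METRICS, "bias")
def bias(model, obs):
    return sum(m - o for m, o in zip(model, obs)) / len(model)

def validate(namelist):
    """The single, fixed validation schedule: read, then score."""
    model, obs = READERS[namelist["reader"]](namelist)
    return {name: fn(model, obs) for name, fn in METRICS.items()}

result = validate({"reader": "station_timeseries",
                   "model": [1.0, 2.0], "obs": [0.5, 1.5]})
print(result)  # {'bias': 0.5}
```

New readers or metrics are added by registering another function; the schedule in `validate` never changes, which is the key property the abstract attributes to the toolbox.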

  6. Validation of fracture flow models in the Stripa project

    International Nuclear Information System (INIS)

    Herbert, A.; Dershowitz, W.; Long, J.; Hodgkinson, D.

    1991-01-01

    One of the objectives of Phase III of the Stripa Project is to develop and evaluate approaches for the prediction of groundwater flow and nuclide transport in a specific unexplored volume of the Stripa granite and to make a comparison with data from field measurements. During the first stage of the project, a prediction of inflow to the D-holes, an array of six parallel closely spaced 100 m boreholes, was made based on data from six other boreholes. These data included fracture geometry, stress, single-borehole geophysical logging, crosshole and reflection radar and seismic tomograms, head monitoring and single-hole packer test measurements. Maps of fracture traces on the drift walls have also been made. The D-holes are located along a future Validation Drift which will be excavated. The water inflow to the D-holes has been measured in an experiment called the Simulated Drift Experiment. The paper reviews the Simulated Drift Experiment validation exercise. Following a discussion of the approach to validation, the characterization data and their preliminary interpretation are summarised and commented upon. The exercise has shown that it is feasible to carry through all the complex and interconnected tasks associated with the gathering and interpretation of characterization data, the development and application of complex models, and the comparison with measured inflows. It has provided detailed feedback to the experimental and theoretical work required for measurements and predictions of flow into the Validation Drift. Computer codes used: CHANGE, FRACMAN, MAFIC, NAPSAC and TRINET. 2 figs., 2 tabs., 19 refs

  7. Validation of a risk prediction model for Barrett's esophagus in an Australian population.

    Science.gov (United States)

    Ireland, Colin J; Gordon, Andrea L; Thompson, Sarah K; Watson, David I; Whiteman, David C; Reed, Richard L; Esterman, Adrian

    2018-01-01

    Esophageal adenocarcinoma is a disease with a high mortality rate, and its only known precursor is Barrett's esophagus (BE). While screening for BE is not cost-effective at the population level, targeted screening might be beneficial. We have developed a risk prediction model to identify people with BE, and here we present the external validation of this model. A cohort study was undertaken to validate the risk prediction model for BE. Individuals with endoscopy- and histopathology-proven BE completed a questionnaire containing variables previously identified as risk factors for this condition. Their responses were combined with data from a population sample for analysis. Risk scores were derived for each participant. Overall performance of the risk prediction model was assessed in terms of calibration and discrimination. Scores from 95 individuals with BE and 636 individuals from the general population were analyzed. The Brier score was 0.118, suggesting reasonable overall performance. The area under the receiver operating characteristic curve was 0.83 (95% CI 0.78-0.87). The Hosmer-Lemeshow statistic was p = 0.14. Minimizing false positives and false negatives, the model achieved a sensitivity of 74% and a specificity of 73%. This study has validated a risk prediction model for BE that has a higher sensitivity than previous models.
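    The two overall-performance measures reported here, the Brier score and the area under the ROC curve, are straightforward to compute. An illustrative sketch with fabricated probabilities and outcomes (not the study's data):

```python
def brier(probs, labels):
    """Mean squared difference between predicted probability and outcome;
    lower is better, 0 is perfect."""
    return sum((p - y) ** 2 for p, y in zip(probs, labels)) / len(probs)

def auc(probs, labels):
    """Probability that a randomly chosen case is ranked above a randomly
    chosen control (ties count 0.5) - equivalent to the area under the ROC."""
    pos = [p for p, y in zip(probs, labels) if y == 1]
    neg = [p for p, y in zip(probs, labels) if y == 0]
    wins = sum(1.0 if a > b else 0.5 if a == b else 0.0
               for a in pos for b in neg)
    return wins / (len(pos) * len(neg))

probs = [0.9, 0.7, 0.4, 0.2]   # hypothetical predicted BE probabilities
labels = [1, 1, 0, 0]          # 1 = BE confirmed, 0 = population control
print(brier(probs, labels), auc(probs, labels))  # Brier ≈ 0.075, AUC = 1.0
```

Note that the Brier score mixes calibration and discrimination, which is why studies such as this one report it alongside the AUC and a calibration test rather than on its own.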

  8. Challenges of forest landscape modeling - simulating large landscapes and validating results

    Science.gov (United States)

    Hong S. He; Jian Yang; Stephen R. Shifley; Frank R. Thompson

    2011-01-01

    Over the last 20 years, we have seen a rapid development in the field of forest landscape modeling, fueled by both technological and theoretical advances. Two fundamental challenges have persisted since the inception of FLMs: (1) balancing realistic simulation of ecological processes at broad spatial and temporal scales with computing capacity, and (2) validating...

  9. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and their experimental validation was organized by the 'convection' section of the French society of thermal engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machineries or other energy-related applications, and have been selected for ETDE. (J.S.)

  10. Development of coupled models and their validation against experiments -DECOVALEX project

    International Nuclear Information System (INIS)

    Stephansson, O.; Jing, L.; Kautsky, F.

    1995-01-01

    DECOVALEX is an international co-operative research project for theoretical and experimental studies of coupled thermal, hydrological and mechanical processes in hard rocks. Different mathematical models and computer codes have been developed by research teams from different countries. These models and codes are used to study the so-called Bench Mark Test and Test Case problems developed within this project. Bench-Mark Tests are defined as hypothetical initial-boundary value problems of a generic nature, and Test Cases are experimental investigations of part or full aspects of coupled thermo-hydro-mechanical processes in hard rocks. Analytical and semi-analytical solutions related to coupled T-H-M processes are also developed for problems with simpler geometry and initial-boundary conditions. These solutions are developed to verify algorithms and their computer implementations. In this contribution the motivation, organization and approaches and current status of the project are presented, together with definitions of Bench-Mark Tests and Test Case problems. The definition and part of results for a BMT problem (BMT3) for a near-field repository model are described as an example. (authors). 3 refs., 11 figs., 3 tabs

  11. Development and validation of a model for high pressure liquid poison injection for CANDU-6 shutdown system no.2

    International Nuclear Information System (INIS)

    Rhee, B.-W.; Jeong, C.J.; Choi, J.H.; Yoo, S.-Y.

    2002-01-01

    In a CANDU reactor, one of the two reactor shutdown systems is the liquid poison injection system, which injects highly pressurized liquid neutron poison into the moderator tank via small holes in the nozzle pipes. To ensure the safe shutdown of the reactor, the poison curtains generated by the jets must provide quick and sufficient negative reactivity during the early stage of the accident. Generating the neutron cross sections necessary for this work requires the poison concentration distribution during the transient. In this study, a set of models for analyzing the transient poison concentration induced by the high-pressure poison injection jets activated upon reactor trip in a CANDU-6 moderator tank has been developed and used to generate the poison concentration distribution of the poison curtains injected into the vacant region between the calandria tube banks. The poison injection rate through the jet holes drilled in the nozzle pipes is obtained by a 1-D transient hydrodynamic code called ALITRIG, and this injection rate provides the inlet boundary condition to a 3-D CFD model of the moderator tank based on CFX4.3, an AEA Technology CFD code, which simulates the formation and growth of the poison jet curtains inside the moderator tank. The model is validated against a poison injection experiment performed at BARC, India, and another poison jet experiment for the generic CANDU-6 performed at AECL, Canada. In conclusion, this set of models predicts the experimental results in a physically reasonable and consistent manner. (author)

  12. Development and preliminary validation of a screen for ...

    African Journals Online (AJOL)

    Development and preliminary validation of a screen for interpersonal childhood trauma experiences among school-going youth in Durban, South Africa. ... validity in the sense that all scales were significantly correlated with scores on clinical measures of post-traumatic stress disorder (PTSD) and/or complex PTSD.

  13. [Risk Prediction Using Routine Data: Development and Validation of Multivariable Models Predicting 30- and 90-day Mortality after Surgical Treatment of Colorectal Cancer].

    Science.gov (United States)

    Crispin, Alexander; Strahwald, Brigitte; Cheney, Catherine; Mansmann, Ulrich

    2018-06-04

    Quality control, benchmarking, and pay for performance (P4P) require valid indicators and statistical models allowing adjustment for differences in the risk profiles of the patient populations of the respective institutions. Using hospital remuneration data for measuring quality and modelling patient risks has been criticized by clinicians. Here we explore the potential of prediction models for 30- and 90-day mortality after colorectal cancer surgery based on routine data. Full census of a major statutory health insurer. Surgical departments throughout the Federal Republic of Germany. 4,283 and 4,124 insurants with major surgery for treatment of colorectal cancer during 2013 and 2014, respectively. Age, sex, primary and secondary diagnoses as well as tumor locations as recorded in the hospital remuneration data according to §301 SGB V. 30- and 90-day mortality. Elixhauser comorbidities, Charlson conditions, and Charlson scores were generated from the ICD-10 diagnoses. Multivariable prediction models were developed using a penalized logistic regression approach (logistic ridge regression) in a derivation set (patients treated in 2013). Calibration and discrimination of the models were assessed in an internal validation sample (patients treated in 2014) using calibration curves, Brier scores, receiver operating characteristic (ROC) curves and the areas under the ROC curves (AUC). 30- and 90-day mortality rates in the derivation set were 5.7% and 8.4%, respectively. The corresponding values in the validation sample were 5.9% and, once more, 8.4%. Models based on Elixhauser comorbidities exhibited the highest discriminatory power, with AUC values of 0.804 (95% CI: 0.776-0.832) and 0.805 (95% CI: 0.782-0.828) for 30- and 90-day mortality. The Brier scores for these models were 0.050 (95% CI: 0.044-0.056) and 0.067 (95% CI: 0.060-0.074), similar to the models based on Charlson conditions. Regardless of the model, low predicted probabilities were well calibrated, while
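    The penalized (ridge) logistic regression used in this record shrinks the coefficients of many correlated comorbidity indicators toward zero, stabilizing the risk estimates. A minimal sketch using plain gradient descent on toy data (an illustrative re-implementation under stated assumptions, not the authors' code):

```python
import math

def ridge_logistic(X, y, lam=1.0, lr=0.1, steps=2000):
    """Logistic regression with an L2 (ridge) penalty of strength lam on the
    weights (the intercept is conventionally left unpenalized)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(steps):
        gw = [lam * wi for wi in w]  # gradient of the ridge penalty term
        gb = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            for j, xj in enumerate(xi):
                gw[j] += (p - yi) * xj       # data term of the gradient
            gb += p - yi
        w = [wi - lr * gi / len(X) for wi, gi in zip(w, gw)]
        b -= lr * gb / len(X)
    return w, b

# Toy data: the first "comorbidity" indicator perfectly predicts the outcome,
# the second is uninformative; ridge keeps both coefficients bounded.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 1, 1]
w, b = ridge_logistic(X, y)
# w[0] ends up clearly positive, w[1] near zero
```

Without the penalty, perfectly separating indicators would drive coefficients toward infinity; the ridge term is what makes such models usable on sparse comorbidity data.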

  14. Development and validation of a physics-based urban fire spread model

    OpenAIRE

    HIMOTO, Keisuke; TANAKA, Takeyoshi

    2008-01-01

    A computational model for fire spread in a densely built urban area is developed. The model is distinct from existing models in that it explicitly describes fire spread phenomena with physics-based knowledge achieved in the field of fire safety engineering. In the model, urban fire is interpreted as an ensemble of multiple building fires; that is, the fire spread is simulated by predicting behaviors of individual building fires under the thermal influence of neighboring building fires. Adopte...

  15. Validation of the DeLone and McLean Information Systems Success Model

    OpenAIRE

    Ojo, Adebowale I.

    2017-01-01

    Objectives This study is an adaptation of the widely used DeLone and McLean information system success model in the context of hospital information systems in a developing country. Methods A survey research design was adopted in the study. A structured questionnaire was used to collect data from 442 health information management personnel in five Nigerian teaching hospitals. A structural equation modeling technique was used to validate the model's constructs. Results It was revealed that syst...

  16. Development and validation of a managerial decision making self-efficacy questionnaire

    Directory of Open Access Journals (Sweden)

    Wim Myburgh

    2015-05-01

    Full Text Available Orientation: Self-efficacy beliefs, given their task-specific nature, are likely to influence managers' perceived decision-making competence, depending on fluctuations in their nature and strength as non-ability contributors. Research purpose: The present research describes the conceptualisation, design and measurement of managerial decision-making self-efficacy. Motivation for the study: The absence of a domain-specific measure of the decision-making self-efficacy of managers motivated the development of the Managerial Decision-making Self-efficacy Questionnaire (MDMSEQ). Research approach, design and method: A cross-sectional study was conducted on a non-probability convenience sample of managers from various organisations in South Africa. Statistical analysis focused on the construct validity and reliability of items, using exploratory and confirmatory factor analysis to test the factorial validity of the measure. Main findings: The research offers confirmatory validation of the factorial structure of the MDMSEQ. The results of two studies involving 455 experienced managers (Study 1, n = 193; Study 2, n = 292) evidenced a multidimensional structure and demonstrated respectable subscale internal consistencies. Findings also demonstrated that the MDMSEQ shared little common variance with confidence and problem-solving self-efficacy beliefs. In addition, several model fit indices suggested a reasonable to good fit for the measurement model. Practical/managerial implications: The findings have implications for practical applications in employment selection and development with regard to managerial decision-making. Absence of an assessment of self-efficacy beliefs may introduce systematic, non-performance-related variance into managerial decision-making outcomes, in spite of the abilities that managers possess. Contribution/value-add: Research on the volition-undermining effect of self-efficacy beliefs has been remarkably prominent

  17. Black liquor combustion validated recovery boiler modeling, five-year report

    Energy Technology Data Exchange (ETDEWEB)

    Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

    1996-08-01

    The objective of this project was to develop a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The project originated in October 1990 and was scheduled to run for four years. At that time, there was considerable emphasis on developing accurate predictions of the physical carryover of macroscopic particles of partially burnt black liquor and smelt droplets out of the furnace, since this was seen as the main cause of boiler plugging. This placed a major emphasis on gas flow patterns within the furnace and on the mass loss rates and swelling and shrinking rates of burning black liquor drops. As work proceeded on developing the recovery boiler furnace model, it became apparent that some recovery boilers encounter serious plugging problems even when physical carryover was minimal. After the original four-year period was completed, the project was extended to address this issue. The objective of the extended project was to improve the utility of the models by including the black liquor chemistry relevant to air emissions predictions and aerosol formation, and by developing the knowledge base and computational tools to relate furnace model outputs to fouling and plugging of the convective sections of the boilers. The work done to date includes CFD model development and validation, acquisition of information on black liquor combustion fundamentals and development of improved burning models, char bed model development, and model application and simplification.

  18. Development and Validity Testing of Belief Measurement Model in Buddhism for Junior High School Students at Chiang Rai Buddhist Scripture School: An Application for Multitrait-Multimethod Analysis

    Science.gov (United States)

    Chaidi, Thirachai; Damrongpanich, Sunthorapot

    2016-01-01

    The purposes of this study were to develop a model to measure the belief in Buddhism of junior high school students at Chiang Rai Buddhist Scripture School, and to determine construct validity of the model for measuring the belief in Buddhism by using Multitrait-Multimethod analysis. The samples were 590 junior high school students at Buddhist…

  19. Experimental Validation and Model Verification for a Novel Geometry ICPC Solar Collector

    DEFF Research Database (Denmark)

    Perers, Bengt; Duff, William S.; Daosukho, Jirachote

    A novel geometry ICPC solar collector was developed at the University of Chicago and Colorado State University. A ray tracing model has been designed to investigate the optical performance of both the horizontal and vertical fin versions of this collector. Solar radiation is modeled as discrete...... to the desired incident angle of the sun’s rays, performance of the novel ICPC solar collector at various specified angles along the transverse and longitudinal evacuated tube directions were experimentally determined. To validate the ray tracing model, transverse and longitudinal performance predictions...... at the corresponding specified incident angles are compared to the Sandia results. A 100 m2 336 Novel ICPC evacuated tube solar collector array has been in continuous operation at a demonstration project in Sacramento California since 1998. Data from the initial operation of the array are used to further validate...

  20. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    Science.gov (United States)

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required, and their use must be agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  1. VALIDATION OF SIMULATION MODELS FOR DIFFERENTLY DESIGNED HEAT-PIPE EVACUATED TUBULAR COLLECTORS

    DEFF Research Database (Denmark)

    Fan, Jianhua; Dragsted, Janne; Furbo, Simon

    2007-01-01

    Differently designed heat-pipe evacuated tubular collectors have been investigated theoretically and experimentally. The theoretical work has included development of two TRNSYS [1] simulation models for heat-pipe evacuated tubular collectors utilizing solar radiation from all directions. One model...... coating on both sides. The input to the models is thus not a simple collector efficiency expression but the actual collector geometry. In this study, the TRNSYS models are validated with measurements for four differently designed heat-pipe evacuated tubular collectors. The collectors are produced...

  2. Development, standardization and validation of social anxiety scale ...

    African Journals Online (AJOL)

    Little attention has been given to social anxiety in Nigeria despite its debilitating effects on the sufferers. The objective of this study was to develop, standardize and validate an instrument (Social Anxiety Scale) with high coefficients of Cronbach's alpha internal consistency, split-half reliability and construct validity.

  3. Development and validation of a new knowledge, attitude, belief and practice questionnaire on leptospirosis in Malaysia.

    Science.gov (United States)

    Zahiruddin, Wan Mohd; Arifin, Wan Nor; Mohd-Nazri, Shafei; Sukeri, Surianti; Zawaha, Idris; Bakar, Rahman Abu; Hamat, Rukman Awang; Malina, Osman; Jamaludin, Tengku Zetty Maztura Tengku; Pathman, Arumugam; Mas-Harithulfadhli-Agus, Ab Rahman; Norazlin, Idris; Suhailah, Binti Samsudin; Saudi, Siti Nor Sakinah; Abdullah, Nurul Munirah; Nozmi, Noramira; Zainuddin, Abdul Wahab; Aziah, Daud

    2018-03-07

    In Malaysia, leptospirosis is considered an endemic disease, with sporadic outbreaks following rainy or flood seasons. The objective of this study was to develop and validate a new knowledge, attitude, belief and practice (KABP) questionnaire on leptospirosis for use in urban and rural populations in Malaysia. The study comprised development and validation stages. The development phase encompassed a literature review, expert panel review, focus-group testing, and evaluation. The validation phase consisted of exploratory and confirmatory parts to verify the psychometric properties of the questionnaire. A total of 214 and 759 participants were recruited from two Malaysian states, Kelantan and Selangor respectively, for the validation phase. The participants comprised urban and rural communities with a high reported incidence of leptospirosis. The knowledge section of the validation phase utilized item response theory (IRT) analysis. The attitude and belief sections utilized exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). The development phase resulted in a questionnaire that included four main sections: knowledge, attitude, belief, and practice. In the exploratory phase, as shown by the IRT analysis of knowledge about leptospirosis, the difficulty and discrimination values of the items were acceptable, with the exception of two items. Based on the EFA, the psychometric properties of the attitude, belief, and practice sections were poor. Thus, these sections were revised, and no further factor analysis of the practice section was conducted. In the confirmatory stage, the difficulty and discrimination values of the items in the knowledge section remained within the acceptable range. The CFA of the attitude section resulted in a good-fitting two-factor model. The CFA of the belief section retained a low number of items, although the analysis resulted in a good fit for the final three-factor model. Based on the IRT analysis and factor
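The item difficulty and discrimination values reported for the knowledge section are the parameters of a logistic IRT model. A minimal sketch of the standard two-parameter logistic (2PL) model, with purely illustrative parameter values rather than the study's estimates:

```python
import math

def irt_2pl_probability(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that a
    respondent with ability `theta` answers an item correctly, given
    the item's discrimination `a` and difficulty `b`."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An easy, highly discriminating item vs. a hard, weakly discriminating
# one, for a respondent of average ability (illustrative values only).
easy = irt_2pl_probability(theta=0.0, a=2.0, b=-1.0)
hard = irt_2pl_probability(theta=0.0, a=0.5, b=2.0)
print(round(easy, 3), round(hard, 3))  # → 0.881 0.269
```

Items with very low discrimination `a` or extreme difficulty `b` are the ones an IRT screening of this kind would flag for removal.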

  4. Screening for postdeployment conditions: development and cross-validation of an embedded validity scale in the neurobehavioral symptom inventory.

    Science.gov (United States)

    Vanderploeg, Rodney D; Cooper, Douglas B; Belanger, Heather G; Donnell, Alison J; Kennedy, Jan E; Hopewell, Clifford A; Scott, Steven G

    2014-01-01

    To develop and cross-validate internal validity scales for the Neurobehavioral Symptom Inventory (NSI). Four existing data sets were used: (1) outpatient clinical traumatic brain injury (TBI)/neurorehabilitation database from a military site (n = 403), (2) National Department of Veterans Affairs TBI evaluation database (n = 48 175), (3) Florida National Guard nonclinical TBI survey database (n = 3098), and (4) a cross-validation outpatient clinical TBI/neurorehabilitation database combined across 2 military medical centers (n = 206). Secondary analysis of existing cohort data to develop (study 1) and cross-validate (study 2) internal validity scales for the NSI. The NSI, Mild Brain Injury Atypical Symptoms, and Personality Assessment Inventory scores. Study 1: Three NSI validity scales were developed, composed of 5 unusual items (Negative Impression Management [NIM5]), 6 low-frequency items (LOW6), and the combination of 10 nonoverlapping items (Validity-10). Cut scores maximizing sensitivity and specificity on these measures were determined, using a Mild Brain Injury Atypical Symptoms score of 8 or more as the criterion for invalidity. Study 2: The same validity scale cut scores again resulted in the highest classification accuracy and optimal balance between sensitivity and specificity in the cross-validation sample, using a Personality Assessment Inventory Negative Impression Management scale with a T score of 75 or higher as the criterion for invalidity. The NSI is widely used in the Department of Defense and Veterans Affairs as a symptom-severity assessment following TBI, but is subject to symptom overreporting or exaggeration. This study developed embedded NSI validity scales to facilitate the detection of invalid response styles. The NSI Validity-10 scale appears to hold considerable promise for validity assessment when the NSI is used as a population-screening tool.
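Cut scores that maximize the balance of sensitivity and specificity, as in Study 1, are commonly chosen by sweeping candidate thresholds and maximizing Youden's J. A minimal sketch on invented toy data (the scores and criterion labels below are not the study's):

```python
def best_cut_score(scores, invalid):
    """Pick the cut score maximizing Youden's J = sensitivity +
    specificity - 1, flagging cases at or above the cut as invalid.
    `invalid` holds the criterion labels (True = invalid responding)."""
    best = None
    for cut in sorted(set(scores)):
        tp = sum(s >= cut and y for s, y in zip(scores, invalid))
        fn = sum(s < cut and y for s, y in zip(scores, invalid))
        tn = sum(s < cut and not y for s, y in zip(scores, invalid))
        fp = sum(s >= cut and not y for s, y in zip(scores, invalid))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, cut, sens, spec)
    return best

# Toy data: higher Validity-10-style scores in the invalid group.
scores  = [2, 3, 4, 8, 9, 10, 1, 7, 12, 5]
invalid = [False, False, False, True, True, True, False, False, True, False]
j, cut, sens, spec = best_cut_score(scores, invalid)
print(cut, round(sens, 2), round(spec, 2))  # → 8 1.0 1.0
```

On real data the groups overlap, so the chosen cut trades sensitivity against specificity rather than achieving both perfectly as in this toy example.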

  5. Experimental validation of a combustion kinetics based multi-zone model for natural gas-diesel RCCI engines

    NARCIS (Netherlands)

    Mikulski, M.; Bekdemir, C.; Willems, F.P.T.

    2016-01-01

    This paper presents the validation results of TNO's combustion model designed to support RCCI control development. In-depth validation was performed on a multi-cylinder heavy-duty engine operating in RCCI mode on natural gas and diesel fuel. It was shown that the adopted approach is able to

  6. Development and validation of a preoperative prediction model for colorectal cancer T-staging based on MDCT images and clinical information.

    Science.gov (United States)

    Sa, Sha; Li, Jing; Li, Xiaodong; Li, Yongrui; Liu, Xiaoming; Wang, Defeng; Zhang, Huimao; Fu, Yu

    2017-08-15

    This study aimed to establish and evaluate the efficacy of a prediction model for colorectal cancer T-staging. T-staging was positively correlated with the level of carcinoembryonic antigen (CEA), expression of carbohydrate antigen 19-9 (CA19-9), wall deformity, blurred outer edges, fat infiltration, infiltration into the surrounding tissue, tumor size and wall thickness. Age, location, enhancement rate and enhancement homogeneity were negatively correlated with T-staging. The predictive results of the model were consistent with the pathological gold standard, and the kappa value was 0.805. The total accuracy of staging improved from 51.04% to 86.98% with the proposed model. The clinical, imaging and pathological data of 611 patients with colorectal cancer (419 patients in the training group and 192 patients in the validation group) were collected. A Spearman correlation analysis was used to evaluate the relationship between these factors and pathological T-staging. A prediction model was trained with the random forest algorithm. T-staging of the patients in the validation group was predicted by both the prediction model and the traditional method. The consistency, accuracy, sensitivity, specificity and area under the curve (AUC) were used to compare the efficacy of the two methods. The newly established comprehensive model can improve the predictive efficiency of preoperative colorectal cancer T-staging.
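The consistency statistic reported above (kappa = 0.805) is Cohen's kappa, which corrects raw agreement between predicted and pathological stages for chance. A minimal sketch on invented toy stage labels (not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement between two categorical
    ratings, corrected for the agreement expected by chance."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Toy predicted vs. pathological T-stages (invented for illustration).
predicted = ["T1", "T2", "T2", "T3", "T3", "T3", "T4", "T2"]
truth     = ["T1", "T2", "T2", "T3", "T2", "T3", "T4", "T2"]
print(round(cohens_kappa(predicted, truth), 3))  # → 0.818
```

A kappa above 0.8 is conventionally read as almost perfect agreement, which is why the reported 0.805 supports consistency with the pathological gold standard.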

  7. First Steps to Develop and Validate a CFPD Model in Order to Support the Design of Nose-to-Brain Delivered Biopharmaceuticals.

    Science.gov (United States)

    Engelhardt, Lucas; Röhm, Martina; Mavoungou, Chrystelle; Schindowski, Katharina; Schafmeister, Annette; Simon, Ulrich

    2016-06-01

    Aerosol particle deposition in the human nasal cavity is of high interest in particular for intranasal central nervous system (CNS) drug delivery via the olfactory cleft. The objective of this study was the development and comparison of a numerical and experimental model to investigate various parameters for olfactory particle deposition within the complex anatomical nasal geometry. Based on a standardized nasal cavity, a computational fluid and particle dynamics (CFPD) model was developed that enables the variation and optimization of different parameters, which were validated by in vitro experiments using a constructed rapid-prototyped human nose model. For various flow rates (5 to 40 l/min) and particle sizes (1 to 10 μm), the airflow velocities, the calculated particle airflow patterns and the particle deposition correlated very well with the experiment. Particle deposition was investigated numerically by varying particle sizes at constant flow rate and vice versa assuming the particle size distribution of the used nebulizer. The developed CFPD model could be directly translated to the in vitro results. Hence, it can be applied for parameter screening and will contribute to the improvement of aerosol particle deposition at the olfactory cleft for CNS drug delivery in particular for biopharmaceuticals.
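Whether a particle of a given size deposits by inertial impaction at a given flow rate is commonly characterized by the Stokes number, which is one way to reason about the size/flow-rate sweeps described above. A sketch with rough, illustrative airway values (the velocity and length scale below are assumptions, not parameters from the paper):

```python
def stokes_number(particle_diameter_m, flow_velocity, characteristic_length,
                  particle_density=1000.0, air_viscosity=1.8e-5):
    """Stokes number Stk = rho_p * d_p^2 * U / (18 * mu * L): ratio of
    particle response time to the flow time scale.  Stk >> 1 favors
    inertial (impaction) deposition; Stk << 1 means particles tend to
    follow the airflow."""
    return (particle_density * particle_diameter_m ** 2 * flow_velocity) / (
        18.0 * air_viscosity * characteristic_length
    )

# 1 um vs. 10 um particles in a nasal-passage-like geometry
# (illustrative velocity and length scale).
small = stokes_number(1e-6, flow_velocity=2.0, characteristic_length=5e-3)
large = stokes_number(10e-6, flow_velocity=2.0, characteristic_length=5e-3)
print(f"{small:.2e} {large:.2e}")
```

Because Stk scales with the square of particle diameter, the 1 to 10 μm range studied spans a hundred-fold change in inertial behavior, which is why deposition patterns shift so strongly across that range.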

  8. Sensitivity analysis of a validated subject-specific finite element model of the human craniofacial skeleton.

    Science.gov (United States)

    Szwedowski, T D; Fialkov, J; Whyne, C M

    2011-01-01

    Developing a more complete understanding of the mechanical response of the craniofacial skeleton (CFS) to physiological loads is fundamental to improving treatment for traumatic injuries, reconstruction due to neoplasia, and deformities. Characterization of the biomechanics of the CFS is challenging due to its highly complex structure and heterogeneity, motivating the utilization of experimentally validated computational models. As such, the objective of this study was to develop, experimentally validate, and parametrically analyse a patient-specific finite element (FE) model of the CFS to elucidate a better understanding of the factors that are of intrinsic importance to the skeletal structural behaviour of the human CFS. An FE model of a cadaveric craniofacial skeleton was created from subject-specific computed tomography data. The model was validated based on bone strain measurements taken under simulated physiological-like loading through the masseter and temporalis muscles (which are responsible for the majority of craniofacial physiologic loading due to mastication). The baseline subject-specific model using locally defined cortical bone thicknesses produced the strongest correlation to the experimental data (r2 = 0.73). Large effects on strain patterns arising from small parametric changes in cortical thickness suggest that the very thin bony structures present in the CFS are crucial to characterizing the local load distribution in the CFS accurately.

  9. Evaluating Domestic Hot Water Distribution System Options with Validated Analysis Models

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, E. [Alliance for Residential Building Innovation, Davis, CA (United States); Hoeschele, E. [Alliance for Residential Building Innovation, Davis, CA (United States)

    2014-09-01

    A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and energy efficiency of various distribution systems. A full distribution system model was developed in the Transient System Simulation Tool (TRNSYS), validated using field monitoring data, and then exercised in a number of climates to understand climate impact on performance. In this study, the Building America team built upon previous analysis modeling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations of climate, load, distribution type, insulation and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation time efficient) distribution models for annual whole house simulation programs.

  10. Understanding Dynamic Model Validation of a Wind Turbine Generator and a Wind Power Plant: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard; Zhang, Ying Chen; Gevorgian, Vahan; Kosterev, Dmitry

    2016-09-01

    Regional reliability organizations require power plants to validate the dynamic models that represent them to ensure that power systems studies are performed with the best representation of the components installed. In the process of validating a wind power plant (WPP), one must be cognizant of the parameter settings of the wind turbine generators (WTGs) and the operational settings of the WPP. Validation of the dynamic model of a WPP must be performed periodically. This is because the control parameters of the WTGs and the other supporting components within a WPP may be modified to comply with new grid codes or upgrades to the WTG controller with new capabilities developed by the turbine manufacturers or requested by the plant owners or operators. The diversity within a WPP affects the way we represent it in a model. Diversity within a WPP may be found in the way the WTGs are controlled, the wind resource, the layout of the WPP (electrical diversity), and the type of WTGs used. Each group of WTGs constitutes a significant portion of the output power of the WPP, and their unique and salient behaviors should be represented individually. The objective of this paper is to illustrate the process of dynamic model validations of WTGs and WPPs, the available recorded data that must be screened before being used for the dynamic validations, and the assumptions made in the dynamic models of the WTG and WPP that must be understood. Without understanding the correct process, the validations may lead to the wrong representations of the WTG and WPP modeled.

  11. The predictive and discriminant validity of the zone of proximal development.

    Science.gov (United States)

    Meijer, J; Elshout, J J

    2001-03-01

    Dynamic measurement procedures are supposed to uncover the zone of proximal development and to increase predictive validity in comparison to conventional, static measurement procedures. Two alternative explanations for the discrepancies between static and dynamic measurements were investigated. The first focuses on Vygotsky's learning potential theory, the second considers the role of anxiety tendency during test taking. If test anxious tendencies are mitigated by dynamic testing procedures, in particular the availability of assistance, the concept of the zone of proximal development may be superfluous in explaining the differences between the outcomes of static and dynamic measurement. Participants were students from secondary education in the Netherlands. They were tested repeatedly in grade three as well as in grade four. Participants were between 14 and 17 years old; their average age was 15.4 years with a standard deviation of .52. Two types of mathematics tests were used in a longitudinal experiment. The first type of test consisted of open-ended items, which participants had to solve completely on their own. With the second type of test, assistance was available to participants during the test. The latter so-called learning test was conceived of as a dynamic testing procedure. Furthermore, a test anxiety questionnaire was administered repeatedly. Structural equation modelling was used to analyse the data. Apart from emotionality and worry, lack of self-confidence appears to be an important constituent of test anxiety. The learning test appears to contribute to the predictive validity of conventional tests and thus a part of Vygotsky's claims were substantiated. Moreover, the mere inclusion of a test anxiety factor into an explanatory model for the gathered data is not sufficient. Apart from test anxiety and mathematical ability it is necessary to assume a factor which may be construed as mathematics learning potential. 
The results indicate that the observed

  12. Development and validation of Dutch version of Lasater Clinical Judgment Rubric in hospital practice: An instrument design study.

    Science.gov (United States)

    Vreugdenhil, Jettie; Spek, Bea

    2018-03-01

    Clinical reasoning in patient care is a skill that cannot be observed directly. So far, no reliable, valid instrument exists for the assessment of nursing students' clinical reasoning skills in hospital practice. Lasater's clinical judgment rubric (LCJR), based on Tanner's model "Thinking like a nurse", has been tested, mainly in academic simulation settings. The aim is to develop a Dutch version of the LCJR (D-LCJR) and to test its psychometric properties when used in a hospital traineeship context. A mixed-model approach was used to develop and to validate the instrument. Ten dedicated educational units in a university hospital. A well-mixed group of 52 nursing students, nurse coaches and nurse educators. A Delphi panel developed the D-LCJR. Students' clinical reasoning skills were assessed "live" by nurse coaches, nurse educators and students who rated themselves. The psychometric properties tested during the assessment process are reliability, reproducibility, content validity and construct validity, the latter by testing two hypotheses: 1) a positive correlation between assessed and self-reported sum scores (convergent validity) and 2) a linear relation between experience and sum score (clinical validity). The obtained D-LCJR was found to be internally consistent, Cronbach's alpha 0.93. The rubric is also reproducible, with intraclass correlations between 0.69 and 0.78. Experts judged it to be content valid. Both hypotheses were confirmed as significant, supporting evidence for construct validity. The translated and modified LCJR is a promising tool for the evaluation of nursing students' development in clinical reasoning in hospital traineeships, by students, nurse coaches and nurse educators. More evidence on construct validity is necessary, in particular for students at the end of their hospital traineeship. Based on our research, the D-LCJR applied in hospital traineeships is a usable and reliable tool. Copyright © 2017 Elsevier Ltd. All rights reserved.
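The internal-consistency figure reported above (Cronbach's alpha 0.93) follows from the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch on invented rubric ratings (rows = students, columns = rubric items; not the study's data):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha from per-respondent item-score rows:
    alpha = k/(k-1) * (1 - sum(item variances) / var(total score))."""
    k = len(item_scores[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[i] for row in item_scores]) for i in range(k)]
    total_var = variance([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy rubric ratings on a 1-4 scale (invented for illustration).
ratings = [
    [3, 3, 4, 3],
    [2, 2, 2, 3],
    [4, 4, 4, 4],
    [1, 2, 1, 1],
    [3, 4, 3, 3],
]
print(round(cronbach_alpha(ratings), 2))  # → 0.95
```

High alpha here reflects items that rise and fall together across respondents, which is exactly what an internally consistent rubric should show.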

  13. Validation process of simulation model; Proceso de validacion de modelos de simulacion

    Energy Technology Data Exchange (ETDEWEB)

    San Isidro Pindado, M J

    1998-12-31

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case, and it is empirical in a residual sense, because the conclusions rest on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model and can also serve as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: - Sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) or MCSA (Monte-Carlo sensitivity analysis). - Search for the optimal domains of the input parameters; a procedure based on Monte-Carlo methods and cluster techniques has been developed to find these domains. - Residual analysis, performed in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model of buildings, ESP, is presented, studying the behaviour of building components in a Test Cell of the LECE at CIEMAT. (Author)
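The Monte-Carlo sensitivity analysis (MCSA) step can be sketched by sampling the inputs uniformly over their ranges and correlating each input with the simulation output. The `thermal_model` below is a hypothetical stand-in for a detailed simulation such as ESP, and its inputs and ranges are invented for illustration:

```python
import random

def thermal_model(u_value, solar_gain, infiltration):
    """Hypothetical stand-in for a detailed building simulation:
    a toy steady-state indoor-temperature estimate (degrees C)."""
    return 20.0 + 2.0 * solar_gain - 5.0 * u_value - 1.5 * infiltration

def monte_carlo_sensitivity(model, ranges, n=5000, seed=0):
    """Monte-Carlo sensitivity analysis: sample each input uniformly
    over its range and estimate sensitivity as the correlation
    between each input and the model output."""
    rng = random.Random(seed)
    samples = [
        {name: rng.uniform(lo, hi) for name, (lo, hi) in ranges.items()}
        for _ in range(n)
    ]
    outputs = [model(**s) for s in samples]

    def corr(xs, ys):
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    return {name: corr([s[name] for s in samples], outputs) for name in ranges}

ranges = {"u_value": (0.2, 2.0), "solar_gain": (0.0, 1.0), "infiltration": (0.1, 1.0)}
sens = monte_carlo_sensitivity(thermal_model, ranges)
for name, r in sorted(sens.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: r = {r:+.2f}")
```

Inputs with near-zero correlation can be frozen, and the influential ones carried forward to the optimal-domain search described in the second step.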

  14. Validation process of simulation model; Proceso de validacion de modelos de simulacion

    Energy Technology Data Exchange (ETDEWEB)

    San Isidro Pindado, M.J.

    1997-12-31

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case, and it is empirical in a residual sense, because the conclusions rest on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model and can also serve as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: - Sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) or MCSA (Monte-Carlo sensitivity analysis). - Search for the optimal domains of the input parameters; a procedure based on Monte-Carlo methods and cluster techniques has been developed to find these domains. - Residual analysis, performed in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model of buildings, ESP, is presented, studying the behaviour of building components in a Test Cell of the LECE at CIEMAT. (Author)

  15. Modeling and experimental validation of a Hybridized Energy Storage System for automotive applications

    Science.gov (United States)

    Fiorenti, Simone; Guanetti, Jacopo; Guezennec, Yann; Onori, Simona

    2013-11-01

    This paper presents the development and experimental validation of a dynamic model of a Hybridized Energy Storage System (HESS) consisting of a parallel connection of a lead acid (PbA) battery and double layer capacitors (DLCs), for automotive applications. The dynamic modeling of both the PbA battery and the DLC has been tackled via the equivalent-electric-circuit approach. Experimental tests were designed for identification purposes. Parameters of the PbA battery model are identified as a function of state of charge and current direction, whereas parameters of the DLC model are identified for different temperatures. A physical HESS has been assembled at the Center for Automotive Research, The Ohio State University, and used as a test-bench to validate the model against a typical current profile generated for Start&Stop applications. The HESS model is then integrated into a vehicle simulator to assess the effects of the battery hybridization on the vehicle fuel economy and mitigation of the battery stress.
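The equivalent-electric-circuit approach mentioned above is commonly a first-order model: an open-circuit voltage source in series with an ohmic resistance R0 and one R1C1 branch. A minimal forward-Euler sketch with hypothetical parameter values (not the identified values from the paper):

```python
def simulate_rc_branch(current, dt, r0, r1, c1, ocv, v_rc0=0.0):
    """First-order equivalent-circuit model (OCV + series R0 + one R1C1
    branch): forward-Euler step of the RC-branch voltage, then terminal
    voltage v = ocv - R0*i - v_rc.  Discharge current is positive."""
    v_rc = v_rc0
    terminal = []
    for i in current:
        v_rc += dt * (i / c1 - v_rc / (r1 * c1))
        terminal.append(ocv - r0 * i - v_rc)
    return terminal

# Hypothetical 50 A discharge pulse followed by rest, 0.1 s steps
# (all parameter values are illustrative).
pulse = [50.0] * 50 + [0.0] * 50
v = simulate_rc_branch(pulse, dt=0.1, r0=0.005, r1=0.010, c1=2000.0, ocv=12.6)
print(round(v[0], 3), round(min(v), 3), round(v[-1], 3))
```

The instantaneous R0*i drop followed by the slower RC relaxation during rest is the signature behavior fitted during pulse-based parameter identification.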

  16. Development and validation of outcome prediction models for aneurysmal subarachnoid haemorrhage : The SAHIT multinational cohort study

    NARCIS (Netherlands)

    Jaja, Blessing N R; Saposnik, Gustavo; Lingsma, Hester F.; Macdonald, Erin; Thorpe, Kevin E.; Mamdani, Muhammed; Steyerberg, Ewout W.; Molyneux, Andrew; Manoel, Airton Leonardo De Oliveira; Schatlo, Bawarjan; Hanggi, Daniel; Hasan, David M.; Wong, George K C; Etminan, Nima; Fukuda, Hitoshi; Torner, James C.; Schaller, Karl L.; Suarez, Jose I.; Stienen, Martin N.; Vergouwen, Mervyn D.I.; Rinkel, Gabriel J.E.; Spears, Julian; Cusimano, Michael D.; Todd, Michael; Le Roux, Peter; Kirkpatrick, Peter J.; Pickard, John; Van Den Bergh, Walter M.; Murray, Gordon D; Johnston, S. Claiborne; Yamagata, Sen; Mayer, Stephan A.; Schweizer, Tom A.; Macdonald, R. Loch

    2018-01-01

    Objective To develop and validate a set of practical prediction tools that reliably estimate the outcome of subarachnoid haemorrhage from ruptured intracranial aneurysms (SAH). Design Cohort study with logistic regression analysis to combine predictors and treatment modality. Setting Subarachnoid

  17. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  18. Geochemistry Model Validation Report: Material Degradation and Release Model

    International Nuclear Information System (INIS)

    Stockman, H.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  19. Validation of an immortalized human (hBMEC) in vitro blood-brain barrier model.

    Science.gov (United States)

    Eigenmann, Daniela Elisabeth; Jähne, Evelyn Andrea; Smieško, Martin; Hamburger, Matthias; Oufir, Mouhssin

    2016-03-01

    We recently established and optimized an immortalized human in vitro blood-brain barrier (BBB) model based on the hBMEC cell line. In the present work, we validated this mono-culture 24-well model with a representative series of drug substances which are known to cross or not to cross the BBB. For each individual compound, a quantitative UHPLC-MS/MS method in Ringer HEPES buffer was developed and validated according to current regulatory guidelines, with respect to selectivity, precision, and reliability. Various biological and analytical challenges were met during method validation, highlighting the importance of careful method development. The positive controls antipyrine, caffeine, diazepam, and propranolol showed mean endothelial permeability coefficients (Pe) in the range of 17-70 × 10⁻⁶ cm/s, indicating moderate to high BBB permeability when compared to the barrier integrity marker sodium fluorescein (mean Pe 3-5 × 10⁻⁶ cm/s). The negative controls atenolol, cimetidine, and vinblastine showed mean Pe values < 10 × 10⁻⁶ cm/s, suggesting low permeability. In silico calculations were in agreement with in vitro data. With the exception of quinidine (P-glycoprotein inhibitor and substrate), BBB permeability of all control compounds was correctly predicted by this new, easy, and fast to set up human in vitro BBB model. Addition of retinoic acid and puromycin did not increase transendothelial electrical resistance (TEER) values of the BBB model.
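    The Pe comparison above can be sketched numerically. The following is a minimal illustration, not the authors' code: it applies the standard Transwell correction for the blank filter (1/PSe = 1/PSt − 1/PSf, then Pe = PSe/A) and classifies a compound using the < 10 × 10⁻⁶ cm/s low-permeability cutoff quoted in the abstract; the function names and example inputs are assumptions.

    ```python
    def permeability_coefficient(ps_total, ps_filter, area_cm2):
        """Endothelial permeability coefficient Pe (cm/s).

        ps_total: permeability-surface product (cm^3/s) across cells + filter.
        ps_filter: permeability-surface product (cm^3/s) across a blank filter.
        area_cm2: surface area of the Transwell insert (cm^2).
        Uses 1/PSe = 1/PSt - 1/PSf, then Pe = PSe / A.
        """
        ps_e = 1.0 / (1.0 / ps_total - 1.0 / ps_filter)
        return ps_e / area_cm2

    def classify_bbb_permeability(pe_cm_s, low_cutoff=10e-6):
        """Label a compound by the cutoff used in the abstract (Pe in cm/s)."""
        if pe_cm_s < low_cutoff:
            return "low permeability"
        return "moderate-to-high permeability"
    ```

    With these definitions, a compound measured at 30 × 10⁻⁶ cm/s would be grouped with the positive controls, while one at 5 × 10⁻⁶ cm/s would fall with the negative controls.
    
    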

  20. Application of environmental isotopes to validate a model of regional groundwater flow and transport (Carrizo Aquifer)

    International Nuclear Information System (INIS)

    Pearson, F.J.

    1999-01-01

    It is asserted that models cannot be validated. This seems obvious if one identifies validation as the process of testing a model against absolute truth, and accepts that absolute truth is less a scientific than a philosophic or religious concept. What is here called model validation has a more modest goal - to develop confidence in the conceptual and mathematical models used to describe a groundwater system by illustrating that measured radiochemical properties of the groundwater match those predicted by the model. The system described is the Carrizo sand in the Gulf Coastal Plain of south Texas. Each element of the modelling chain describing the movement of ¹⁴C is confirmed independently and, thus, can be said to be validated. The groundwater ages, and the ¹⁴C measurements and carbonate geochemical model underlying them, are confirmed by the noble gas measurements, while the flow and transport model is confirmed by the ¹⁴C results. Agreement between the modelled and measured ²³⁴U/²³⁸U ratios supports the description of U transport used in the modelling, while the need to use an unexpectedly low K_d value for U raises questions about the applicability of laboratory K_d data to the Carrizo groundwater system. (author)
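    The ¹⁴C groundwater ages in the modelling chain above rest on the radioactive decay law. The sketch below illustrates that calculation only; the dilution factor q (which corrects the measured activity for dead carbon added by carbonate dissolution) is an assumed input here, since deriving it is precisely the job of the carbonate geochemical model the abstract refers to.

    ```python
    import math

    HALF_LIFE_14C = 5730.0                 # years
    DECAY_CONST = math.log(2) / HALF_LIFE_14C

    def groundwater_age(a_measured_pmc, a_initial_pmc=100.0, q=0.85):
        """Corrected 14C groundwater age in years.

        a_measured_pmc: measured 14C activity (percent modern carbon).
        a_initial_pmc: initial activity of recharge water (pMC).
        q: dilution factor from the carbonate geochemical correction
           (illustrative value; site-specific in practice).
        Solves A = q * A0 * exp(-lambda * t) for t.
        """
        return math.log(q * a_initial_pmc / a_measured_pmc) / DECAY_CONST
    ```

    For example, a sample at 42.5 pMC with q = 0.85 gives exactly one half-life (q·A0/A = 2), i.e. about 5730 years; without the dilution correction the same sample would appear roughly 1300 years older.
    
    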