WorldWideScience

Sample records for ii model validation

  1. Large-scale Validation of AMIP II Land-surface Simulations: Preliminary Results for Ten Models

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, T J; Henderson-Sellers, A; Irannejad, P; McGuffie, K; Zhang, H

    2005-12-01

    This report summarizes initial findings of a large-scale validation of the land-surface simulations of ten atmospheric general circulation models that are entries in phase II of the Atmospheric Model Intercomparison Project (AMIP II). This validation is conducted by AMIP Diagnostic Subproject 12 on Land-surface Processes and Parameterizations, which is focusing on putative relationships between the continental climate simulations and the associated models' land-surface schemes. The selected models typify the diversity of representations of land-surface climate that are currently implemented by the global modeling community. The current dearth of global-scale terrestrial observations makes exacting validation of AMIP II continental simulations impractical. Thus, selected land-surface processes of the models are compared with several alternative validation data sets, which include merged in-situ/satellite products, climate reanalyses, and off-line simulations of land-surface schemes that are driven by observed forcings. The aggregated spatio-temporal differences between each simulated process and a chosen reference data set are then quantified by means of root-mean-square error statistics; the differences among alternative validation data sets are similarly quantified as an estimate of the current observational uncertainty in the selected land-surface process. Examples of these metrics are displayed for land-surface air temperature, precipitation, and the latent and sensible heat fluxes. It is found that the simulations of surface air temperature, when aggregated over all land and seasons, agree most closely with the chosen reference data, while the simulations of precipitation agree least. In the latter case, there is also considerable inter-model scatter in the error statistics, with the reanalysis estimates of precipitation resembling the AMIP II simulations more closely than the chosen reference data. In aggregate, the simulations of land-surface latent and
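
    The aggregated error metric described above lends itself to a compact illustration. Below is a minimal numpy sketch, with invented data and weights, of computing an area-weighted RMSE between a simulated land-surface field and a reference data set, and of using the RMSE between two alternative validation data sets as a proxy for observational uncertainty:

    ```python
    import numpy as np

    def rmse(field, reference, weights):
        """Area-weighted root-mean-square error between a simulated
        land-surface field and a reference data set (time, lat, lon)."""
        diff2 = (field - reference) ** 2
        return np.sqrt(np.average(diff2, weights=np.broadcast_to(weights, diff2.shape)))

    rng = np.random.default_rng(0)
    ref = rng.normal(288.0, 5.0, (12, 4, 8))      # reference air temperature (K), 12 months
    sim = ref + rng.normal(0.5, 1.0, ref.shape)   # one model's simulation
    alt = ref + rng.normal(0.0, 0.8, ref.shape)   # an alternative validation data set
    w = np.cos(np.deg2rad(np.linspace(-60.0, 60.0, 4)))[:, None]  # latitude area weights

    print("model-vs-reference RMSE:", rmse(sim, ref, w))
    # RMSE between alternative data sets estimates the observational uncertainty
    print("observational uncertainty:", rmse(alt, ref, w))
    ```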

  2. Testing the Predictive Validity of the Hendrich II Fall Risk Model.

    Science.gov (United States)

    Jung, Hyesil; Park, Hyeoun-Ae

    2018-03-01

    Cumulative data on patient fall risk have been compiled in electronic medical records systems, making it possible to test the validity of fall-risk assessment tools using data recorded between admission and the occurrence of a fall. Hendrich II Fall Risk Model scores assessed at three time points of the hospital stay were extracted and used to test predictive validity: (a) the score upon admission, (b) the maximum score between admission and falling or discharge, and (c) the score immediately before falling or discharge. Predictive validity was examined using seven predictive indicators. In addition, logistic regression analysis was used to identify factors that significantly affect the occurrence of a fall. Among the different time points, the maximum fall-risk score assessed between admission and falling or discharge showed the best predictive performance. Confusion or disorientation and having a poor ability to rise from a sitting position were significant risk factors for a fall.
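
    The seven predictive indicators are not enumerated in the abstract, so the sketch below computes a plausible set of such validity indicators from a 2x2 classification table; all counts are hypothetical:

    ```python
    def predictive_indicators(tp, fp, fn, tn):
        """Predictive-validity indicators for a fall-risk tool dichotomized
        at a cut-off (e.g. a Hendrich II score at or above some threshold)."""
        sens = tp / (tp + fn)                    # sensitivity
        spec = tn / (tn + fp)                    # specificity
        ppv = tp / (tp + fp)                     # positive predictive value
        npv = tn / (tn + fn)                     # negative predictive value
        acc = (tp + tn) / (tp + fp + fn + tn)    # overall accuracy
        youden = sens + spec - 1                 # Youden's J
        lr_plus = sens / (1 - spec)              # positive likelihood ratio
        return dict(sensitivity=sens, specificity=spec, ppv=ppv, npv=npv,
                    accuracy=acc, youden_j=youden, lr_plus=lr_plus)

    # hypothetical cohort: 30 fallers and 470 non-fallers
    print(predictive_indicators(tp=25, fp=220, fn=5, tn=250))
    ```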

  3. SASSYS validation with the EBR-II shutdown heat removal tests

    International Nuclear Information System (INIS)

    Herzog, J.P.

    1989-01-01

    SASSYS is a coupled neutronic and thermal hydraulic code developed for the analysis of transients in liquid metal cooled reactors (LMRs). The code is especially suited for evaluating normal reactor transients -- both protected (design basis) and unprotected (anticipated transient without scram). Because SASSYS is heavily used in support of the IFR concept and of innovative LMR designs, such as PRISM, a strong validation base for the code must exist. Part of the validation process for SASSYS is analysis of experiments performed on operating reactors, such as the metal fueled Experimental Breeder Reactor-II (EBR-II). During the course of a series of historic whole-plant experiments, EBR-II illustrated key safety features of metal fueled LMRs. These experiments, the Shutdown Heat Removal Tests (SHRT), culminated in unprotected loss of flow and loss of heat sink transients from full power and flow. Analysis of these and earlier SHRT experiments constitutes a vital part of SASSYS validation, because it facilitates scrutiny of specific SASSYS models and of integrated code capability. 12 refs., 11 figs

  4. Predictive validity of the Hendrich fall risk model II in an acute geriatric unit.

    Science.gov (United States)

    Ivziku, Dhurata; Matarese, Maria; Pedone, Claudio

    2011-04-01

    Falls are the most common adverse events reported in acute care hospitals, and older patients are the most likely to fall. The risk of falling cannot be completely eliminated, but it can be reduced through the implementation of a fall prevention program. A major evidence-based intervention to prevent falls has been the use of fall-risk assessment tools. Many tools have been developed in recent years, but most instruments have not been investigated regarding reliability, validity and clinical usefulness. This study evaluates the predictive validity and inter-rater reliability of the Hendrich fall risk model II (HFRM II) in order to identify older patients at risk of falling in geriatric units and recommend its use in clinical practice. A prospective descriptive design was used. The study was carried out in a geriatric acute care unit of an Italian University hospital. All patients over 65 years of age consecutively admitted over an 8-month period were enrolled. Enrolled patients were screened for fall risk by nurses with the HFRM II within 24 h of admission. Falls occurring during the hospital stay were registered. Inter-rater reliability, area under the ROC curve, sensitivity, specificity, positive and negative predictive values and administration time were evaluated. 179 elderly patients were included. The inter-rater reliability was 0.87 (95% CI 0.71-1.00). The administration time was about 1 min. The most frequently reported risk factors were depression, incontinence, and vertigo. Sensitivity and specificity were 86% and 43%, respectively. The optimal cut-off score for screening at-risk patients was 5, with an area under the ROC curve of 0.72. The risk factors most strongly associated with falls were confusion and depression. As falls of older patients are a common problem in acute care settings, it is necessary that nurses use specific, validated and reliable
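
    As a hedged illustration of how the reported cut-off, sensitivity/specificity pair and AUC can be derived from ROC analysis, the following sketch runs scikit-learn on invented HFRM II scores rather than the study data:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(1)
    fell = rng.random(179) < 0.10                 # ~10% of 179 patients fall
    scores = np.clip(rng.normal(4.0 + 3.0 * fell, 2.5).round(), 0, 16)  # HFRM II scores

    auc = roc_auc_score(fell, scores)
    fpr, tpr, thresholds = roc_curve(fell, scores)
    best = np.argmax(tpr - fpr)                   # maximize Youden's J
    print(f"AUC = {auc:.2f}, optimal cut-off = {thresholds[best]:.0f}")
    print(f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
    ```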

  5. Validating the standard for the National Board Dental Examination Part II.

    Science.gov (United States)

    Tsai, Tsung-Hsun; Neumann, Laura M; Littlefield, John H

    2012-05-01

    As part of the overall exam validation process, the Joint Commission on National Dental Examinations periodically reviews and validates the pass/fail standard for the National Board Dental Examination (NBDE), Parts I and II. The most recent standard-setting activities for NBDE Part II used the Objective Standard Setting method. This report describes the process used to set the pass/fail standard for the 2009 exam. The failure rate on the NBDE Part II increased from 5.3 percent in 2008 to 13.7 percent in 2009 and then decreased to 10 percent in 2010. This article describes the Objective Standard Setting method and presents the estimated probabilities of classification errors based on the beta binomial mathematical model. The results show that the probability of correct classifications of candidate performance is very high (0.97) and that probabilities of false negative and false positive errors are very small (0.03 and <0.001, respectively). The low probability of classification errors supports the conclusion that the pass/fail score on the NBDE Part II is a valid guide for making decisions about candidates for dental licensure.
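
    The classification-error probabilities quoted above follow from a beta-binomial argument: candidate ability is modelled with a beta distribution, and the observed score is binomial given ability. A scipy sketch with assumed parameters (the item count, cut score and beta parameters are illustrative, not the NBDE values):

    ```python
    import numpy as np
    from scipy import stats

    n_items, cut = 200, 130   # assumed exam length and raw pass/fail cut score
    p0 = cut / n_items        # minimum true proficiency counted as "passing"
    a, b = 18.0, 7.0          # assumed beta parameters of the ability distribution

    p = np.linspace(1e-6, 1 - 1e-6, 4000)
    dp = p[1] - p[0]
    dens = stats.beta.pdf(p, a, b)
    p_fail = stats.binom.cdf(cut - 1, n_items, p)   # P(observed fail | ability p)

    false_neg = np.sum(dens * p_fail * (p >= p0)) * dp        # able candidate fails
    false_pos = np.sum(dens * (1 - p_fail) * (p < p0)) * dp   # unable candidate passes
    print(f"correct: {1 - false_neg - false_pos:.2f}, "
          f"false negative: {false_neg:.3f}, false positive: {false_pos:.4f}")
    ```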

  6. Contact Modelling in Resistance Welding, Part II: Experimental Validation

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Contact algorithms in resistance welding presented in the previous paper are experimentally validated in the present paper. In order to verify the mechanical contact algorithm, two types of experiments, i.e. sandwich upsetting of circular, cylindrical specimens and compression tests of discs with a solid ring projection towards a flat ring, are carried out at room temperature. The complete algorithm, involving not only the mechanical model but also the thermal and electrical models, is validated by projection welding experiments. The experimental results are in satisfactory agreement...

  7. The Portuguese long version of the Copenhagen Psychosocial Questionnaire II (COPSOQ II) - a validation study.

    Science.gov (United States)

    Rosário, Susel; Azevedo, Luís F; Fonseca, João A; Nienhaus, Albert; Nübling, Matthias; da Costa, José Torres

    2017-01-01

    Psychosocial risks are now widely recognised as one of the biggest challenges for occupational safety and health (OSH) and a major public health concern. The aim of this paper is to investigate the Portuguese long version of the Copenhagen Psychosocial Questionnaire II (COPSOQ II), in order to analyse the psychometric properties of the instrument and to validate it. The Portuguese COPSOQ II was issued to a total of 745 Portuguese employees from both private and public organisations across several economic sectors at baseline and then 2 weeks later. Methodological quality appraisal was based on COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) recommendations. An analysis of the psychometric properties of the long version of COPSOQ II (internal consistency, intraclass correlation coefficient, floor and ceiling effects, response rate, missing values, mean and standard deviation, exploratory factor analysis) was performed to determine the validity and reliability of the instrument. The COPSOQ II had a response rate of 60.6% (test) and a follow-up response rate of 59.5% (retest). In general, the Cronbach's alpha values of the COPSOQ scales (test and retest) were above the conventional threshold of 0.70. The test-retest reliability estimated by the intraclass correlation coefficient (ICC) showed high reliability, above the conventional 0.7, for all but eight scales. The proportion of missing values was less than 1.3%, except for two scales. The average scores and standard deviations showed similar results to the original Danish study, except for eight scales. All of the scales had low floor and ceiling effects, with one exception. Overall, the exploratory factor analysis presented good results in 27 scales assuming a reflective measurement model. The hypothesized factor structure under a reflective model was not supported in 14 scales and for some but not all of these scales the explanation may be a formative
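
    Cronbach's alpha, the reliability coefficient used throughout such validation studies, is straightforward to compute from an item-score matrix; a minimal sketch on simulated responses (not the COPSOQ data):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, n_items) matrix of scores on one scale."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1.0) * (1.0 - item_var / total_var)

    rng = np.random.default_rng(2)
    trait = rng.normal(size=745)                             # latent score per respondent
    items = trait[:, None] + rng.normal(0.0, 0.8, (745, 4))  # four items on one scale
    print(f"alpha = {cronbach_alpha(items):.2f}")            # should exceed 0.70
    ```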

  8. Efficient Integration, Validation and Troubleshooting in Multimodal Distributed Diagnostic Schemes, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In general, development and validation of diagnostic models for complex safety critical systems are time and cost intensive jobs. The proposed Phase-II effort will...

  9. Building and validating a prediction model for paediatric type 1 diabetes risk using next generation targeted sequencing of class II HLA genes.

    Science.gov (United States)

    Zhao, Lue Ping; Carlsson, Annelie; Larsson, Helena Elding; Forsander, Gun; Ivarsson, Sten A; Kockum, Ingrid; Ludvigsson, Johnny; Marcus, Claude; Persson, Martina; Samuelsson, Ulf; Örtqvist, Eva; Pyo, Chul-Woo; Bolouri, Hamid; Zhao, Michael; Nelson, Wyatt C; Geraghty, Daniel E; Lernmark, Åke

    2017-11-01

    It is of interest to predict possible lifetime risk of type 1 diabetes (T1D) in young children for recruiting high-risk subjects into longitudinal studies of effective prevention strategies. Utilizing a case-control study in Sweden, we applied a recently developed next generation targeted sequencing technology to genotype class II genes and applied an object-oriented regression to build and validate a prediction model for T1D. In the training set, estimated risk scores were significantly different between patients and controls (P = 8.12 × 10^-92), and the area under the curve (AUC) from the receiver operating characteristic (ROC) analysis was 0.917. Using the validation data set, we validated the result with AUC of 0.886. Combining both training and validation data resulted in a predictive model with AUC of 0.903. Further, we performed a "biological validation" by correlating risk scores with 6 islet autoantibodies, and found that the risk score was significantly correlated with IA-2A (Z-score = 3.628, P < 0.001). When applying this prediction model to the Swedish population, where the lifetime T1D risk ranges from 0.5% to 2%, we anticipate identifying approximately 20 000 high-risk subjects after testing all newborns, and this calculation would identify approximately 80% of all patients expected to develop T1D in their lifetime. Through both empirical and biological validation, we have established a prediction model for estimating lifetime T1D risk, using class II HLA. This prediction model should prove useful for future investigations to identify high-risk subjects for prevention research in high-risk populations. Copyright © 2017 John Wiley & Sons, Ltd.
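
    The train/validate AUC workflow in the abstract can be sketched as follows. Plain logistic regression stands in for the paper's object-oriented regression, and the genotype matrix is entirely synthetic:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    X = rng.integers(0, 2, size=(3000, 12)).astype(float)  # toy HLA allele indicators
    beta = rng.normal(0.0, 1.2, 12)                        # illustrative effect sizes
    y = rng.random(3000) < 1.0 / (1.0 + np.exp(-(X @ beta - 1.0)))  # case/control labels

    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("training AUC:  ", roc_auc_score(y_tr, model.predict_proba(X_tr)[:, 1]))
    print("validation AUC:", roc_auc_score(y_va, model.predict_proba(X_va)[:, 1]))
    ```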

  10. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  11. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  12. Tools for system validation. Dynamic modelling of the direct condenser at Sandvik II in Vaexjoe; Hjaelpmedel foer systemvalidering. Dynamisk modellering av direktkondensorn paa Sandvik II i Vaexjoe

    Energy Technology Data Exchange (ETDEWEB)

    Raaberg, Martin [Dynasim AB, Lund (Sweden); Tuszynski, Jan [Sycon Energikonsult AB, Malmoe (Sweden)

    2002-04-01

    The project reported here aimed to test the suitability of existing computer tools for modelling of energy processes. The suggested use for the models is at the early tests and validations of new, refurbished or modernised thermal plants. The technique presented in this report should be applicable for clarification of the scope of delivery and testing for both the process and the control system. The validation process can thus be simplified, allowing risk reduction and predictability of the commissioning. The main delays and economic losses often occur during commissioning. This report should prove the feasibility of the purchase routines where purchaser, vendor and quality inspection will use a common model of the process to validate system requirements and specifications. Later on it is used to validate structure and predefine testing. Thanks to agreement on the common model, early tests can be conducted on complex systems, minimizing the investment risks. The modelling reported here concerns the direct condenser at Sandvik II, a power and heating plant owned by Vaexjoe Energi AB in Sweden. We have chosen the direct condenser because it is an existing, well-documented and well-defined subsystem of high complexity in both structure and operation. Heavy transients made commissioning and test runs of similar condensers throughout Sweden costly and troublesome. The work resulted in an open, general, and physically correct model. The model can easily be re-dimensioned through physical parameters of common use. The control system modelled corresponds to the actual control system at the Sandvik II plant. Any improvement or deep validation of the controllers was not included in this work. The suitability is shown through four simulation cases. Three cases are based on a registered plant operation during a turbine trip. The first test case uses present plant data, the second an old steam valve actuator and the third uses the old actuator and an error in level

  13. Validation of the Essentials of Magnetism II in Chinese critical care settings.

    Science.gov (United States)

    Bai, Jinbing; Hsu, Lily; Zhang, Qing

    2015-05-01

    To translate and evaluate the psychometric properties of the Essentials of Magnetism II tool (EOM II) for Chinese nurses in critical care settings. The EOM II is a reliable and valid scale for measuring the healthy work environment (HWE) for nurses in Western countries; however, it has not been validated among Chinese nurses. The translation of the EOM II followed internationally recognized guidelines. The Chinese version of the Essentials of Magnetism II tool (C-EOM II) was reviewed by an expert panel for culturally semantic equivalence and content validity. Then, 706 nurses from 28 intensive care units (ICUs) affiliated with 14 tertiary hospitals participated in this study. The reliability of the C-EOM II was assessed using the Cronbach's alpha coefficient; the content validity of this scale was assessed using the content validity index (CVI); and the construct validity was assessed using confirmatory factor analysis (CFA). The C-EOM II showed excellent content validity with a CVI of 0.92. All the subscales of the C-EOM II were significantly correlated with overall nurse job satisfaction and nurse-assessed quality of care. The CFA showed that the C-EOM II was composed of 45 items with nine factors, accounting for 46.51% of the total variance. Cronbach's alpha coefficients for these factors ranged from 0.56 to 0.89. The C-EOM II is a promising scale to assess the HWE for Chinese ICU nurses. Nursing administrators and health care policy-makers can use the C-EOM II to evaluate the clinical work environment so that a healthier work environment can be created and sustained for staff nurses. © 2013 British Association of Critical Care Nurses.

  14. Validated Competing Event Model for the Stage I-II Endometrial Cancer Population

    Energy Technology Data Exchange (ETDEWEB)

    Carmona, Ruben; Gulaya, Sachin; Murphy, James D. [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States); Rose, Brent S. [Harvard Radiation Oncology Program, Harvard Medical School, Boston, Massachusetts (United States); Wu, John; Noticewala, Sonal [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States); McHale, Michael T. [Department of Reproductive Medicine, Division of Gynecologic Oncology, University of California San Diego, La Jolla, California (United States); Yashar, Catheryn M. [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States); Vaida, Florin [Department of Family and Preventive Medicine, Biostatistics and Bioinformatics, University of California San Diego Medical Center, San Diego, California (United States); Mell, Loren K., E-mail: lmell@ucsd.edu [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States)

    2014-07-15

    Purpose/Objective(s): Early-stage endometrial cancer patients are at higher risk of noncancer mortality than of cancer mortality. Competing event models incorporating comorbidity could help identify women most likely to benefit from treatment intensification. Methods and Materials: 67,397 women with stage I-II endometrioid adenocarcinoma after total hysterectomy diagnosed from 1988 to 2009 were identified in Surveillance, Epidemiology, and End Results (SEER) and linked SEER-Medicare databases. Using demographic and clinical information, including comorbidity, we sought to develop and validate a risk score to predict the incidence of competing mortality. Results: In the validation cohort, increasing competing mortality risk score was associated with increased risk of noncancer mortality (subdistribution hazard ratio [SDHR], 1.92; 95% confidence interval [CI], 1.60-2.30) and decreased risk of endometrial cancer mortality (SDHR, 0.61; 95% CI, 0.55-0.78). Controlling for other variables, Charlson Comorbidity Index (CCI) = 1 (SDHR, 1.62; 95% CI, 1.45-1.82) and CCI >1 (SDHR, 3.31; 95% CI, 2.74-4.01) were associated with increased risk of noncancer mortality. The 10-year cumulative incidences of competing mortality within low-, medium-, and high-risk strata were 27.3% (95% CI, 25.2%-29.4%), 34.6% (95% CI, 32.5%-36.7%), and 50.3% (95% CI, 48.2%-52.6%), respectively. With increasing competing mortality risk score, we observed a significant decline in omega (ω), indicating a diminishing likelihood of benefit from treatment intensification. Conclusion: Comorbidity and other factors influence the risk of competing mortality among patients with early-stage endometrial cancer. Competing event models could improve our ability to identify patients likely to benefit from treatment intensification.
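
    The 10-year cumulative incidences above come from competing-risks methodology, where the naive Kaplan-Meier complement would overstate each cause-specific risk. A self-contained sketch of an Aalen-Johansen-type cumulative incidence estimator on simulated data (not the SEER cohort):

    ```python
    import numpy as np

    def cumulative_incidence(time, event, cause, grid):
        """Cumulative incidence of one cause in the presence of competing
        events; event codes: 0 = censored, 1..K = cause of failure."""
        order = np.argsort(time)
        time, event = time[order], event[order]
        n = len(time)
        surv, cif, out, j = 1.0, 0.0, [], 0
        for t in grid:
            while j < n and time[j] <= t:
                at_risk = n - j
                if event[j] == cause:
                    cif += surv / at_risk        # increment by S(t-) * dN / Y
                if event[j] != 0:
                    surv *= 1.0 - 1.0 / at_risk  # overall survival drops on any event
                j += 1
            out.append(cif)
        return np.array(out)

    rng = np.random.default_rng(4)
    t_cancer = rng.exponential(25.0, 2000)   # years to cancer death
    t_other = rng.exponential(12.0, 2000)    # years to competing (noncancer) death
    t_cens = rng.uniform(0.0, 10.0, 2000)    # administrative censoring
    t = np.minimum.reduce([t_cancer, t_other, t_cens])
    ev = np.select([t == t_cens, t == t_other], [0, 2], default=1)
    cif10 = cumulative_incidence(t, ev, cause=2, grid=np.linspace(0.0, 10.0, 11))
    print("10-year cumulative incidence of competing mortality:", round(cif10[-1], 3))
    ```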

  15. Validated Competing Event Model for the Stage I-II Endometrial Cancer Population

    International Nuclear Information System (INIS)

    Carmona, Ruben; Gulaya, Sachin; Murphy, James D.; Rose, Brent S.; Wu, John; Noticewala, Sonal; McHale, Michael T.; Yashar, Catheryn M.; Vaida, Florin; Mell, Loren K.

    2014-01-01

    Purpose/Objective(s): Early-stage endometrial cancer patients are at higher risk of noncancer mortality than of cancer mortality. Competing event models incorporating comorbidity could help identify women most likely to benefit from treatment intensification. Methods and Materials: 67,397 women with stage I-II endometrioid adenocarcinoma after total hysterectomy diagnosed from 1988 to 2009 were identified in Surveillance, Epidemiology, and End Results (SEER) and linked SEER-Medicare databases. Using demographic and clinical information, including comorbidity, we sought to develop and validate a risk score to predict the incidence of competing mortality. Results: In the validation cohort, increasing competing mortality risk score was associated with increased risk of noncancer mortality (subdistribution hazard ratio [SDHR], 1.92; 95% confidence interval [CI], 1.60-2.30) and decreased risk of endometrial cancer mortality (SDHR, 0.61; 95% CI, 0.55-0.78). Controlling for other variables, Charlson Comorbidity Index (CCI) = 1 (SDHR, 1.62; 95% CI, 1.45-1.82) and CCI >1 (SDHR, 3.31; 95% CI, 2.74-4.01) were associated with increased risk of noncancer mortality. The 10-year cumulative incidences of competing mortality within low-, medium-, and high-risk strata were 27.3% (95% CI, 25.2%-29.4%), 34.6% (95% CI, 32.5%-36.7%), and 50.3% (95% CI, 48.2%-52.6%), respectively. With increasing competing mortality risk score, we observed a significant decline in omega (ω), indicating a diminishing likelihood of benefit from treatment intensification. Conclusion: Comorbidity and other factors influence the risk of competing mortality among patients with early-stage endometrial cancer. Competing event models could improve our ability to identify patients likely to benefit from treatment intensification.

  16. Validation of the CAR II model for Flanders, Belgium; Validatie van het model CAR II voor Vlaanderen

    Energy Technology Data Exchange (ETDEWEB)

    Marien, S.; Celis, D.; Roekens, E.

    2013-04-15

    In Flanders, Belgium, the CAR model (Calculation of Air pollution from Road traffic) for air quality along urban roads was recently extensively validated for NO2. More clarity has been gained about the quality and accuracy of this model.

  17. Validation of the Parental Facilitation of Mastery Scale-II.

    Science.gov (United States)

    Zalta, Alyson K; Allred, Kelly M; Jayawickreme, Eranda; Blackie, Laura E R; Chambless, Dianne L

    2017-10-01

    To develop a more reliable and comprehensive version of the Parental Facilitation of Mastery Scale (PFMS). METHOD: In Study 1, 387 undergraduates completed an expanded PFMS (PFMS-II) and measures of parenting, perceived control, responses to early life challenges, and psychopathology. In Study 2, 182 trauma-exposed community participants completed the PFMS-II and measures of perceived control, psychopathology, and well-being. RESULTS: In Study 1, exploratory factor analysis of the PFMS-II revealed two factors. These factors replicated in Study 2; one item was removed to achieve measurement invariance across race. The final PFMS-II comprised a 10-item overprotection scale and a 7-item challenge scale. In both samples, this measure demonstrated good convergent and discriminant validity and was more reliable than the original PFMS. Parental challenge was a unique predictor of perceived control in both samples. CONCLUSION: The PFMS-II is a valid measure of important parenting behaviors not fully captured in other measures. © 2016 Wiley Periodicals, Inc.

  18. The Danish Barriers Questionnaire-II: preliminary validation in cancer pain patients

    DEFF Research Database (Denmark)

    Jacobsen, Ramune; Møldrup, Claus; Christrup, Lona Louring

    2009-01-01

    OBJECTIVE: The objective of this study was to examine the psychometric properties of the Danish version of the Barriers Questionnaire-II (DBQ-II). METHODS: The validated Norwegian version of the DBQ-II was translated into Danish. Cancer patients for the study were recruited from specialized pain...... cancer pain management. Scale two, Immune System, consisted of three items addressing the belief that pain medications harm the immune system. Scale three, Monitor, consisted of three items addressing the fear that pain medicine masks changes in one's body. Scale four, Communication, consisted of five......: The DBQ-II seems to be a reliable and valid measure of the barriers to pain management among Danish cancer patients....

  19. Labour anxiety questionnaire (KLP II)- revised-the construction and psychological validation

    OpenAIRE

    Putyński, Leszek; Paciorek, Mariusz

    2008-01-01

    The self-report Labour Anxiety Questionnaire (KLP II) was developed to assess the level of labour anxiety in pregnant women. This short tool consists of 9 items, which cover attitudes toward labour and fear of labour. The questionnaire was validated on 53 pregnant women. The results of the study indicate that the Labour Anxiety Questionnaire (KLP II) is a reliable and valid method to identify pregnant women with a high level of labour anxiety.

  20. Social anxiety and fear of negative evaluation: construct validity of the BFNE-II.

    Science.gov (United States)

    Carleton, R Nicholas; Collimore, Kelsey C; Asmundson, Gordon J G

    2007-01-01

    disorder. Psychological Assessment, 17, 179-190]; however [Carleton, R. N., McCreary, D., Norton, P. J., & Asmundson, G. J. G. (in press-a). The Brief Fear of Negative Evaluation Scale, Revised. Depression & Anxiety; Collins, K. A., Westra, H. A., Dozois, D. J. A., & Stewart, S. H. (2005). The validity of the brief version of the fear of negative evaluation scale. Journal of Anxiety Disorders, 19, 345-359] recommend that these items be reworded to maintain scale sensitivity. The present study examined the reliability and validity of the BFNE-II, a version of the BFNE evaluating revisions of the reverse-worded items in a community sample. A unitary model of the BFNE-II resulted in excellent confirmatory factor analysis fit indices. Moderate convergent and discriminant validity were found when BFNE-II items were correlated with additional independent measures of social anxiety [i.e., Social Interaction Anxiety & Social Phobia Scales; Mattick, R. P., & Clarke, J. C. (1998). Development and validation of measures of social phobia scrutiny fear and social interaction anxiety. Behaviour Research and Therapy, 36, 455-470], and fear [i.e., Anxiety Sensitivity Index; Reiss, S., & McNally, R. J. (1985). The expectancy model of fear. In S. Reiss, R. R. Bootzin (Eds.), Theoretical issues in behaviour therapy (pp. 107-121). New York: Academic Press; and the Illness/Injury Sensitivity Index; Carleton, R. N., Park, I., & Asmundson, G. J. G. (in press-b). The Illness/Injury Sensitivity Index: an examination of construct validity. Depression & Anxiety]. These findings support the utility of the revised items and the validity of the BFNE-II as a measure of the fear of negative evaluation. Implications and future research directions are discussed.

  1. Comparison of mortality prediction models and validation of SAPS II in critically ill burns patients.

    Science.gov (United States)

    Pantet, O; Faouzi, M; Brusselaers, N; Vernay, A; Berger, M M

    2016-06-30

    Specific burn outcome prediction scores such as the Abbreviated Burn Severity Index (ABSI), Ryan, Belgian Outcome of Burn Injury (BOBI) and revised Baux scores have been extensively studied. Validation studies of the critical care score SAPS II (Simplified Acute Physiology Score) have included burns patients but not addressed them as a cohort. The study aimed to compare their performance in a Swiss burns intensive care unit (ICU) and to observe whether they were affected by a standardized definition of inhalation injury. We conducted a retrospective cohort study, including all consecutive ICU burn admissions (n=492) between 1996 and 2013; 5 epochs were defined by protocol changes. Variables required for SAPS II calculation were collected, along with total body surface area burned (TBSA) and inhalation injury (systematic standardized diagnosis since 2006). Study epochs were compared (χ2 test, ANOVA). Score performance was assessed by receiver operating characteristic curve analysis. SAPS II performed well (AUC 0.89), particularly in burns <40% TBSA. Ryan and BOBI scores were least accurate, as they heavily weight inhalation injury.
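
    For context, a SAPS II score is conventionally converted into a predicted hospital mortality through a logistic transformation; the sketch below uses the coefficients as commonly cited from the original Le Gall et al. (1993) publication, which should be checked against the primary source before any real use:

    ```python
    import math

    def saps2_mortality(score):
        """Predicted hospital mortality from a SAPS II score (coefficients
        as commonly cited; verify against Le Gall et al., JAMA 1993)."""
        logit = -7.7631 + 0.0737 * score + 0.9971 * math.log(score + 1)
        return 1.0 / (1.0 + math.exp(-logit))

    for s in (20, 40, 60):
        print(f"SAPS II = {s:2d} -> predicted mortality {saps2_mortality(s):.1%}")
    ```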

  2. Validity of the Mania Subscale of the Diagnostic Assessment for the Severely Handicapped-II (DASH-II).

    Science.gov (United States)

    Matson, Johnny L.; Smiroldo, Brandi B.

    1997-01-01

    A study tested the validity of the Diagnostic Assessment for the Severely Handicapped-II (DASH-II) for determining the presence of mania (bipolar disorder) in 22 individuals with severe mental retardation. Results found the mania subscale to be internally consistent and able to be used to classify manic and control subjects accurately. (Author/CR)

  3. Validation and comparison of dispersion models of RTARC DSS

    International Nuclear Information System (INIS)

    Duran, J.; Pospisil, M.

    2004-01-01

    RTARC DSS (Real Time Accident Release Consequences - Decision Support System) is a computer code developed at the VUJE Trnava, Inc. (Stubna, M. et al, 1993). The code calculations include atmospheric transport and diffusion, dose assessment, evaluation and displaying of the affected zones, evaluation of the early health effects, concentration and dose rate time dependence in the selected sites etc. The simulation of the protective measures (sheltering, iodine administration) is involved. The aim of this paper is to present the process of validation of the RTARC dispersion models. RTARC includes models for the calculation of releases at very short (Method Monte Carlo - MEMOC), short (Gaussian Straight-Line Model) and long distances (Puff Trajectory Model - PTM). Validation of the code RTARC was performed using the results of the comparisons and experiments summarized in Table 1 (experiment or comparison - distance range - model): wind tunnel experiments (Universitaet der Bundeswehr, Muenchen) - area of the NPP - Method Monte Carlo; INEL (Idaho National Engineering Laboratory) multi-tracer atmospheric experiment - short/medium distances - Gaussian model and PTM; Model Validation Kit - short distances - Gaussian model; STEP II.b 'Realistic Case Studies' - long distances - PTM; and the ENSEMBLE comparison - long distances - PTM. (orig.)
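
    The short-distance component named above is a Gaussian straight-line plume. A minimal sketch of the standard ground-reflected form (the dispersion parameters sigma_y and sigma_z are passed in directly here; in practice they come from stability-class correlations at the downwind distance of interest):

    ```python
    import numpy as np

    def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
        """Gaussian straight-line plume concentration with ground reflection.
        q: release rate (Bq/s), u: wind speed (m/s), h: effective release
        height (m), y/z: crosswind and vertical receptor coordinates (m)."""
        lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
        vertical = (np.exp(-(z - h)**2 / (2.0 * sigma_z**2))
                    + np.exp(-(z + h)**2 / (2.0 * sigma_z**2)))  # image source term
        return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # centreline ground-level air concentration downwind of a 50 m stack
    print(gaussian_plume(q=1e9, u=5.0, y=0.0, z=0.0, h=50.0,
                         sigma_y=80.0, sigma_z=40.0), "Bq/m^3")
    ```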

  4. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation...
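
    A concrete special case of the finiteness issue: with CRRA utility and a deterministic consumption trend C_t = C_0·γ^t, lifetime utility is a geometric series that converges iff β·γ^(1-σ) < 1. A small numeric check (parameter values are illustrative; the paper's sufficient conditions are more general):

    ```python
    import numpy as np

    beta, gamma, sigma = 0.99, 1.005, 2.0  # discount factor, gross trend growth, CRRA
    # Lifetime utility sum_t beta^t * (C_0 * gamma^t)^(1-sigma) / (1-sigma)
    # is proportional to a geometric series with ratio beta * gamma^(1-sigma).
    ratio = beta * gamma ** (1.0 - sigma)
    print("finite objective:", ratio < 1.0, f"(ratio = {ratio:.4f})")

    t = np.arange(2000)
    partial_sums = np.cumsum(beta**t * gamma**(t * (1.0 - sigma)))
    print("partial sum:", partial_sums[-1], " geometric limit:", 1.0 / (1.0 - ratio))
    ```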

  5. OC5 Project Phase II: Validation of Global Loads of the DeepCwind Floating Semisubmersible Wind Turbine

    DEFF Research Database (Denmark)

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.

    2017-01-01

    This paper summarizes the findings from Phase II of the Offshore Code Comparison, Collaboration, Continued, with Correlation project. The project is run under the International Energy Agency Wind Research Task 30, and is focused on validating the tools used for modeling offshore wind systems thro...

  6. Cosmological parameter uncertainties from SALT-II type Ia supernova light curve models

    International Nuclear Information System (INIS)

    Mosher, J.; Sako, M.; Guy, J.; Astier, P.; Betoule, M.; El-Hage, P.; Pain, R.; Regnault, N.; Kessler, R.; Frieman, J. A.; Marriner, J.; Biswas, R.; Kuhlmann, S.; Schneider, D. P.

    2014-01-01

    We use simulated type Ia supernova (SN Ia) samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and a bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: ∼120 low-redshift (z < 0.1) SNe Ia, ∼255 Sloan Digital Sky Survey SNe Ia (z < 0.4), and ∼290 SNLS SNe Ia (z ≤ 1). To probe systematic uncertainties in detail, we vary the input spectral model, the model of intrinsic scatter, and the smoothing (i.e., regularization) parameters used during the SALT-II model training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (w_input – w_recovered) ranging from –0.005 ± 0.012 to –0.024 ± 0.010. These biases are indistinguishable from each other within the uncertainty; the average bias on w is –0.014 ± 0.007.

  7. Cosmological Parameter Uncertainties from SALT-II Type Ia Supernova Light Curve Models

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, J. [Pennsylvania U.; Guy, J. [LBL, Berkeley; Kessler, R. [Chicago U., KICP; Astier, P. [Paris U., VI-VII; Marriner, J. [Fermilab; Betoule, M. [Paris U., VI-VII; Sako, M. [Pennsylvania U.; El-Hage, P. [Paris U., VI-VII; Biswas, R. [Argonne; Pain, R. [Paris U., VI-VII; Kuhlmann, S. [Argonne; Regnault, N. [Paris U., VI-VII; Frieman, J. A. [Fermilab; Schneider, D. P. [Penn State U.

    2014-08-29

    We use simulated type Ia supernova (SN Ia) samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and a bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: ~120 low-redshift (z < 0.1) SNe Ia, ~255 Sloan Digital Sky Survey SNe Ia (z < 0.4), and ~290 SNLS SNe Ia (z ≤ 1). To probe systematic uncertainties in detail, we vary the input spectral model, the model of intrinsic scatter, and the smoothing (i.e., regularization) parameters used during the SALT-II model training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (w_input – w_recovered) ranging from –0.005 ± 0.012 to –0.024 ± 0.010. These biases are indistinguishable from each other within the uncertainty; the average bias on w is –0.014 ± 0.007.

  8. Physical validation issue of the NEPTUNE two-phase modelling: validation plan to be adopted, experimental programs to be set up and associated instrumentation techniques developed

    International Nuclear Information System (INIS)

    Pierre Peturaud; Eric Hervieu

    2005-01-01

    Full text of publication follows: A long-term joint development program for the next generation of nuclear reactors simulation tools was launched in 2001 by EDF (Electricite de France) and CEA (Commissariat a l'Energie Atomique). The NEPTUNE Project constitutes the Thermal-Hydraulics part of this comprehensive program. Alongside the ongoing development of this new two-phase flow software platform, the physical validation of the involved modelling is a crucial issue, whatever the modelling scale is, and the present paper deals with this issue. After a brief recall about the NEPTUNE platform, the general validation strategy to be adopted is first of all clarified by means of three major features: (i) physical validation in close connection with the concerned industrial applications, (ii) involving (as far as possible) a two-step process successively focusing on dominant separate models and assessing the whole modelling capability, (iii) thanks to the use of relevant data with respect to the validation aims. Based on this general validation process, a four-step generic work approach has been defined; it includes: (i) a thorough analysis of the concerned industrial applications to identify the key physical phenomena involved and associated dominant basic models, (ii) an assessment of these models against the available validation pieces of information, to specify the additional validation needs and define dedicated validation plans, (iii) an inventory and assessment of existing validation data (with respect to the requirements specified in the previous task) to identify the actual needs for new validation data, (iv) the specification of the new experimental programs to be set up to provide the needed new data. This work approach has been applied to the NEPTUNE software, focusing on 8 high priority industrial applications, and it has resulted in the definition of (i) the validation plan and experimental programs to be set up for the open medium 3D modelling

  9. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    Science.gov (United States)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
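
    A minimal sketch of the flexible-line idea: two rigid bodies (canopy and payload) coupled by a tension-only spring-damper line and integrated with scipy. All masses, stiffnesses and drag constants are invented, and this is not the POST 2 implementation:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    m1, m2 = 20.0, 300.0            # kg: canopy mass, payload mass
    k, c, L0 = 5.0e4, 2.0e3, 10.0   # line stiffness (N/m), damping, natural length (m)
    g = np.array([0.0, -9.81])

    def line_force(r1, v1, r2, v2):
        """Force on body 1 from a tension-only elastic line to body 2."""
        d = r2 - r1
        dist = np.linalg.norm(d)
        u = d / dist
        stretch = dist - L0
        if stretch <= 0.0:
            return np.zeros(2)                     # a line cannot push
        rate = np.dot(v2 - v1, u)                  # stretch rate along the line
        return (k * stretch + c * rate) * u

    def rhs(t, s):
        r1, v1, r2, v2 = s[0:2], s[2:4], s[4:6], s[6:8]
        f = line_force(r1, v1, r2, v2)
        drag1 = -80.0 * np.linalg.norm(v1) * v1    # crude quadratic canopy drag
        a1 = g + (f + drag1) / m1
        a2 = g - f / m2                            # reaction force on the payload
        return np.concatenate([v1, a1, v2, a2])

    s0 = [0.0, 1000.0, 0.0, -60.0, 0.0, 990.0, 0.0, -60.0]  # positions/velocities
    sol = solve_ivp(rhs, (0.0, 10.0), s0, max_step=0.005)
    print("payload descent rate after 10 s:", round(sol.y[7, -1], 1), "m/s")
    ```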

  10. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of validation

  11. Biosorption optimization of lead(II), cadmium(II) and copper(II) using response surface methodology and applicability in isotherms and thermodynamics modeling

    International Nuclear Information System (INIS)

    Singh, Rajesh; Chadetrik, Rout; Kumar, Rajender; Bishnoi, Kiran; Bhatia, Divya; Kumar, Anil; Bishnoi, Narsi R.; Singh, Namita

    2010-01-01

    The present study was carried out to optimize the various environmental conditions for biosorption of Pb(II), Cd(II) and Cu(II) by investigating biosorption as a function of the initial metal ion concentration, temperature, biosorbent loading and pH, using Trichoderma viride as adsorbent. Biosorption of ions from aqueous solution was optimized in a batch system using response surface methodology. The R^2 values of 0.9716, 0.9699 and 0.9982 for Pb(II), Cd(II) and Cu(II) ions, respectively, indicated the validity of the model. The thermodynamic properties ΔG°, ΔH°, ΔE° and ΔS° for biosorption of the metal ions were analyzed using the equilibrium constant value obtained from experimental data at different temperatures. The results showed that biosorption of Pb(II) ions by T. viride adsorbent is more endothermic and spontaneous. The study attempted to offer a better understanding of representative biosorption isotherms and thermodynamics, with a special focus on the binding mechanism for biosorption using FTIR spectroscopy.
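
    The thermodynamic quantities follow from ΔG° = -RT ln K and the van't Hoff relation ln K = -ΔH°/(RT) + ΔS°/R. A short numpy sketch with invented equilibrium constants (the study's measured values are not reproduced here):

    ```python
    import numpy as np

    R = 8.314                                   # J/(mol*K)
    T = np.array([293.15, 303.15, 313.15])      # experiment temperatures (K)
    K = np.array([2.1, 2.9, 4.0])               # illustrative equilibrium constants

    dG = -R * T * np.log(K)                     # Gibbs free energy change (J/mol)
    slope, intercept = np.polyfit(1.0 / T, np.log(K), 1)  # van't Hoff regression
    dH, dS = -R * slope, R * intercept
    print("dG (kJ/mol):", np.round(dG / 1e3, 2))          # negative -> spontaneous
    print(f"dH = {dH / 1e3:.1f} kJ/mol (positive -> endothermic), dS = {dS:.1f} J/(mol*K)")
    ```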

  12. Biosorption optimization of lead(II), cadmium(II) and copper(II) using response surface methodology and applicability in isotherms and thermodynamics modeling

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Rajesh; Chadetrik, Rout; Kumar, Rajender; Bishnoi, Kiran; Bhatia, Divya; Kumar, Anil [Department of Environmental Science and Engineering, Guru Jambheshwar University of Science and Technology, Hisar 125001, Haryana (India); Bishnoi, Narsi R., E-mail: nrbishnoi@gmail.com [Department of Environmental Science and Engineering, Guru Jambheshwar University of Science and Technology, Hisar 125001, Haryana (India); Singh, Namita [Department of Bio and Nanotechnology, Guru Jambheshwar University of Science and Technology, Hisar 125001, Haryana (India)

    2010-02-15

    The present study was carried out to optimize the various environmental conditions for biosorption of Pb(II), Cd(II) and Cu(II) by investigating biosorption as a function of the initial metal ion concentration, temperature, biosorbent loading and pH, using Trichoderma viride as adsorbent. Biosorption of ions from aqueous solution was optimized in a batch system using response surface methodology. The R^2 values of 0.9716, 0.9699 and 0.9982 for Pb(II), Cd(II) and Cu(II) ions, respectively, indicated the validity of the model. The thermodynamic properties ΔG°, ΔH°, ΔE° and ΔS° for biosorption of the metal ions were analyzed using the equilibrium constant value obtained from experimental data at different temperatures. The results showed that biosorption of Pb(II) ions by T. viride adsorbent is more endothermic and spontaneous. The study attempted to offer a better understanding of representative biosorption isotherms and thermodynamics, with a special focus on the binding mechanism for biosorption using FTIR spectroscopy.

  13. Monte Carlo-based validation of the ENDF/MC2-II/SDX cell homogenization path

    International Nuclear Information System (INIS)

    Wade, D.C.

    1979-04-01

    The results are presented of a program of validation of the unit cell homogenization prescriptions and codes used for the analysis of Zero Power Reactor (ZPR) fast breeder reactor critical experiments. The ZPR drawer loading patterns comprise both plate type and pin-calandria type unit cells. A prescription is used to convert the three dimensional physical geometry of the drawer loadings into one dimensional calculational models. The ETOE-II/MC2-II/SDX code sequence is used to transform ENDF/B basic nuclear data into unit cell average broad group cross sections based on the 1D models. Cell average, broad group anisotropic diffusion coefficients are generated using the methods of Benoist or of Gelbard. The resulting broad (approx. 10 to 30) group parameters are used in multigroup diffusion and S_n transport calculations of full core XY or RZ models which employ smeared atom densities to represent the contents of the unit cells
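
    In its simplest form, the cell-homogenization step is flux-volume weighting of region-wise cross sections. A toy sketch (region data invented; the actual ETOE-II/MC2-II/SDX treatment is far more elaborate, including resonance self-shielding and group condensation):

    ```python
    import numpy as np

    def homogenize(sigma, flux, volume):
        """Flux-volume weighted cell-average cross section:
        sigma_hom = sum_i(sigma_i * phi_i * V_i) / sum_i(phi_i * V_i)."""
        w = flux * volume
        return float(np.sum(sigma * w) / np.sum(w))

    # toy plate-type unit cell: fuel, structure, sodium regions
    sigma_a = np.array([0.35, 0.02, 0.001])  # absorption cross sections (1/cm)
    phi = np.array([0.9, 1.0, 1.1])          # region-average fluxes (relative)
    vol = np.array([0.4, 0.2, 0.4])          # region volume fractions

    print(f"cell-average absorption: {homogenize(sigma_a, phi, vol):.4f} 1/cm")
    ```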

  14. Monte Carlo-based validation of the ENDF/MC2-II/SDX cell homogenization path

    International Nuclear Information System (INIS)

    Wade, D.C.

    1978-11-01

    The results are summarized of a program of validation of the unit cell homogenization prescriptions and codes used for the analysis of Zero Power Reactor (ZPR) fast breeder reactor critical experiments. The ZPR drawer loading patterns comprise both plate type and pin-calandria type unit cells. A prescription is used to convert the three dimensional physical geometry of the drawer loadings into one dimensional calculational models. The ETOE-II/MC2-II/SDX code sequence is used to transform ENDF/B basic nuclear data into unit cell average broad group cross sections based on the 1D models. Cell average, broad group anisotropic diffusion coefficients are generated using the methods of Benoist or of Gelbard. The resulting broad (approx. 10 to 30) group parameters are used in multigroup diffusion and S_n transport calculations of full core XY or RZ models which employ smeared atom densities to represent the contents of the unit cells

  15. Copenhagen Psychosocial Questionnaire - A validation study using the Job Demand-Resources model.

    Science.gov (United States)

    Berthelsen, Hanne; Hakanen, Jari J; Westerlund, Hugo

    2018-01-01

    This study aims at investigating the nomological validity of the Copenhagen Psychosocial Questionnaire (COPSOQ II) by using an extension of the Job Demands-Resources (JD-R) model with aspects of work ability as outcome. The study design is cross-sectional. All staff working at public dental organizations in four regions of Sweden were invited to complete an electronic questionnaire (75% response rate, n = 1345). The questionnaire was based on COPSOQ II scales, the Utrecht Work Engagement scale, and the one-item Work Ability Score in combination with a proprietary item. The data was analysed by Structural Equation Modelling. This study contributed to the literature by showing that: A) the scale characteristics were satisfactory and the construct validity of the COPSOQ instrument could be integrated in the JD-R framework; B) job resources arising from leadership may be a driver of the two processes included in the JD-R model; and C) both the health impairment and motivational processes were associated with WA, and the results suggested that leadership may impact WA, in particular by securing task resources. In conclusion, the nomological validity of COPSOQ was supported, as the JD-R model can be operationalized by the instrument. This may be helpful for transferral of complex survey results and work life theories to practitioners in the field.

  16. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  17. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  18. Assessing the external validity of model-based estimates of the incidence of heart attack in England: a modelling study

    Directory of Open Access Journals (Sweden)

    Peter Scarborough

    2016-11-01

    Background: The DisMod II model is designed to estimate epidemiological parameters on diseases where measured data are incomplete and has been used to provide estimates of disease incidence for the Global Burden of Disease study. We assessed the external validity of the DisMod II model by comparing modelled estimates of the incidence of first acute myocardial infarction (AMI) in England in 2010 with estimates derived from a linked dataset of hospital records and death certificates. Methods: Inputs for DisMod II were prevalence rates of ever having had an AMI taken from a population health survey, total mortality rates and AMI mortality rates taken from death certificates. By definition, remission rates were zero. We estimated first AMI incidence in an external dataset from England in 2010 using a linked dataset including all hospital admissions and death certificates since 1998. 95% confidence intervals were derived around estimates from the external dataset and DisMod II estimates based on sampling variance and reported uncertainty in prevalence estimates respectively. Results: Estimates of the incidence rate for the whole population were higher in the DisMod II results than the external dataset (+54% for men and +26% for women). Age-specific results showed that the DisMod II results over-estimated incidence for all but the oldest age groups. Confidence intervals for the DisMod II and external dataset estimates did not overlap for most age groups. Conclusion: By comparison with AMI incidence rates in England, DisMod II did not achieve external validity for age-specific incidence rates, but did provide global estimates of incidence that are of similar magnitude to measured estimates. The model should be used with caution when estimating age-specific incidence rates.
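
    The paper's criterion, whether the 95% confidence intervals of modelled and observed age-specific incidence overlap, can be checked mechanically. All numbers below are invented placeholders, not the study's estimates:

    ```python
    import numpy as np

    ages = ["45-54", "55-64", "65-74", "75+"]
    model = np.array([180.0, 420.0, 800.0, 1500.0])   # modelled incidence per 100,000
    model_hw = np.array([20.0, 40.0, 70.0, 120.0])    # half-widths of 95% CIs
    obs = np.array([120.0, 330.0, 700.0, 1550.0])     # linked-dataset incidence
    obs_hw = np.array([10.0, 20.0, 40.0, 90.0])

    for a, m, mh, o, oh in zip(ages, model, model_hw, obs, obs_hw):
        overlap = (m - mh) <= (o + oh) and (o - oh) <= (m + mh)
        print(f"{a}: model {100.0 * (m - o) / o:+.0f}% vs observed, CIs overlap: {overlap}")
    ```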

  19. Validation of the multimedia version of the RDC/TMD axis II questionnaire in Portuguese

    Directory of Open Access Journals (Sweden)

    Ricardo Figueiredo Cavalcanti

    2010-06-01

    OBJECTIVE: The aim of the study was to validate the multimedia version of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Axis II Questionnaire in the Portuguese language. MATERIAL AND METHODS: The sample comprised 30 patients with signs and symptoms of temporomandibular disorders (TMD), evaluated at the Orofacial Pain Control Center of the Dental School of the University of Pernambuco, Brazil, between April and June 2006. Data collection was performed using the following instruments: the Simplified Anamnestic Index (SAI) and the RDC/TMD Axis II written and multimedia versions. The validation process consisted of analyzing the internal consistency of the scales. Concurrent and convergent validity were evaluated by Spearman's rank correlation. In addition, reproducibility was tested and analyzed with the weighted kappa statistic and Spearman's rank correlation test. RESULTS: The multimedia version of the RDC/TMD Axis II questionnaire in Portuguese was considered consistent (Cronbach's alpha = 0.94), reproducible (Spearman 0.670 to 0.913, p<0.01) and valid (p<0.01). CONCLUSION: The questionnaire showed valid and reproducible results, and represents an instrument of practical application in epidemiological studies of TMD in the Brazilian population.

  20. Validation of HEDR models

    International Nuclear Information System (INIS)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.

  1. Validating the Beck Depression Inventory-II in Indonesia's general population and coronary heart disease patients

    NARCIS (Netherlands)

    Ginting, H.; Näring, G.W.B.; Veld, W.M. van der; Srisayekti, W.; Becker, E.S.

    2013-01-01

    This study assesses the validity and determines the cut-off point for the Beck Depression Inventory-II (the BDI-II) among Indonesians. The Indonesian version of the BDI-II (the Indo BDI-II) was administered to 720 healthy individuals from the general population and 215 Coronary Heart Disease (CHD) patients.

  2. Translation, adaptation and validation of a Portuguese version of the Moorehead-Ardelt Quality of Life Questionnaire II.

    Science.gov (United States)

    Maciel, João; Infante, Paulo; Ribeiro, Susana; Ferreira, André; Silva, Artur C; Caravana, Jorge; Carvalho, Manuel G

    2014-11-01

    The prevalence of obesity has increased worldwide. An assessment of the impact of obesity on health-related quality of life (HRQoL) requires specific instruments. The Moorehead-Ardelt Quality of Life Questionnaire II (MA-II) is a widely used instrument to assess HRQoL in morbidly obese patients. The objective of this study was to translate and validate a Portuguese version of the MA-II. The study included forward and backward translations of the original MA-II. The reliability of the Portuguese MA-II was estimated using the internal consistency and test-retest methods. For validation purposes, Spearman's rank correlation coefficient was used to evaluate the correlation between the Portuguese MA-II and the Portuguese versions of two other questionnaires, the 36-item Short Form Health Survey (SF-36) and the Impact of Weight on Quality of Life-Lite (IWQOL-Lite). One hundred and fifty morbidly obese patients were randomly assigned to test the reliability and validity of the Portuguese MA-II. Good internal consistency was demonstrated by a Cronbach's alpha coefficient of 0.80, and very good agreement in terms of test-retest reliability was recorded, with an overall intraclass correlation coefficient (ICC) of 0.88. The total sum of MA-II scores and each item of the MA-II were significantly correlated with all domains of the SF-36 and IWQOL-Lite. A statistically significant negative correlation was found between the MA-II total score and BMI. Moreover, age, gender and surgical status were independent predictors of the MA-II total score. A reliable and valid Portuguese version of the MA-II was produced, thus enabling the routine use of the MA-II in the morbidly obese Portuguese population.

  3. Copenhagen Psychosocial Questionnaire - A validation study using the Job Demand-Resources model.

    Directory of Open Access Journals (Sweden)

    Hanne Berthelsen

    This study aims at investigating the nomological validity of the Copenhagen Psychosocial Questionnaire (COPSOQ II) by using an extension of the Job Demands-Resources (JD-R) model with aspects of work ability as outcome. The study design is cross-sectional. All staff working at public dental organizations in four regions of Sweden were invited to complete an electronic questionnaire (75% response rate, n = 1345). The questionnaire was based on COPSOQ II scales, the Utrecht Work Engagement Scale, and the one-item Work Ability Score in combination with a proprietary item. The data were analysed by structural equation modelling. This study contributed to the literature by showing that: (A) the scale characteristics were satisfactory and the construct validity of the COPSOQ instrument could be integrated in the JD-R framework; (B) job resources arising from leadership may be a driver of the two processes included in the JD-R model; and (C) both the health impairment and motivational processes were associated with work ability (WA), and the results suggested that leadership may impact WA, in particular by securing task resources. In conclusion, the nomological validity of COPSOQ was supported, as the JD-R model can be operationalized by the instrument. This may be helpful for transferring complex survey results and work life theories to practitioners in the field.

  4. Doubtful outcome of the validation of the Rome II questionnaire: validation of a symptom based diagnostic tool

    Directory of Open Access Journals (Sweden)

    Nylin Henry BO

    2009-12-01

    Background: Questionnaires are used in research and clinical practice. For gastrointestinal complaints the Rome II questionnaire is internationally known but not validated. The aim of this study was to validate a printed and a computerized version of Rome II, translated into Swedish. Results from various analyses are reported. Methods: Volunteers from a population-based colonoscopy study were included (n = 1011), together with patients seeking general practice (n = 45) and patients visiting a gastrointestinal specialists' clinic (n = 67). The questionnaire consists of 38 questions concerning gastrointestinal symptoms and complaints. Diagnoses are made according to a special code. Our validation included analyses of the translation, feasibility, predictability, reproducibility and reliability. Kappa values and overall agreement were measured. The factor structures were confirmed using a principal component analysis, and Cronbach's alpha was used to test the internal consistency. Results and Discussion: Translation and back-translation showed good agreement. The questionnaire was easy to understand and use. The reproducibility test showed kappa values of 0.60 for GERS, 0.52 for FD, and 0.47 for IBS. Kappa values and overall agreement for predictability, when the diagnoses by the questionnaire were compared to the diagnoses by the clinician, were 0.26 and 90% for GERS, 0.18 and 85% for FD, and 0.49 and 86% for IBS. Corresponding figures for the agreement between the printed and the digital version were 0.50 and 92% for GERS, 0.64 and 95% for FD, and 0.76 and 95% for IBS. Cronbach's alpha coefficient for GERS was 0.75 with a span per item of 0.71 to 0.76. For FD the figures were 0.68 and 0.54 to 0.70, and for IBS 0.61 and 0.56 to 0.66. The Rome II questionnaire has never been thoroughly validated before, even if diagnoses made by the Rome criteria have been compared to diagnoses made in clinical practice. Conclusion: The accuracy of the Swedish version of
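
    For readers unfamiliar with the agreement statistics used above, the following Python sketch shows how a kappa value and an overall-agreement percentage of the kind reported here can be computed; the paired diagnoses are hypothetical:

    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical paired IBS diagnoses (1 = positive, 0 = negative) made for
    # the same respondents by the questionnaire and by the clinician.
    questionnaire = np.array([1, 0, 0, 1, 1, 0, 0, 0, 1, 0])
    clinician     = np.array([1, 0, 1, 1, 0, 0, 0, 0, 1, 0])

    kappa = cohen_kappa_score(questionnaire, clinician)   # chance-corrected agreement
    overall_agreement = np.mean(questionnaire == clinician)
    print(f"kappa = {kappa:.2f}, overall agreement = {overall_agreement:.0%}")
    ```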

  5. Predictive modeling of infrared radiative heating in tomato dry-peeling process: Part II. Model validation and sensitivity analysis

    Science.gov (United States)

    A predictive mathematical model was developed to simulate heat transfer in a tomato undergoing double sided infrared (IR) heating in a dry-peeling process. The aims of this study were to validate the developed model using experimental data and to investigate different engineering parameters that mos...

  6. Comparative validity of MMPI-2 and MCMI-II personality disorder classifications.

    Science.gov (United States)

    Wise, E A

    1996-06-01

    Minnesota Multiphasic Personality Inventory-2 (MMPI-2) overlapping and nonoverlapping scales were demonstrated to perform comparably to their original MMPI forms. They were then evaluated for convergent and discriminant validity with the Millon Clinical Multiaxial Inventory-II (MCMI-II) personality disorder scales. The MMPI-2 and MCMI-II personality disorder scales demonstrated convergent and discriminant coefficients similar to their original forms. However, the MMPI-2 personality scales classified significantly more of the sample as Dramatic, whereas the MCMI-II diagnosed more of the sample as Anxious. Furthermore, single-scale and 2-point code type classification rates were quite low, indicating that at the level of the individual, the personality disorder scales are not measuring comparable constructs. Hence, each instrument provides both similar and unique information, justifying their continued use together for the purpose of diagnosing personality disorders.

  7. How to enhance the future use of energy policy simulation models through ex post validation

    International Nuclear Information System (INIS)

    Qudrat-Ullah, Hassan

    2017-01-01

    Although simulation and modeling in general, and system dynamics models in particular, have long served the energy policy domain, ex post validation of these energy policy models is rarely addressed. In fact, ex post validation is a valuable area of research because it offers modelers a chance to enhance the future use of their simulation models by validating them against field data. This paper contributes by presenting (i) a system dynamics simulation model, which was developed and used to perform a three-dimensional, socio-economic and environmental long-term assessment of Pakistan's energy policy in 1999, and (ii) a systematic analysis, through ex post validation, of the 15-year-old predictive scenarios produced by that model. How did the model predictions compare with the actual data? We report that the ongoing crisis of the electricity sector of Pakistan is unfolding as the model-based scenarios had projected. - Highlights: • Argues that increased use of energy policy models depends on validating their credibility. • An ex post validation process is presented as a solution to build confidence in models. • A unique system dynamics model, MDESRAP, is presented. • The root mean square percentage error and Theil's inequality statistic are applied. • The dynamic model, MDESRAP, is presented as an ex ante and ex post validated model.
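
    The two ex post validation statistics named in the highlights, the root mean square percentage error and Theil's inequality coefficient, are easy to compute. This is a generic Python sketch with invented demand figures, not MDESRAP output:

    ```python
    import numpy as np

    def rmspe(actual, predicted):
        """Root mean square percentage error."""
        actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
        return 100.0 * np.sqrt(np.mean(((predicted - actual) / actual) ** 2))

    def theil_u(actual, predicted):
        """Theil's inequality coefficient (0 = perfect fit, 1 = worst)."""
        actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
        num = np.sqrt(np.mean((predicted - actual) ** 2))
        return num / (np.sqrt(np.mean(predicted ** 2)) + np.sqrt(np.mean(actual ** 2)))

    # Hypothetical annual electricity demand (TWh): old projection vs. outturn.
    projected = [60, 64, 69, 75, 82, 90]
    actual    = [59, 65, 71, 74, 80, 93]
    print(f"RMSPE = {rmspe(actual, projected):.1f}%, "
          f"Theil's U = {theil_u(actual, projected):.3f}")
    ```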

  8. Polarographic validation of chemical speciation models

    International Nuclear Information System (INIS)

    Duffield, J.R.; Jarratt, J.A.

    2001-01-01

    It is well established that the chemical speciation of an element in a given matrix, or system of matrices, is of fundamental importance in controlling the transport behaviour of the element. Therefore, to accurately understand and predict the transport of elements and compounds in the environment it is a requirement that both the identities and concentrations of trace element physico-chemical forms can be ascertained. These twin requirements present the analytical scientist with considerable challenges given the labile equilibria, the range of time scales (from nanoseconds to years) and the range of concentrations (ultra-trace to macro) that may be involved. As a result of this analytical variability, chemical equilibrium modelling has become recognised as an important predictive tool in chemical speciation analysis. However, this technique requires firm underpinning by the use of complementary experimental techniques for the validation of the predictions made. The work reported here has been undertaken with the primary aim of investigating possible methodologies that can be used for the validation of chemical speciation models. However, in approaching this aim, direct chemical speciation analyses have been made in their own right. Results will be reported and analysed for the iron(II)/iron(III)-citrate proton system (pH 2 to 10; total [Fe] = 3 mmol dm⁻³; total [citrate³⁻] = 10 mmol dm⁻³), in which equilibrium constants have been determined using glass electrode potentiometry, speciation is predicted using the PHREEQE computer code, and validation of predictions is achieved by determination of iron complexation and redox state with associated concentrations. (authors)

  9. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...

  10. Validity of transcobalamin II-based radioassay for the determination of serum vitamin B12 concentrations

    International Nuclear Information System (INIS)

    Paltridge, G.; Rudzki, Z.; Ryall, R.G.

    1980-01-01

    A valid radioassay for the estimation of serum vitamin B12 in the presence of naturally occurring vitamin B12 (cobalamin) analogues can be operated if serum transcobalamin II (TC II) is used as the binding protein. Serum samples that gave diagnostically discrepant results when their vitamin B12 content was analysed (i) by a commercial radioassay known to be susceptible to interference from cobalamin analogues, and (ii) by microbiological assay, were further analysed by an alternative radioassay which uses the transcobalamins (principally TC II) of diluted normal serum as the assay binding protein. Concordance between the results from the microbiological assay and the TC II-based radioassay was found in all cases. In an extended study over a three-year period, all routine serum samples sent for vitamin B12 analysis that had a vitamin B12 content of less than 320 ng/l by the TC II-based radioassay (reference range 200-850 ng/l) were reanalysed using an established microbiological method. Over 1000 samples were thus analysed. The data are presented to demonstrate the validity of the TC II-based radioassay results in this group of patients, whose serum samples are most likely to produce diagnostically erroneous vitamin B12 results when analysed by a radioassay that is less specific for cobalamins. (author)

  11. Transport Risk Index of Physiologic Stability, version II (TRIPS-II): a simple and practical neonatal illness severity score.

    Science.gov (United States)

    Lee, Shoo K; Aziz, Khalid; Dunn, Michael; Clarke, Maxine; Kovacs, Lajos; Ojah, Cecil; Ye, Xiang Y

    2013-05-01

    Derive and validate a practical assessment of infant illness severity at admission to neonatal intensive care units (NICUs). Prospective study involving 17,075 infants admitted to 15 NICUs in 2006 to 2008. Logistic regression was used to derive a prediction model for mortality comprising four empirically weighted items (temperature, blood pressure, respiratory status, response to noxious stimuli). This Transport Risk Index of Physiologic Stability, version II (TRIPS-II) was then validated for prediction of 7-day and total NICU mortality. TRIPS-II discriminated 7-day (receiver operating characteristic [ROC] curve area, 0.90) and total NICU mortality (ROC area, 0.87) from survival. Furthermore, there was a direct association between changes in TRIPS-II at 12 and 24 hours and mortality. There was good calibration across the full range of TRIPS-II scores and gestational ages at birth, and the addition of TRIPS-II improved the performance of prediction models that use gestational age and baseline population risk variables. TRIPS-II is a validated benchmarking tool for assessing infant illness severity at admission and for up to 24 hours after.

  12. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  13. J-ACT II. Differences in rate of valid recanalization and of a favorable outcome by site of MCA occlusion

    International Nuclear Information System (INIS)

    Hirano, Teruyuki

    2010-01-01

    The purpose of this study was to elucidate whether the effects of alteplase differ with the occlusion site of the middle cerebral artery (MCA). An exploratory analysis was made of 57 patients enrolled in the Japan Alteplase Clinical Trial II (J-ACT II). The residual vessel length (mm), determined on pretreatment MR angiography (MRA), was used to reflect the occlusion site. The proportions of patients with valid recanalization (modified Mori grade 2-3) at 6 and 24 hours, and with a favorable outcome (modified Rankin scale 0-1 at 3 months), were compared between groups dichotomized according to their residual vessel lengths. Multiple logistic regression models were generated to elucidate the predictors of valid recanalization and of a favorable outcome. Receiver operating characteristic (ROC) analysis revealed that 5 mm was the practical cutoff length for the dichotomization. In patients with an M1 length <5 mm (n=12), the frequencies of valid recanalization at 6/24 hours (16.6%/25.0%) were significantly lower than those (62.2%/82.2%) of the 45 patients with a residual M1 length of ≥5 mm or M2 occlusions (p=0.008 for 6 hours, p<0.001 for 24 hours). The proportion with a favorable outcome was also small in patients with an M1 length <5 mm (8.3%) compared to the others (57.8%, p=0.004). In the logistic regression models, the site of MCA occlusion (<5 mm) was a significant predictor of valid recanalization at 6/24 hours and of a favorable outcome. In patients with acute MCA occlusion, a residual vessel length <5 mm on MRA can identify poor responders. (author)

  14. Development and validation of a multivariate prediction model for patients with acute pancreatitis in Intensive Care Medicine.

    Science.gov (United States)

    Zubia-Olaskoaga, Felix; Maraví-Poma, Enrique; Urreta-Barallobre, Iratxe; Ramírez-Puerta, María-Rosario; Mourelo-Fariña, Mónica; Marcos-Neira, María-Pilar; García-García, Miguel Ángel

    2018-03-01

    Development and validation of a multivariate prediction model for patients with acute pancreatitis (AP) admitted to Intensive Care Units (ICU). A prospective multicenter observational study over a 1-year period in 46 international ICUs (the EPAMI study). Patients were adults admitted to an ICU with AP and at least one organ failure. A multivariate prediction model was developed, using the worst data of the ICU stay, based on multivariate analysis with simple imputation in a development cohort. The model was then validated in another cohort. 374 patients were included (mortality of 28.9%). Variables with statistical significance in the multivariate analysis were age, non-alcoholic and non-biliary etiology, development of shock, development of respiratory failure, need for continuous renal replacement therapy, and intra-abdominal pressure. The model created with these variables presented an AUC of the ROC curve of 0.90 (95% CI 0.81-0.94) in the validation cohort. With this model, AP cases could be classified as low mortality risk (between 2 and 9.5 points, mortality of 1.35%), moderate mortality risk (between 10 and 12.5 points, mortality of 28.92%), and high mortality risk (13 points or more, mortality of 88.37%). Our model presented a better AUC of the ROC curve than APACHE II (0.91 vs 0.80) and SOFA in the first 24 h (0.91 vs 0.79). We developed and validated a multivariate prediction model, which can be applied at any moment of the ICU stay, with better discriminatory power than APACHE II and SOFA in the first 24 h. Copyright © 2018 IAP and EPC. Published by Elsevier B.V. All rights reserved.

  15. Validation of KENOREST with LWR-PROTEUS phase II samples

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, M.; Kilger, R.; Pautz, A.; Zwermann, W. [GRS, Garching (Germany); Grimm, P.; Vasiliev, A.; Ferroukhi, H. [Paul Scherrer Institut, Villigen (Switzerland)

    2012-11-01

    In order to broaden the validation basis of the reactivity and nuclide inventory code KENOREST, two samples from the LWR-PROTEUS phase II program have been calculated and compared to the experimental results. In general, most nuclides are reproduced very well and agree with the experiment to within about ten percent. Some already known problems, the overprediction of metallic fission products and the underprediction of the higher curium isotopes, have been confirmed. One of the largest uncertainties in the calculation was the burnup of the samples, due to differences between a core simulation by the fuel vendor and the burnup determined from the measured values of the burnup indicator Nd-148. Two different models taking into account the environment of a peripheral fuel rod have been studied. The more detailed model included the three directly neighboring fuel assemblies, depleted along with the fuel rod of interest. The influence on the results has been found to be very small; compared to the uncertainties from the burnup, this effect can be considered negligible. The reason for the low influence was basically that the spectrum did not get considerably harder with increasing burnup beyond about 20 GWd/tHM. Since the samples reached burnups far beyond that value, an effect could not be seen. In the near future an update of the used libraries is planned, and it will be very interesting to study the effect on the results, especially for curium. (orig.)

  16. Lagrangian Stochastic Dispersion Model IMS Model Suite and its Validation against Experimental Data

    International Nuclear Information System (INIS)

    Bartok, J.

    2010-01-01

    The dissertation presents the IMS Lagrangian Dispersion Model, a 'new generation' Slovak dispersion model of long-range transport developed by MicroStep-MIS. It solves the trajectory equation for a vast number of Lagrangian 'particles' and a stochastic equation that simulates the effects of turbulence. The model contains a simulation of radioactive decay (full decay chains of more than 300 nuclides), and dry and wet deposition. The model was integrated into IMS Model Suite, a system in which several models and modules can run and cooperate, e.g. the LAM model WRF preparing fine-resolution meteorological data for dispersion. The main theme of the work is validation of the dispersion model against the large-scale international campaigns CAPTEX and ETEX, two of the largest tracer experiments. Validation addressed the treatment of missing data and the interpolation of data into comparable temporal and spatial representations. The best model results were observed for ETEX I, standard results for the CAPTEX releases, and the worst results for ETEX II, known in the modelling community for meteorological conditions that models can hardly resolve. The IMS Lagrangian Dispersion Model was identified as a capable long-range dispersion model for slowly reacting or non-reacting chemicals and radioactive matter. The influence of input data on simulation quality is discussed within the work. Additional modules were prepared according to practical requirements: a) recalculation of concentrations of radioactive pollutants into effective doses from inhalation, immersion in the plume and deposition; b) dispersion of mineral dust, added and tested in a desert locality, where wind and soil moisture were first analysed and forecast by WRF. The result was qualitatively verified in a case study against satellite observations. (author)
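
    The core of such a Lagrangian stochastic model, advection of many particles by the mean wind plus a random displacement representing turbulence, can be sketched compactly. The following Python fragment is a deliberately simplified zeroth-order random walk with made-up parameters, not the IMS Model Suite implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def advect_particles(pos, wind, sigma_turb, dt, n_steps):
        """Advance Lagrangian particles: mean-wind advection plus a Gaussian
        random displacement standing in for unresolved turbulent velocity."""
        for _ in range(n_steps):
            pos = pos + wind * dt + rng.normal(0.0, sigma_turb * np.sqrt(dt), pos.shape)
        return pos

    # 10,000 particles released at the origin; 5 m/s westerly wind, 1 h of transport.
    particles = np.zeros((10_000, 3))
    plume = advect_particles(particles, wind=np.array([5.0, 0.0, 0.0]),
                             sigma_turb=1.2, dt=60.0, n_steps=60)
    print("plume centroid (m):", plume.mean(axis=0))
    ```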

  17. Model validation and calibration based on component functions of model output

    International Nuclear Information System (INIS)

    Wu, Danqing; Lu, Zhenzhou; Wang, Yanping; Cheng, Lei

    2015-01-01

    The aim of this work is to validate the component functions of model output between physical observations and the computational model with the area metric. Based on the theory of high dimensional model representations (HDMR) of independent input variables, conditional expectations are component functions of model output, and these conditional expectations reflect partial information of the model output. Therefore, validating the conditional expectations quantifies the discrepancy between the partial information of the computational model output and that of the observations. A calibration of the conditional expectations is then carried out to reduce the value of the model validation metric. After that, the model validation metric of the model output is recalculated with the calibrated model parameters, and the result shows that a reduction of the discrepancy in the conditional expectations can help decrease the difference in model output. Finally, several examples are employed to demonstrate the rationality and necessity of the methodology in the case of both single and multiple validation sites. - Highlights: • A validation metric of conditional expectations of model output is proposed. • HDMR explains the relationship of conditional expectations and model output. • An improved approach of parameter calibration updates the computational models. • Validation and calibration process are applied at single site and multiple sites. • Validation and calibration process show a superiority over existing methods
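
    A minimal illustration of the idea, estimating a first-order conditional expectation E[y|x] by binning and then comparing model and observation components with an area metric between empirical CDFs, might look as follows in Python. The data are synthetic, and the paper's own metric definition may differ in detail:

    ```python
    import numpy as np

    def area_metric(sample_a, sample_b, grid):
        """Area between two empirical CDFs evaluated on a common grid."""
        cdf = lambda s, x: np.searchsorted(np.sort(s), x, side="right") / len(s)
        return np.trapz(np.abs(cdf(sample_a, grid) - cdf(sample_b, grid)), grid)

    def conditional_expectation(x, y, n_bins=10):
        """First-order HDMR-style component: E[y | x] estimated by binning x."""
        edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
        idx = np.clip(np.digitize(x, edges[1:-1]), 0, n_bins - 1)
        return np.array([y[idx == b].mean() for b in range(n_bins)])

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 1, 2000)
    y_model = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)        # model output
    y_obs = 0.9 * np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)    # "observations"

    grid = np.linspace(-1.5, 1.5, 400)
    d = area_metric(conditional_expectation(x, y_model),
                    conditional_expectation(x, y_obs), grid)
    print(f"area metric on E[y|x]: {d:.3f}")
    ```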

  18. Base Flow Model Validation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  19. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model.

  20. Validation of models with multivariate output

    International Nuclear Information System (INIS)

    Rebba, Ramesh; Mahadevan, Sankaran

    2006-01-01

    This paper develops metrics for validating computational models with experimental data, considering uncertainties in both. A computational model may generate multiple response quantities, and the validation experiment might yield corresponding measured values. Alternatively, a single response quantity may be predicted and observed at different spatial and temporal points. Model validation in such cases involves comparison of multiple correlated quantities. Multiple univariate comparisons may give conflicting inferences. Therefore, aggregate validation metrics are developed in this paper. Both classical and Bayesian hypothesis testing are investigated for this purpose, using multivariate analysis. Since commonly used statistical significance tests are based on normality assumptions, appropriate transformations are investigated in the case of non-normal data. The methodology is implemented to validate an empirical model for energy dissipation in lap joints under dynamic loading.
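
    One classical multivariate test of the kind discussed here is Hotelling's T², which compares a vector of measured response quantities against the model's predicted mean while accounting for correlation between the quantities. Below is a Python sketch with hypothetical replicate experiments; it is an illustration of the general technique, not the paper's specific aggregate metric:

    ```python
    import numpy as np
    from scipy import stats

    def hotelling_t2(pred_mean, obs):
        """Hotelling's T^2 test of whether multivariate observations are
        consistent with the model's predicted mean response vector."""
        obs = np.asarray(obs, float)
        n, p = obs.shape
        diff = obs.mean(axis=0) - np.asarray(pred_mean, float)
        S = np.cov(obs, rowvar=False)                # sample covariance of the data
        t2 = n * diff @ np.linalg.solve(S, diff)
        f_stat = (n - p) / (p * (n - 1)) * t2        # convert T^2 to an F statistic
        return t2, stats.f.sf(f_stat, p, n - p)

    # Hypothetical: the model predicts two correlated response quantities;
    # six replicate experiments measure both.
    predicted = [10.0, 4.0]
    measured = [[10.4, 4.2], [9.8, 3.9], [10.9, 4.4],
                [10.1, 4.0], [9.6, 3.8], [10.5, 4.3]]
    t2, p = hotelling_t2(predicted, measured)
    print(f"T^2 = {t2:.2f}, p = {p:.3f}  (small p -> model and data disagree)")
    ```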

  1. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  2. Transport of fluid and solutes in the body II. Model validation and implications.

    Science.gov (United States)

    Gyenge, C C; Bowen, B D; Reed, R K; Bert, J L

    1999-09-01

    A mathematical model of short-term whole-body fluid, protein, and ion distribution and transport developed earlier [see companion paper: C. C. Gyenge, B. D. Bowen, R. K. Reed, and J. L. Bert. Am. J. Physiol. 277 (Heart Circ. Physiol. 46): H1215-H1227, 1999] is validated using experimental data available in the literature. The model was tested against data measured for the following three types of experimental infusions: 1) hyperosmolar saline solutions with an osmolarity in the range of 2,000-2,400 mosmol/l, 2) saline solutions with an osmolarity of approximately 270 mosmol/l and a composition comparable to Ringer solution, and 3) an isosmotic NaCl solution with an osmolarity of approximately 300 mosmol/l. Good agreement between the model predictions and the experimental data was obtained with respect to the trends and magnitudes of fluid shifts between the intra- and extracellular compartments, extracellular ion and protein contents, and hematocrit values. The model is also able to yield information about inaccessible or difficult-to-measure system variables such as intracellular ion contents, cellular volumes, and fluid fluxes across the vascular capillary membrane, data that can be used to help interpret the behavior of the system.
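
    As a toy illustration of the compartmental fluid-shift mechanism validated in such models, the following Python sketch integrates water exchange between two osmotically coupled compartments. All parameters and the infusion scenario are invented for illustration; the published model is far more detailed:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative two-compartment osmotic model: water moves between the
    # intracellular (IC) and extracellular (EC) spaces until osmolarities equalise.
    SOLUTE_IC = 285.0 * 25.0              # mosmol held intracellularly
    SOLUTE_EC = 285.0 * 15.0 + 600.0      # EC solute after a hypothetical infusion
    K_W = 0.5                             # water conductance (L^2/mosmol/h), invented

    def dvdt(t, v):
        v_ic, v_ec = v
        # Water flows toward the compartment with the higher osmolarity.
        flux_into_ic = K_W * (SOLUTE_IC / v_ic - SOLUTE_EC / v_ec)
        return [flux_into_ic, -flux_into_ic]

    sol = solve_ivp(dvdt, (0.0, 6.0), [25.0, 17.0])  # volumes in litres, 6 h horizon
    print("IC/EC volumes after 6 h (L):", np.round(sol.y[:, -1], 2))
    ```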

  3. Construction and validation of detailed kinetic models for the combustion of gasoline surrogates; Construction et validation de modeles cinetiques detailles pour la combustion de melanges modeles des essences

    Energy Technology Data Exchange (ETDEWEB)

    Touchard, S.

    2005-10-15

    The irreversible reduction of oil resources, CO2 emission control and the application of increasingly strict standards on pollutant emissions lead researchers worldwide to work on reducing pollutant formation and improving engine yields, especially by using homogeneous-charge combustion of lean mixtures. The numerical simulation of fuel blend oxidation is an essential tool to study the influence of fuel formulation and engine conditions on auto-ignition and on pollutant emissions. Automatic generation helps to obtain detailed kinetic models, especially at low temperature, where the number of reactions quickly exceeds a thousand. The main purpose of this study is the generation and validation of detailed kinetic models for the oxidation of gasoline blends using the EXGAS software. This work has implied an improvement of the computation rules for thermodynamic and kinetic data, which were validated by numerical simulation using the CHEMKIN II software. A large part of this work has concerned the understanding of the low-temperature oxidation chemistry of the C5 and larger alkenes. Low- and high-temperature mechanisms were proposed and validated for 1-pentene, 1-hexene, the binary mixtures 1-hexene/iso-octane, 1-hexene/toluene and iso-octane/toluene, and the ternary mixture 1-hexene/toluene/iso-octane. Simulations were also done for propene, 1-butene and iso-octane with former models including the modifications proposed in this PhD work. While the generated models allowed us to simulate the auto-ignition delays of the studied molecules and blends with good agreement, some uncertainties still remain for some reaction paths leading to the formation of cyclic products in the case of alkene oxidation at low temperature. It would also be interesting to carry on this work toward combustion models for gasoline blends at low temperature. (author)

  4. Validation of CRIB II for prediction of mortality in premature babies.

    Science.gov (United States)

    Rastogi, Pallav Kumar; Sreenivas, V; Kumar, Nirmal

    2010-02-01

    Validation of the Clinical Risk Index for Babies (CRIB II) score in predicting neonatal mortality in preterm neonates ≤32 weeks gestational age. Prospective cohort study. Tertiary care neonatal unit. 86 consecutively born preterm neonates with gestational age ≤32 weeks. The five variables related to CRIB II were recorded within the first hour of admission for data analysis. The receiver operating characteristic (ROC) curve was used to check the accuracy of the mortality prediction. The Hosmer-Lemeshow goodness-of-fit test was used to assess the discrepancy between observed and expected outcomes. A total of 86 neonates (males: 59.6%; mean birthweight: 1228 ± 398 g; mean gestational age: 28.3 ± 2.4 weeks) were enrolled in the study, of which 17 (19.8%) left hospital against medical advice (LAMA) before reaching the study end point. Among the 69 neonates completing the study, 24 (34.8%) had an adverse outcome during the hospital stay and 45 (65.2%) had a favorable outcome. CRIB II correctly predicted adverse outcome in 90.3% (Hosmer-Lemeshow goodness-of-fit test, P=0.6). The area under the curve (AUC) for CRIB II was 0.9032. In an intention-to-treat analysis with LAMA cases included as survivors, the mortality prediction was 87%; if these cases were included as having died, the mortality prediction was 83.1%. The CRIB II score was found to be a good predictive instrument for mortality in preterm infants ≤32 weeks gestation.

  5. Overviews of EMRAS I and II

    International Nuclear Information System (INIS)

    Kawaguchi, Isao

    2011-01-01

    Recently, there has been growing interest in the impacts of irradiation on wildlife organisms. At the IAEA, a biota dosimetry working group (BWG) was established in the Environmental Modeling for Radiation Safety (EMRAS) program, which aimed to intercompare biota dose assessment models in order to validate their assumptions and estimations. After the EMRAS program finished, a new program, referred to as EMRAS II, was started in January 2009. In EMRAS II there are three themes, under which 9 working groups were established. One of the three themes is related to environmental protection, and three working groups (biota modeling, wildlife transfer coefficient handbook, biota dose effects modeling) were constituted within it. In this report, the activities of the EMRAS I BWG and EMRAS II Theme II are summarized. (author)

  6. Validation of ASTEC v2.0 corium jet fragmentation model using FARO experiments

    International Nuclear Information System (INIS)

    Hermsmeyer, S.; Pla, P.; Sangiorgi, M.

    2015-01-01

    Highlights: • Model validation base extended to six FARO experiments. • Focus on the calculation of the fragmented particle diameter. • Capability and limits of the ASTEC fragmentation model. • Sensitivity analysis of model outputs. - Abstract: ASTEC is an integral code for the prediction of severe accidents in nuclear power plants. As such, it needs to cover all physical processes that could occur during accident progression, yet keep its models simple enough for the ensemble to stay manageable and produce results within an acceptable time. The present paper is concerned with the validation of the corium jet fragmentation model of ASTEC v2.0 rev3 by means of a selection of six experiments carried out in the FARO facility. The different conditions applied in these six experiments help to analyse the model behaviour in different situations and to expose model limits. In addition to comparing model outputs with experimental measurements, sensitivity analyses are applied to investigate the model. The results of the paper are (i) validation runs, accompanied by an identification of situations where the implemented fragmentation model does not match the experiments well, and a discussion of the results; (ii) particular attention to the models calculating the diameter of fragmented particles, the identification of a fault in one implemented model, and a discussion of simplifications and ad hoc modifications to improve the model fit; and (iii) an investigation of the sensitivity of predictions towards inputs and parameters. In this way, the paper offers a thorough investigation of the merits and limitations of the fragmentation model used in ASTEC

  7. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  8. Assessment of ASTEC-CPA pool scrubbing models against POSEIDON-II and SGTR-ARTIST data

    International Nuclear Information System (INIS)

    Herranz, Luis E.; Fontanet, Joan

    2009-01-01

    Aerosol scrubbing in pools mitigates the potential source term in key severe accident scenarios in PWRs and BWRs. Even though models were extensively validated in the past, a thorough and systematic validation under key challenging conditions is missing. Some of those conditions are high injection velocity, high pool temperature and/or presence of submerged structures. In particular, in-code models have been neither updated nor validated based on the most recent experimental data. The POSEIDON-II and the SGTR-ARTIST projects produced sets of data under conditions of utmost interest for pool scrubbing validation: high temperature and submerged structures. This paper investigates the response of models encapsulated in the CPA module of the ASTEC code in the simulation of those experimental set-ups. The influence of key pool scrubbing variables like steam fraction, water depth, gas flow-rate and particle size has been analyzed. Additionally, comparisons to stand-alone code (i.e., SPARC90) responses have also been obtained, so that prediction-to-data deviations can be discussed and attributed to either model grounds and/or model implementation in integral accident codes. This work has demonstrated that ASTEC-CPA limitations to capture fundamental trends of aerosol pool scrubbing are substantial (although the SGTR scenarios should not be properly considered within the CPA scope) and they stem from both original models (i.e., SPARC90) and model implementation. This work has been carried out within the European SARNET project of the VI Framework Program of EURATOM. (author)

  9. A broad view of model validation

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1989-10-01

    The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as far as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing the modeling process step by step and bringing out the need to validate every step of this process. This model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models, as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature for public scrutiny is also emphasized. 16 refs

  10. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for the assessment of the performance of a given model by using an example taken from a management study.
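
    Typical quantitative performance measures for a logistic regression model, discrimination (AUC) and accuracy of the predicted probabilities (Brier score), evaluated on held-out data, can be illustrated as follows. The dataset here is synthetic, not the management study's:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score, brier_score_loss
    from sklearn.model_selection import train_test_split

    # Synthetic binary-outcome dataset: fit on one split, validate on the other.
    X, y = make_classification(n_samples=600, n_features=8, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    p_te = model.predict_proba(X_te)[:, 1]

    # Two common quantitative performance measures on the validation split.
    print(f"validation AUC   = {roc_auc_score(y_te, p_te):.3f}")
    print(f"validation Brier = {brier_score_loss(y_te, p_te):.3f}")
    ```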

  11. Establishing model credibility involves more than validation

    International Nuclear Information System (INIS)

    Kirchner, T.

    1991-01-01

    One widely used definition of validation is the quantitative test of the performance of a model through the comparison of model predictions to independent sets of observations from the system being simulated. The ability to show that the model predictions compare well with observations is often thought to be the most rigorous test that can be used to establish credibility for a model in the scientific community. However, such tests are only part of the process used to establish credibility, and in some cases may be either unnecessary or misleading. Naylor and Finger extended the concept of validation to include the establishment of validity for the postulates embodied in the model and the testing of assumptions used to select postulates for the model. Validity of postulates is established through concurrence by experts in the field of study that the mathematical or conceptual model contains the structural components and mathematical relationships necessary to adequately represent the system with respect to the goals for the model. This extended definition of validation provides for consideration of the structure of the model, not just its performance, in establishing credibility. Evaluation of a simulation model should establish the correctness of the code and the efficacy of the model within its domain of applicability. (24 refs., 6 figs.)

  12. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  13. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
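
    A compact sketch of the validation recipe recommended here, a penalized (LASSO-type) logistic model whose cross-validated AUC is compared against a permutation-based null distribution, is shown below in Python with synthetic data. The paper additionally repeats a double cross-validation loop, which is omitted for brevity:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict

    # L1-penalized (LASSO-type) logistic model, as in the NTCP setting.
    X, y = make_classification(n_samples=300, n_features=40, n_informative=5,
                               random_state=0)
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)

    def cv_auc(X, y):
        """Cross-validated AUC from out-of-fold predicted probabilities."""
        p = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
        return roc_auc_score(y, p)

    observed_auc = cv_auc(X, y)

    # Permutation test: refit on shuffled outcomes to build the null distribution.
    rng = np.random.default_rng(0)
    null_aucs = [cv_auc(X, rng.permutation(y)) for _ in range(200)]
    p_value = np.mean([a >= observed_auc for a in null_aucs])
    print(f"AUC = {observed_auc:.3f}, permutation p = {p_value:.3f}")
    ```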

  14. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van 't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.

  15. Validation of an intermediate heat exchanger model for real time analysis

    International Nuclear Information System (INIS)

    Tzanos, C.P.

    1986-11-01

    A new method was presented for LMFBR intermediate heat exchanger (IHX) analysis in real time for purposes of continuous on-line data validation, plant state verification and fault identification. For the validation of this methodology, the EBR-II IHX transient during Test 8A was analyzed. This paper presents the results of this analysis.

  16. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always related to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology will guide us to detect the failures of the simulation model; furthermore, it can be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: (1) sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) or MCSA (Monte-Carlo sensitivity analysis); (2) finding the optimal domains of the input parameters, for which a procedure based on Monte-Carlo methods and cluster techniques has been developed; and (3) residual analysis, performed in the time domain and in the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings in Spain is presented, studying the behavior of building components in a test cell of LECE at CIEMAT. (Author) 17 refs

  17. Solar photocatalytic removal of Cu(II), Ni(II), Zn(II) and Pb(II): Speciation modeling of metal-citric acid complexes

    International Nuclear Information System (INIS)

    Kabra, Kavita; Chaudhary, Rubina; Sawhney, R.L.

    2008-01-01

    The present study is targeted at the solar photocatalytic removal of metal ions from wastewater. Photoreductive deposition and dark adsorption of the metal ions Cu(II), Ni(II), Pb(II) and Zn(II), using solar-energy-irradiated TiO2, have been investigated. Citric acid has been used as a hole scavenger. Modeling of metal species has been performed, and speciation is used as a tool for discussing the photodeposition trends. Ninety-seven percent reductive deposition was obtained for copper. The deposition values of the other metals were significantly lower [nickel (36.4%), zinc (22.2%) and lead (41.4%)], indicating that the photocatalytic treatment process using solar energy was more suitable for wastewater containing Cu(II) ions. In the absence of citric acid, the decreasing order of deposition was Cu(II) > Ni(II) > Pb(II) > Zn(II), which confirms the theoretical thermodynamic predictions for these metals.

  18. Validation of the LOD score compared with APACHE II score in prediction of the hospital outcome in critically ill patients.

    Science.gov (United States)

    Khwannimit, Bodin

    2008-01-01

    The Logistic Organ Dysfunction score (LOD) is an organ dysfunction score that can predict hospital mortality. The aim of this study was to validate the performance of the LOD score compared with the Acute Physiology and Chronic Health Evaluation II (APACHE II) score in a mixed intensive care unit (ICU) at a tertiary referral university hospital in Thailand. The data were collected prospectively on consecutive ICU admissions over a 24-month period from July 1, 2004 until June 30, 2006. Discrimination was evaluated by the area under the receiver operating characteristic curve (AUROC). Calibration was assessed by the Hosmer-Lemeshow goodness-of-fit H statistic. The overall fit of the model was evaluated by the Brier score. Overall, 1,429 patients were enrolled during the study period. Mortality was 20.9% in the ICU and 27.9% in the hospital. The median ICU and hospital lengths of stay were 3 and 18 days, respectively, for all patients. Both models showed excellent discrimination. The AUROC for the LOD and APACHE II were 0.860 [95% confidence interval (CI) = 0.838-0.882] and 0.898 (95% CI = 0.879-0.917), respectively. The LOD score had perfect calibration, with a Hosmer-Lemeshow goodness-of-fit H chi-squared of 10 (p = 0.44). However, APACHE II had poor calibration, with a Hosmer-Lemeshow goodness-of-fit H chi-squared of 75.69 (p < 0.001). The Brier score for the overall fit of the two models was 0.123 (95% CI = 0.107-0.141) for the LOD and 0.114 (95% CI = 0.098-0.132) for APACHE II. Thus, the LOD score was found to be accurate for predicting hospital mortality in general critically ill patients in Thailand.
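
    The calibration and overall-fit statistics used in this comparison can be reproduced with a few lines of Python. The Hosmer-Lemeshow implementation below uses the standard decile-of-risk form and synthetic predictions; it is a sketch of the technique, not the study's code:

    ```python
    import numpy as np
    from scipy import stats

    def hosmer_lemeshow(y_true, y_prob, n_groups=10):
        """Hosmer-Lemeshow goodness-of-fit chi-squared over risk deciles."""
        order = np.argsort(y_prob)
        chi2 = 0.0
        for g in np.array_split(order, n_groups):
            obs, exp, n = y_true[g].sum(), y_prob[g].sum(), len(g)
            chi2 += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-12)
        return chi2, stats.chi2.sf(chi2, n_groups - 2)

    def brier(y_true, y_prob):
        """Brier score: mean squared error of predicted probabilities."""
        return np.mean((y_prob - y_true) ** 2)

    rng = np.random.default_rng(0)
    y_prob = rng.uniform(0.02, 0.9, 1000)     # hypothetical predicted risks
    y_true = rng.binomial(1, y_prob)          # outcomes consistent with them
    chi2, p = hosmer_lemeshow(y_true, y_prob)
    print(f"H-L chi2 = {chi2:.1f} (p = {p:.2f}), Brier = {brier(y_true, y_prob):.3f}")
    ```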

  19. External validation of the Intensive Care National Audit & Research Centre (ICNARC) risk prediction model in critical care units in Scotland.

    Science.gov (United States)

    Harrison, David A; Lone, Nazir I; Haddow, Catriona; MacGillivray, Moranne; Khan, Angela; Cook, Brian; Rowan, Kathryn M

    2014-01-01

    Risk prediction models are used in critical care for risk stratification, summarising and communicating risk, supporting clinical decision-making and benchmarking performance. However, they require validation before they can be used with confidence, ideally using independently collected data from a different source to that used to develop the model. The aim of this study was to validate the Intensive Care National Audit & Research Centre (ICNARC) model using independently collected data from critical care units in Scotland. Data were extracted from the Scottish Intensive Care Society Audit Group (SICSAG) database for the years 2007 to 2009. Recoding and mapping of variables was performed, as required, to apply the ICNARC model (2009 recalibration) to the SICSAG data using standard computer algorithms. The performance of the ICNARC model was assessed for discrimination, calibration and overall fit and compared with that of the Acute Physiology And Chronic Health Evaluation (APACHE) II model. There were 29,626 admissions to 24 adult, general critical care units in Scotland between 1 January 2007 and 31 December 2009. After exclusions, 23,269 admissions were included in the analysis. The ICNARC model outperformed APACHE II on measures of discrimination (c index 0.848 versus 0.806), calibration (Hosmer-Lemeshow chi-squared statistic 18.8 versus 214) and overall fit (Brier's score 0.140 versus 0.157; Shapiro's R 0.652 versus 0.621). Model performance was consistent across the three years studied. The ICNARC model performed well when validated in an external population to that in which it was developed, using independently collected data.

  20. A proposed best practice model validation framework for banks

    Directory of Open Access Journals (Sweden)

    Pieter J. (Riaan) de Jongh

    2017-06-01

    Background: With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. The credit crisis of 2008–2009 provoked added concern about the use of models in finance. Measuring and managing model risk has subsequently come under scrutiny from regulators, supervisors, banks and other financial institutions. Regulatory guidance indicates that meticulous monitoring of all phases of model development and implementation is required to mitigate this risk. Considerable resources must be mobilised for this purpose. The exercise must embrace model development, assembly, implementation, validation and effective governance. Setting: Model validation practices are generally patchy, disparate and sometimes contradictory, and although the Basel Accord and some regulatory authorities have attempted to establish guiding principles, no definite set of global standards exists. Aim: To assess the available literature for the best validation practices. Methods: This comprehensive literature study provided a background to the complexities of effective model management and focussed on model validation as a component of model risk management. Results: We propose a coherent ‘best practice’ framework for model validation. Scorecard tools are also presented to evaluate whether the proposed best practice model validation framework has been adequately assembled and implemented. Conclusion: The proposed best practice model validation framework is designed to assist firms in the construction of an effective, robust and fully compliant model validation programme and comprises three principal elements: model validation governance, policy and process.

  1. Validity and reproducibility of HOMA-IR, 1/HOMA-IR, QUICKI and McAuley's indices in patients with hypertension and type II diabetes.

    Science.gov (United States)

    Sarafidis, P A; Lasaridis, A N; Nilsson, P M; Pikilidou, M I; Stafilas, P C; Kanaki, A; Kazakos, K; Yovos, J; Bakris, G L

    2007-09-01

    The aim of this study was to evaluate the validity and reliability of the homeostasis model assessment-insulin resistance (HOMA-IR) index, its reciprocal (1/HOMA-IR), the quantitative insulin sensitivity check index (QUICKI) and McAuley's index in hypertensive diabetic patients. In 78 patients with hypertension and type II diabetes, glucose, insulin and triglyceride levels were determined after a 12-h fast to calculate these indices, and insulin sensitivity (IS) was measured with the hyperinsulinemic euglycemic clamp technique. Two weeks later, subjects again had their glucose, insulin and triglycerides measured. Simple and multiple linear regression analyses were applied to assess the validity of these indices compared to clamp IS, and coefficients of variation between the two visits were estimated to assess their reproducibility. The HOMA-IR index was strongly and inversely correlated with the basic IS clamp index, the M-value (r = -0.572), whereas the 1/HOMA-IR and QUICKI indices were positively correlated with the M-value (r = 0.342); among the indices examined, 1/HOMA-IR was the best fit of clamp-derived IS. Coefficients of variation between the two visits were 23.5% for HOMA-IR, 19.2% for 1/HOMA-IR, 7.8% for QUICKI and 15.1% for McAuley's index. In conclusion, HOMA-IR, 1/HOMA-IR and QUICKI are valid estimates of clamp-derived IS in patients with hypertension and type II diabetes, whereas the validity of McAuley's index needs further evaluation. QUICKI displayed better reproducibility than the other indices.
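
    The fasting surrogate indices compared above follow standard published formulas; the Python sketch below implements them, with the example patient values chosen purely for illustration (units as noted in the comments):

        import math

        def homa_ir(glucose_mmol_l: float, insulin_uu_ml: float) -> float:
            # HOMA-IR = glucose [mmol/L] x insulin [uU/mL] / 22.5
            return glucose_mmol_l * insulin_uu_ml / 22.5

        def quicki(glucose_mg_dl: float, insulin_uu_ml: float) -> float:
            # QUICKI = 1 / (log10 insulin [uU/mL] + log10 glucose [mg/dL])
            return 1.0 / (math.log10(insulin_uu_ml) + math.log10(glucose_mg_dl))

        def mcauley(insulin_uu_ml: float, triglycerides_mmol_l: float) -> float:
            # McAuley's index = exp(2.63 - 0.28 ln insulin - 0.31 ln triglycerides)
            return math.exp(2.63 - 0.28 * math.log(insulin_uu_ml)
                            - 0.31 * math.log(triglycerides_mmol_l))

        # Illustrative patient: glucose 7.0 mmol/L (126 mg/dL), insulin 12 uU/mL, TG 1.8 mmol/L
        h = homa_ir(7.0, 12.0)
        print(f"HOMA-IR={h:.2f}  1/HOMA-IR={1.0 / h:.3f}  "
              f"QUICKI={quicki(126.0, 12.0):.3f}  McAuley={mcauley(12.0, 1.8):.2f}")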

  2. On the validity of travel-time based nonlinear bioreactive transport models in steady-state flow.

    Science.gov (United States)

    Sanz-Prat, Alicia; Lu, Chuanhe; Finkel, Michael; Cirpka, Olaf A

    2015-01-01

    Travel-time based models simplify the description of reactive transport by replacing the spatial coordinates with the groundwater travel time, posing a quasi one-dimensional (1-D) problem and potentially rendering the determination of multidimensional parameter fields unnecessary. While the approach is exact for strictly advective transport in steady-state flow if the reactive properties of the porous medium are uniform, its validity is unclear when local-scale mixing affects the reactive behavior. We compare a two-dimensional (2-D), spatially explicit, bioreactive, advective-dispersive transport model, considered as "virtual truth", with three 1-D travel-time based models which differ in the conceptualization of longitudinal dispersion: (i) neglecting dispersive mixing altogether, (ii) introducing a local-scale longitudinal dispersivity constant in time and space, and (iii) using an effective longitudinal dispersivity that increases linearly with distance. The reactive system considers biodegradation of dissolved organic carbon, which is introduced into a hydraulically heterogeneous domain together with oxygen and nitrate. Aerobic and denitrifying bacteria use the energy of the microbial transformations for growth. We analyze six scenarios differing in the variance of log-hydraulic conductivity and in the inflow boundary conditions (constant versus time-varying concentration). The concentrations of the 1-D models are mapped to the 2-D domain by means of the kinematic travel time (for case i) and the mean groundwater age (for cases ii and iii), respectively. The comparison between the concentrations of the "virtual truth" and the 1-D approaches indicates extremely good agreement when using an effective, linearly increasing longitudinal dispersivity in the majority of the scenarios, while the other two 1-D approaches at least reproduce the concentration tendencies well. At late times, all 1-D models give valid approximations of two-dimensional transport. We conclude that the travel-time based approach, in particular with an effective, linearly increasing longitudinal dispersivity, is a valid simplification for nonlinear bioreactive transport in steady-state flow.

  3. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation-of-uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burgers' equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation-of-uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented, including error-band based, multivariate, sum-of-squares-of-residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made.
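
    As a sketch of the propagation-of-uncertainty part, the Python fragment below pushes assumed parameter distributions through a damped spring-mass model by Monte Carlo sampling; the distributions, response quantity and numerical values are our own illustrative stand-ins, not the tutorial's:

        import numpy as np

        def peak_displacement(m, c, k, x0=0.01, t_end=2.0, n=2000):
            """Integrate the free response x'' + (c/m) x' + (k/m) x = 0 and return max |x|."""
            dt = t_end / n
            x, v = x0, 0.0
            peak = abs(x0)
            for _ in range(n):                  # semi-implicit Euler time stepping
                a = -(c / m) * v - (k / m) * x
                v += a * dt
                x += v * dt
                peak = max(peak, abs(x))
            return peak

        rng = np.random.default_rng(1)
        samples = [peak_displacement(m=rng.normal(1.0, 0.05),    # mass [kg]
                                     c=rng.normal(0.4, 0.04),    # damping [N s/m]
                                     k=rng.normal(100.0, 5.0))   # stiffness [N/m]
                   for _ in range(1000)]
        print(f"peak displacement: mean={np.mean(samples):.4f} m, std={np.std(samples):.4f} m")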

  4. Model Validation Using Coordinate Distance with Performance Sensitivity

    Directory of Open Access Journals (Sweden)

    Jiann-Shiun Lew

    2008-01-01

    This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector, corresponding to the identified principal directions, of the validation system is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.
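
    A minimal Python sketch of this kind of metric on synthetic data: SVD extracts the principal directions of parameter change from a set of identified parameter vectors, an interval is formed from the training coordinates, and a validation system is scored by its coordinate distance to that interval. All numbers and dimensions are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(2)
        theta = rng.normal(size=(20, 6))                 # 20 identified parameter sets
        mean = theta.mean(axis=0)
        U, s, Vt = np.linalg.svd(theta - mean, full_matrices=False)
        V = Vt[:3]                                       # keep 3 principal directions

        coords = (theta - mean) @ V.T                    # training coordinates
        lo, hi = coords.min(axis=0), coords.max(axis=0)  # interval model bounds

        theta_val = rng.normal(size=6)                   # validation system parameters
        c = (theta_val - mean) @ V.T
        # distance is zero inside the interval, else the shortest way back to it
        distance = np.linalg.norm(np.maximum(0.0, np.maximum(lo - c, c - hi)))
        print(f"coordinate distance to interval model: {distance:.3f}")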

  5. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  6. Control of uncertain systems by feedback linearization with neural networks augmentation. Part II. Controller validation by numerical simulation

    Directory of Open Access Journals (Sweden)

    Adrian TOADER

    2010-09-01

    The paper was conceived in two parts. Part I, previously published in this journal, highlighted the main steps of adaptive output feedback control for non-affine uncertain systems having a known relative degree. The main paradigm of this approach was feedback linearization (dynamic inversion) with neural network augmentation. Meanwhile, based on new contributions of the authors, a new paradigm, that of the robust servomechanism problem solution, has been added to the controller architecture. The current Part II of the paper presents the validation of the controller hereby obtained, using the longitudinal channel of a hovering VTOL-type aircraft as the mathematical model.

  7. Validation of the Health-Promoting Lifestyle Profile II for Hispanic male truck drivers in the Southwest.

    Science.gov (United States)

    Mullins, Iris L; O'Day, Trish; Kan, Tsz Yin

    2013-08-01

    The aims of the study were to validate the English and Spanish versions of the Health-Promoting Lifestyle Profile II (HPLP II) with Hispanic male truck drivers and to determine whether there were any differences in drivers' responses based on driving responsibility. The methods included a descriptive correlational design, the HPLP II (English and Spanish versions), and a demographic questionnaire. Fifty-two Hispanic drivers participated in the study. There were no significant differences between long-haul and short-haul drivers' responses to the HPLP II. Cronbach's alpha for the Spanish version was .97, with subscale alphas ranging from .74 to .94; the English version alpha was .92, with subscale alphas ranging from .68 to .84. Findings suggest the Health Responsibility, Physical Activities, Nutrition, and Spirituality Growth subscales of the HPLP II Spanish and English versions may not adequately assess health-promoting behaviors and cultural influences for the Hispanic male population in the southwestern border region.

  8. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on SDG (Signed Directed Graph) models and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.
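
    As a toy illustration of positive inference on a signed directed graph, the Python sketch below propagates a qualitative trend (+1/-1) from one deviated variable along signed arcs to enumerate a testing scenario; the graph is invented for illustration and is not the reactor model from the paper:

        from collections import deque

        arcs = {                        # source -> [(target, sign of influence)]
            "feed_flow": [("level", +1)],
            "level": [("pressure", +1)],
            "cooling": [("temperature", -1)],
            "temperature": [("pressure", +1)],
        }

        def propagate(fault: str, trend: int) -> dict:
            """Breadth-first positive inference: assign each reachable variable a trend."""
            state = {fault: trend}
            queue = deque([fault])
            while queue:
                node = queue.popleft()
                for target, sign in arcs.get(node, []):
                    if target not in state:        # first consistent assignment wins
                        state[target] = state[node] * sign
                        queue.append(target)
            return state

        # Scenario: feed flow increases -> expected trends of the downstream variables
        print(propagate("feed_flow", +1))   # {'feed_flow': 1, 'level': 1, 'pressure': 1}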

  9. Validation and application of a physics database for fast reactor fuel cycle analysis

    International Nuclear Information System (INIS)

    McKnight, R.D.; Stillman, J.A.; Toppel, B.J.; Khalil, H.S.

    1994-01-01

    An effort has been made to automate the execution of fast reactor fuel cycle analysis, using EBR-II as a demonstration vehicle, and to validate the analysis results for application to the IFR closed fuel cycle demonstration at EBR-II and its fuel cycle facility. This effort has included: (1) the application of the standard ANL depletion codes to perform core-follow analyses for an extensive series of EBR-II runs, (2) incorporation of the EBR-II data into a physics database, (3) development and verification of software to update, maintain and verify the database files, (4) development and validation of fuel cycle models and methodology, (5) development and verification of software which utilizes this physics database to automate the application of the ANL depletion codes, methods and models to perform the core-follow analysis, and (6) validation studies of the ANL depletion codes and of their application in support of anticipated near-term operations in EBR-II and the Fuel Cycle Facility. Results of the validation tests indicate the physics database and associated analysis codes and procedures are adequate to predict required quantities in support of early phases of FCF operations

  10. Validation of CATHARE for gas-cooled reactors

    International Nuclear Information System (INIS)

    Fabrice Bentivoglio; Ola Widlund; Manuel Saez

    2005-01-01

    Extensively validated and qualified for light-water reactor safety studies, the thermal-hydraulics code CATHARE has been adapted to deal with gas-cooled reactor applications as well. In order to validate the code for these novel applications, CEA (Commissariat a l'Energie Atomique) has initiated an ambitious long-term experimental program. The foreseen experimental facilities range from small-scale loops for physical correlations to component technology and system demonstration loops. In the short-term perspective, CATHARE is being validated against existing experimental data, in particular from the German power plant Oberhausen II and the South African Pebble Bed Micro Model (PBMM). Oberhausen II, operated by the German utility EVO, is a 50 MW(e) direct-cycle helium turbine plant. The power source is a gas burner rather than a nuclear reactor core, but the power conversion system resembles those of the GFR (Gas-cooled Fast Reactor) and other high-temperature reactor concepts. Oberhausen II was operated for more than 100,000 hours between 1974 and 1988. Design specifications, drawings and experimental data have been obtained through the European HTR project, offering a unique opportunity to validate CATHARE on a large-scale Brayton cycle. Available measurements of temperatures, pressures and mass flows throughout the circuit have allowed a very comprehensive thermohydraulic description of the plant, in steady-state conditions as well as during transients. The Pebble Bed Micro Model (PBMM) is a small-scale model conceived to demonstrate the operability and control strategies of the South African PBMR concept. The model uses nitrogen instead of helium, and an electrical heater with a maximum rating of 420 kW. As in the full-scale PBMR, the PBMM loop features three turbines and two compressors on the primary circuit, located on three separate shafts. The generator, however, is modelled by a third compressor on a separate circuit, with a

  11. Validating EHR clinical models using ontology patterns.

    Science.gov (United States)

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation.
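
    This kind of shape-based validation can be reproduced in miniature with rdflib and the pySHACL package (assumed installed); the shape and instance data below are invented stand-ins for a CIMI pattern, not actual CIMI content:

        from rdflib import Graph
        from pyshacl import validate

        # A shape requiring every Observation to carry exactly one code
        shapes = Graph().parse(data="""
            @prefix sh: <http://www.w3.org/ns/shacl#> .
            @prefix ex: <http://example.org/> .
            ex:ObservationShape a sh:NodeShape ;
                sh:targetClass ex:Observation ;
                sh:property [ sh:path ex:hasCode ; sh:minCount 1 ; sh:maxCount 1 ] .
        """, format="turtle")

        data = Graph().parse(data="""
            @prefix ex: <http://example.org/> .
            ex:obs1 a ex:Observation .    # missing the required ex:hasCode -> violation
        """, format="turtle")

        conforms, _, report = validate(data, shacl_graph=shapes)
        print(conforms)   # False: the closed-world cardinality check fails
        print(report)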

  12. Tracer travel time and model validation

    International Nuclear Information System (INIS)

    Tsang, Chin-Fu.

    1988-01-01

    The performance assessment of a nuclear waste repository demands much more than the safety evaluation of civil constructions such as dams, or the resource evaluation of a petroleum or geothermal reservoir. It involves the estimation of low-probability (low-concentration) radionuclide transport extrapolated thousands of years into the future. Thus the models used to make these estimates need to be carefully validated. A number of recent efforts have been devoted to the study of this problem. Some general comments on model validation were given by Tsang. The present paper discusses some issues of validation with regard to radionuclide transport. 5 refs

  13. Tank waste source term inventory validation. Volume II. Letter report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    This document comprises Volume II of the Letter Report entitled Tank Waste Source Term Inventory Validation. This volume contains Appendix C, Radionuclide Tables, and Appendix D, Chemical Analyte Tables. The sample data for a selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, arranged in a tabular format, and plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories.

  14. Tank waste source term inventory validation. Volume II. Letter report

    International Nuclear Information System (INIS)

    1995-04-01

    This document comprises Volume II of the Letter Report entitled Tank Waste Source Term Inventory Validation. This volume contains Appendix C, Radionuclide Tables, and Appendix D, Chemical Analyte Tables. The sample data for a selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, arranged in a tabular format, and plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories

  15. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

    The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines, including the social, political, natural and physical sciences, and engineering. There is, however, a wide range of definitions, which give rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework to assess the adequacy of any simulation study. (author)

  16. Regulatory activity based risk model identifies survival of stage II and III colorectal carcinoma.

    Science.gov (United States)

    Liu, Gang; Dong, Chuanpeng; Wang, Xing; Hou, Guojun; Zheng, Yu; Xu, Huilin; Zhan, Xiaohui; Liu, Lei

    2017-11-17

    Clinical and pathological indicators are inadequate for prognosis of stage II and III colorectal carcinoma (CRC). In this study, we utilized the activity of regulatory factors, univariate Cox regression and random forests for variable selection, and developed a multivariate Cox model to predict the overall survival of stage II/III colorectal carcinoma in the GSE39582 dataset (469 samples). Patients in the low-risk group showed significantly longer overall survival and recurrence-free survival times than those in the high-risk group. This finding was further validated in five other independent datasets (GSE14333, GSE17536, GSE17537, GSE33113, and GSE37892). In addition, associations between clinicopathological information and the risk score were analyzed. A nomogram including the risk score was plotted to facilitate its utilization. The risk score model is also demonstrated to be effective in predicting both overall and recurrence-free survival of patients who received chemotherapy. Gene Set Enrichment Analysis (GSEA) between the high- and low-risk groups identified several cell-cell interaction KEGG pathways. Funnel plot results showed that there was no publication bias in these datasets. In summary, by utilizing the regulatory activity in stage II and III colorectal carcinoma, the risk score successfully predicts the survival of 1021 stage II/III CRC patients in six independent datasets.
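
    A minimal sketch of this kind of risk-score construction on synthetic data, using the lifelines package (an assumption; the paper does not name its software): a multivariate Cox model is fit on stand-in regulatory-activity features and the resulting hazard is dichotomised at its median into high- and low-risk groups:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(3)
        n = 200
        df = pd.DataFrame({
            "tf_activity_1": rng.normal(size=n),   # stand-in regulatory activities
            "tf_activity_2": rng.normal(size=n),
        })
        hazard = np.exp(0.8 * df["tf_activity_1"] - 0.5 * df["tf_activity_2"])
        df["time"] = rng.exponential(1.0 / hazard)       # synthetic survival times
        df["event"] = rng.binomial(1, 0.7, size=n)       # 1 = event observed

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")

        risk_score = cph.predict_partial_hazard(df)      # exp(linear predictor)
        group = np.where(risk_score > risk_score.median(), "high", "low")
        print(pd.Series(group).value_counts())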

  17. Better prognostic marker in ICU - APACHE II, SOFA or SAP II!

    Science.gov (United States)

    Naqvi, Iftikhar Haider; Mahmood, Khalid; Ziaullaha, Syed; Kashif, Syed Mohammad; Sharif, Asim

    2016-01-01

    This study was designed to determine the comparative efficacy of different scoring systems in assessing the prognosis of critically ill patients. This was a retrospective study conducted in the medical intensive care unit (MICU) and high dependency unit (HDU), Medical Unit III, Civil Hospital, from April 2012 to August 2012. All patients over 16 years of age who fulfilled the criteria for MICU admission were included. The predicted mortalities of APACHE II, SAPS II and SOFA were calculated. Calibration and discrimination were used to assess the validity of each scoring model. A total of 96 patients with equal gender distribution were enrolled. The average APACHE II score in non-survivors (27.97 ± 8.53) was higher than in survivors (15.82 ± 8.79), with a statistically significant p value; APACHE II also showed greater discrimination power than SAPS II and SOFA.

  18. Higgs potential in the type II seesaw model

    International Nuclear Information System (INIS)

    Arhrib, A.; Benbrik, R.; Chabab, M.; Rahili, L.; Ramadan, J.; Moultaka, G.; Peyranere, M. C.

    2011-01-01

    The standard model Higgs sector, extended by one weak gauge triplet of scalar fields with a very small vacuum expectation value, is a very promising setting to account for neutrino masses through the so-called type II seesaw mechanism. In this paper we consider the general renormalizable doublet/triplet Higgs potential of this model. We perform a detailed study of its main dynamical features, which depend on five dimensionless couplings and two mass parameters after spontaneous symmetry breaking, and highlight the implications for the Higgs phenomenology. In particular, we determine (i) the complete set of tree-level unitarity constraints on the couplings of the potential and (ii) the exact tree-level boundedness-from-below constraints on these couplings, valid for all directions. When combined, these constraints delineate precisely the theoretically allowed parameter space domain within our perturbative approximation. Among the seven physical Higgs states of this model, the mass of the lighter (heavier) CP-even state h0 (H0) will always satisfy a theoretical upper (lower) bound that is reached for a critical value μ_c of μ (the mass parameter controlling triple couplings among the doublet/triplet Higgses). Saturating the unitarity bounds, we find an upper bound on the mass of h0, and two regimes can be distinguished according to whether μ lies above or below μ_c. In the first regime the Higgs sector is typically very heavy, and only h0, which becomes SM-like, could be accessible to the LHC. In contrast, in the second regime, somewhat overlooked in the literature, most of the Higgs sector is light. In particular, the heaviest state H0 becomes SM-like, the lighter states being the CP-odd Higgs, the (doubly) charged Higgses, and a decoupled h0, possibly leading to a distinctive phenomenology at the colliders.

  19. Characterization of the TRIGA Mark II reactor full-power steady state

    Energy Technology Data Exchange (ETDEWEB)

    Cammi, Antonio, E-mail: antonio.cammi@polimi.it [Politecnico di Milano – Department of Energy, CeSNEF (Enrico Fermi Center for Nuclear Studies), via La Masa 34, 20156 Milano (Italy); Zanetti, Matteo [Politecnico di Milano – Department of Energy, CeSNEF (Enrico Fermi Center for Nuclear Studies), via La Masa 34, 20156 Milano (Italy); Chiesa, Davide; Clemenza, Massimiliano; Pozzi, Stefano; Previtali, Ezio; Sisti, Monica [University of Milano-Bicocca, Physics Department “G. Occhialini” and INFN Section, Piazza dell’Ateneo Nuovo, 20126 Milan (Italy); Magrotti, Giovanni; Prata, Michele; Salvini, Andrea [University of Pavia, Applied Nuclear Energy Laboratory (L.E.N.A.), Via Gaspare Aselli 41, 27100 Pavia (Italy)

    2016-04-15

    Highlights: • Full-power steady state characterization of the TRIGA Mark II reactor. • Monte Carlo and Multiphysics simulation of the TRIGA Mark II reactor. • Sub-cooled boiling effects in the TRIGA Mark II reactor. • Thermal feedback effects in the TRIGA Mark II reactor. • Experimental data based validation. - Abstract: In this paper, the characterization of the full-power steady state of the TRIGA Mark II nuclear reactor at the University of Pavia is achieved by coupling the Monte Carlo (MC) simulation for neutronics with the “Multiphysics” model for thermal-hydraulics. Neutronic analyses have been carried out with a MCNP5 based MC model of the entire reactor system, already validated in fresh fuel and zero-power configurations (in which thermal effects are negligible) and using all available experimental data as a benchmark. In order to describe the full-power reactor configuration, the temperature distribution in the core must be established. To evaluate this, a thermal-hydraulic model has been developed, using the power distribution results from the MC simulation as input. The thermal-hydraulic model is focused on the core active region and takes into account sub-cooled boiling effects present at full reactor power. The obtained temperature distribution is then entered into the MC model and a benchmark analysis is carried out to validate the model in fresh fuel and full-power configurations. An acceptable correspondence between experimental data and simulation results concerning full-power reactor criticality proves the reliability of the adopted methodology of analysis, both from the perspective of neutronics and thermal-hydraulics.
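
    The coupling scheme described above can be illustrated with a toy fixed-point iteration; both "solvers" below are invented placeholders for the MCNP5 and Multiphysics models of the paper, with made-up coefficients:

        import numpy as np

        def neutronics(temperature):
            """Stand-in neutronics: power rises where the fuel is cooler (feedback)."""
            power = 1.0 / (1.0 + 0.002 * (temperature - 300.0))
            return power / power.sum() * 250e3      # normalise to 250 kW total

        def thermal_hydraulics(power):
            """Stand-in thermal-hydraulics: node temperature follows local power."""
            return 300.0 + 0.5e-3 * power

        temperature = np.full(20, 300.0)            # 20 core nodes, initial guess [K]
        for it in range(50):                        # Picard iteration to consistency
            power = neutronics(temperature)
            new_temperature = thermal_hydraulics(power)
            if np.max(np.abs(new_temperature - temperature)) < 1e-6:
                break
            temperature = new_temperature
        print(f"converged in {it + 1} iterations; peak node temperature "
              f"{temperature.max():.1f} K")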

  20. COMDES-II: A Component-Based Framework for Generative Development of Distributed Real-Time Control Systems

    DEFF Research Database (Denmark)

    Ke, Xu; Sierszecki, Krzysztof; Angelov, Christo K.

    2007-01-01

    The paper presents a generative development methodology and component models of COMDES-II, a component-based software framework for distributed embedded control systems with real-time constraints. The adopted methodology allows for rapid modeling and validation of control software at a higher level of abstraction. The paper outlines this methodology for COMDES-II from a general perspective, describes the component models in detail and demonstrates their application through a DC-Motor control system case study.

  1. Validation of models in an imaging infrared simulation

    CSIR Research Space (South Africa)

    Willers, C

    2007-10-01

    The presentation is structured around the classical modelling framework that relates the problem entity (reality), the conceptual model and the computerized model through three processes: model qualification, model verification and model validation. Cited references include J.C. Refsgaard, Modelling Guidelines - terminology and guiding principles, Advances in Water Resources, Vol. 27, No. 1, January 2004, pp. 71-82, Elsevier; and N. Oreskes et al., Verification, Validation, and Confirmation of Numerical Models in the Earth Sciences, Science, Vol. 263.

  2. Novel and validated titrimetric method for determination of selected angiotensin-II-receptor antagonists in pharmaceutical preparations and its comparison with UV spectrophotometric determination

    Directory of Open Access Journals (Sweden)

    Shrikant H. Patil

    2012-12-01

    A novel and simple titrimetric method for the determination of commonly used angiotensin-II-receptor antagonists (ARA-IIs) is developed and validated. The direct acid-base titration of four ARA-IIs, namely eprosartan mesylate, irbesartan, telmisartan and valsartan, was carried out in a mixture of ethanol:water (1:1) as solvent, using standardized aqueous sodium hydroxide solution as titrant, either visually using phenolphthalein as an indicator or potentiometrically using a combined pH electrode. The method was found to be accurate and precise, with a relative standard deviation of less than 2% for all ARA-IIs studied. It was also shown that the method could be successfully applied to the assay of commercial pharmaceuticals containing the above-mentioned ARA-IIs. The validity of the method was tested by recovery studies of standard additions to pharmaceuticals, and the results were found to be satisfactory. Results obtained by this method were in good agreement with those obtained by a UV spectrophotometric method, for which ethanol was used as the solvent and wavelengths of 233 nm, 246 nm, 296 nm and 250 nm were selected for the determination of eprosartan mesylate, irbesartan, telmisartan and valsartan, respectively. The proposed titrimetric method is simple, rapid, convenient and sufficiently precise for quality control purposes. Keywords: Angiotensin-II-receptor antagonists, Titrimetric assay, UV spectrophotometry, Validation

  3. System modeling and simulation at EBR-II

    International Nuclear Information System (INIS)

    Dean, E.M.; Lehto, W.K.; Larson, H.A.

    1986-01-01

    The codes being developed and verified using EBR-II data are the NATDEMO, DSNP and CSYRED. NATDEMO is a variation of the Westinghouse DEMO code coupled to the NATCON code previously used to simulate perturbations of reactor flow and inlet temperature and loss-of-flow transients leading to natural convection in EBR-II. CSYRED uses the Continuous System Modeling Program (CSMP) to simulate the EBR-II core, including power, temperature, control-rod movement reactivity effects and flow and is used primarily to model reactivity induced power transients. The Dynamic Simulator for Nuclear Power Plants (DSNP) allows a whole plant, thermal-hydraulic simulation using specific component and system models called from libraries. It has been used to simulate flow coastdown transients, reactivity insertion events and balance-of-plant perturbations

  4. Rapid Robot Design Validation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Energid Technologies will create a comprehensive software infrastructure for rapid validation of robot designs. The software will support push-button validation...

  5. Methods and Model Development for Coupled RELAP5/PARCS Analysis of the Atucha-II Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ward, A.M.; Collins, B.S.; Xu, Y.; Downar, Th.J.; Madariaga, M.

    2011-01-01

    In order to analyze the steady-state and transient behavior of CNA-II, several tasks were required, and methods and models were developed in several areas. HELIOS lattice models were developed and benchmarked against WIMS/MCNP5 results generated by NA-SA. Cross-sections for the coupled RELAP5/PARCS calculation were extracted from HELIOS within the GenPMAXS framework. The validation of both HELIOS and PARCS was performed primarily by comparisons to WIMS/PUMA and MCNP for idealized models. Special methods were developed to model the control rods and boron injection systems of CNA-II. The insertion of the rods is oblique, and a special routine was added to PARCS to treat this effect. CFD results combined with specialized mapping routines were used to model the boron injection system. In all cases there was good agreement in the results, which provided confidence in the neutronics methods and modeling. A coupled code benchmark between the University of Michigan and the University of Pisa is ongoing and results are still preliminary. Under a LOCA transient, the best-estimate behavior of the core appears to be acceptable.

  6. Validation of WIMS-AECL/(MULTICELL)/RFSP system by the results of phase-B test at Wolsung-II unit

    Energy Technology Data Exchange (ETDEWEB)

    Hong, In Seob; Min, Byung Joo; Suk, Ho Chun [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-03-01

    The objective of this study is the validation of the WIMS-AECL lattice code, which has been proposed as a substitute for the POWDERPUFS-V (PPV) code. For this validation, the WIMS-AECL/(MULTICELL)/RFSP (lattice calculation/(incremental cross-section calculation)/core calculation) code system was used for post-simulation of the Phase-B physics tests at the Wolsong-II unit. This code system had previously been used for the Wolsong-I and Point Lepreau reactors, but after a few modifications of WIMS-AECL input values for Wolsong-II, the results of the WIMS-AECL/RFSP calculations are much improved over the old ones. Most of the results show good agreement, with the exception of the moderator temperature coefficient test; verification of this result remains as further work. 6 figs., 15 tabs. (Author)

  7. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  8. PARALLEL MEASUREMENT AND MODELING OF TRANSPORT IN THE DARHT II BEAMLINE ON ETA II

    International Nuclear Information System (INIS)

    Chambers, F W; Raymond, B A; Falabella, S; Lee, B S; Richardson, R A; Weir, J T; Davis, H A; Schultze, M E

    2005-01-01

    Successfully tuning the DARHT II transport beamline requires the close coupling of a model of the beam transport and measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components, this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data Environment) data analysis environment and the FITS (Fully Integrated Transport Simulation) model. The SUICIDE environment has direct access to the experimental beam transport data at acquisition and to the FITS predictions of the transport for immediate comparison. The FITS model is coupled into the control system, where it can read magnet current settings for real-time modeling. We find this integrated coupling is essential for model verification and the successful development of a tuning aid for efficient convergence on a useable tune. We show real-time comparisons of simulation and experiment and explore the successes and limitations of this close-coupled approach

  9. The concept of validation of numerical models for consequence analysis

    International Nuclear Information System (INIS)

    Borg, Audun; Paulsen Husted, Bjarne; Njå, Ove

    2014-01-01

    Numerical models such as computational fluid dynamics (CFD) models are increasingly used in life safety studies and other types of analyses to calculate the effects of fire and explosions. The validity of these models is usually established by benchmark testing. This is done to quantitatively measure the agreement between the predictions provided by the model and the real world represented by observations in experiments. This approach assumes that all variables in the real world relevant for the specific study are adequately measured in the experiments and in the predictions made by the model. In this paper the various definitions of validation for CFD models used for hazard prediction are investigated to assess their implication for consequence analysis in a design phase. In other words, how is uncertainty in the prediction of future events reflected in the validation process? The sources of uncertainty are viewed from the perspective of the safety engineer. An example of the use of a CFD model is included to illustrate the assumptions the analyst must make and how these affect the prediction made by the model. The assessments presented in this paper are based on a review of standards and best practice guides for CFD modeling and the documentation from two existing CFD programs. Our main thrust has been to assess how validation work is performed and communicated in practice. We conclude that the concept of validation adopted for numerical models is adequate in terms of model performance. However, it does not address the main sources of uncertainty from the perspective of the safety engineer. Uncertainty in the input quantities describing future events, which are determined by the model user, outweighs the inaccuracies in the model as reported in validation studies. - Highlights: • Examine the basic concept of validation applied to models for consequence analysis. • Review standards and guides for validation of numerical models. • Comparison of the validation

  10. Validation of ASTEC core degradation and containment models

    International Nuclear Information System (INIS)

    Kruse, Philipp; Brähler, Thimo; Koch, Marco K.

    2014-01-01

    In a German-funded project, Ruhr-Universitaet Bochum performed validation of the in-vessel and containment models of the integral code ASTEC V2, jointly developed by IRSN (France) and GRS (Germany). In this paper selected results of this validation are presented. For the in-vessel part, the main point of interest was the validation of the code capability concerning cladding oxidation and hydrogen generation. The ASTEC calculations of the QUENCH experiments QUENCH-03 and QUENCH-11 show satisfactory results, despite some necessary adjustments in the input deck. Furthermore, the oxidation models based on the Cathcart-Pawel and Urbanic-Heidrick correlations are not suitable for higher temperatures, while the ASTEC model BEST-FIT, based on the Prater-Courtright approach at high temperature, gives sufficiently reliable results. One part of the containment model validation was the assessment of three hydrogen combustion models of ASTEC against the experiment BMC Ix9. The simulation results of these models differ from each other, and therefore the quality of the simulations depends on the characteristics of each model. Accordingly, the CPA FRONT model, requiring the simplest input parameters, provides the best agreement with the experimental data.

  11. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this would estimate the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, which illustrate its uses: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by their quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models.

  12. In silico prediction of ROCK II inhibitors by different classification approaches.

    Science.gov (United States)

    Cai, Chuipu; Wu, Qihui; Luo, Yunxia; Ma, Huili; Shen, Jiangang; Zhang, Yongbin; Yang, Lei; Chen, Yunbo; Wen, Zehuai; Wang, Qi

    2017-11-01

    ROCK II is an important pharmacological target linked to central nervous system disorders such as Alzheimer's disease. The purpose of this research is to generate ROCK II inhibitor prediction models by machine learning approaches. First, four sets of descriptors were calculated with MOE 2010 and PaDEL-Descriptor and optimized by F-score and linear forward selection methods. Four classification algorithms were then used to build 16 initial classifiers: k-nearest neighbors (kNN), naïve Bayes, random forest, and support vector machine. Furthermore, three sets of structural fingerprint descriptors were introduced to enhance the predictive capacity of the classifiers, which were assessed with fivefold cross-validation, test set validation and external test set validation. The best two models, MFK + MACCS and MLR + SubFP, both have MCC values of 0.925 for the external test set. After that, a privileged substructure analysis was performed to reveal common chemical features of ROCK II inhibitors. Finally, binding modes were analyzed to identify relationships between molecular descriptors and activity, while the main interactions were revealed by comparing the docking of the most potent and the weakest ROCK II inhibitors. To the best of our knowledge, this is the first report on ROCK II inhibitors utilizing machine learning approaches, providing a new method for discovering novel ROCK II inhibitors.
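
    The classifier-building and cross-validation step can be sketched with scikit-learn as follows; the random descriptor matrix stands in for the MOE/PaDEL features, and the model settings are illustrative assumptions rather than the study's:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import make_scorer, matthews_corrcoef
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)
        X = rng.normal(size=(300, 50))                 # 300 molecules, 50 descriptors
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # active / inactive labels

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        # Five-fold cross-validation scored with the Matthews correlation coefficient
        scores = cross_val_score(clf, X, y, cv=5, scoring=make_scorer(matthews_corrcoef))
        print(f"5-fold MCC: {scores.mean():.3f} +/- {scores.std():.3f}")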

  13. A discussion on validation of hydrogeological models

    International Nuclear Information System (INIS)

    Carrera, J.; Mousavi, S.F.; Usunoff, E.J.; Sanchez-Vila, X.; Galarza, G.

    1993-01-01

    Groundwater flow and solute transport are often driven by heterogeneities that elude easy identification. It is also difficult to select and describe the physico-chemical processes controlling solute behaviour. As a result, definition of a conceptual model involves numerous assumptions both on the selection of processes and on the representation of their spatial variability. Validating a numerical model by comparing its predictions with actual measurements may not be sufficient for evaluating whether or not it provides a good representation of 'reality'. Predictions will be close to measurements, regardless of model validity, if these are taken from experiments that stress well-calibrated model modes. On the other hand, predictions will be far from measurements when model parameters are very uncertain, even if the model is indeed a very good representation of the real system. Hence, we contend that 'classical' validation of hydrogeological models is not possible. Rather, models should be viewed as theories about the real system. We propose to follow a rigorous modeling approach in which different sources of uncertainty are explicitly recognized. The application of one such approach is illustrated by modeling a laboratory uranium tracer test performed on fresh granite, which was used as Test Case 1b in INTRAVAL. (author)

  14. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  15. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high-fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  16. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  17. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment
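
    A minimal numerical sketch of a likelihood-ratio validation metric with a cost-based decision threshold, in the spirit of the method above; the measurement model, costs and priors are illustrative assumptions, not values from the paper:

        import numpy as np
        from scipy.stats import norm

        prediction, sigma = 10.0, 0.5        # model prediction and measurement noise
        observations = np.array([10.3, 9.8, 10.6, 10.1])

        # H0: the model prediction is correct; H1: a biased alternative centred
        # on the data mean as the "best alternative" explanation.
        ll_h0 = norm.logpdf(observations, loc=prediction, scale=sigma).sum()
        ll_h1 = norm.logpdf(observations, loc=observations.mean(), scale=sigma).sum()
        likelihood_ratio = np.exp(ll_h0 - ll_h1)

        # Bayes decision rule: accept H0 when LR > (c01 / c10) * P(H1) / P(H0),
        # where c01 is the false-accept cost and c10 the false-reject cost.
        c01, c10, p0 = 1.0, 1.5, 0.5
        threshold = (c01 / c10) * (1 - p0) / p0
        print(f"LR={likelihood_ratio:.2f}, threshold={threshold:.2f} ->",
              "accept model" if likelihood_ratio > threshold else "reject model")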

  18. Validation of coastal oceanographic models at Laxemar-Simpevarp. Site descriptive modelling SDM-Site Laxemar

    International Nuclear Information System (INIS)

    Engqvist, Anders; Andrejev, Oleg

    2008-12-01

    validation can be summarized in three points: (i) The Baltic CR-model reproduces the measured salinity and the temperature profiles of the three peripheral stations acceptably well, while the correlation levels of the velocities are on an acceptable level for only one component, the other being close to zero; (ii) For the interior station Si24, the FR-model reproduces the salinity and the temperature profiles with a yet improved level of correlation compared with the CR-model; (iii) The bottom current velocity measured at Djupesund corresponds to an internal strait within the CDB-model and yields a correlation level of nearly 50% for salinity and about 95% for temperature. The conclusion is that the present validation of velocity components of the peripheral stations between the CR- and FR-domains has mainly confirmed what was found in the corresponding validation study of the Forsmark area, namely that this represents a challenge that demands considerably more measuring effort than has been possible to muster presently in order to average out sub-grid eddies that the model cannot resolve. This applies even though the levels of the correlation analysis are considerably higher than was found for the parallel study of the waters off the Forsmark coast. This together with supporting current velocity transects in the vicinity of the measurement stations can be explained by a more horizontally homogeneous flow field. For the inner station (Si24) that was computed by the FR-model, the correlation levels are considerably improved. Also for the station (Si25) pertaining to the CDB-model good correlation levels are reproduced. All temperature profiles are also acceptably well captured by the models, but this is judged to be more an effect of the seasonal variation than an expression of the virtue of the actual models. As for the Forsmark validation program, the salinity dynamics of the interior FR-domain is the strong point of the model, but in the present study high levels of

  19. Validation of coastal oceanographic models at Laxemar-Simpevarp. Site descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    Engqvist, Anders (A och I Engqvist Konsult HB, Vaxholm (SE)); Andrejev, Oleg (Finnish Inst. of Marine Research, Helsinki (FI))

    2008-12-15

    validation can be summarized in three points: (i) The Baltic CR-model reproduces the measured salinity and the temperature profiles of the three peripheral stations acceptably well, while the correlation levels of the velocities are on an acceptable level for only one component, the other being close to zero; (ii) For the interior station Si24, the FR-model reproduces the salinity and the temperature profiles with a yet improved level of correlation compared with the CR-model; (iii) The bottom current velocity measured at Djupesund corresponds to an internal strait within the CDB-model and yields a correlation level of nearly 50% for salinity and about 95% for temperature. The conclusion is that the present validation of velocity components of the peripheral stations between the CR- and FR-domains has mainly confirmed what was found in the corresponding validation study of the Forsmark area, namely that this represents a challenge that demands considerably more measuring effort than has been possible to muster presently in order to average out sub-grid eddies that the model cannot resolve. This applies even though the levels of the correlation analysis are considerably higher than was found for the parallel study of the waters off the Forsmark coast. This together with supporting current velocity transects in the vicinity of the measurement stations can be explained by a more horizontally homogeneous flow field. For the inner station (Si24) that was computed by the FR-model, the correlation levels are considerably improved. Also for the station (Si25) pertaining to the CDB-model good correlation levels are reproduced. All temperature profiles are also acceptably well captured by the models, but this is judged to be more an effect of the seasonal variation than an expression of the virtue of the actual models. As for the Forsmark validation program, the salinity dynamics of the interior FR-domain is the strong point of the model, but in the present study high levels of

  20. The role of CFD combustion modeling in hydrogen safety management-II: Validation based on homogeneous hydrogen-air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap, E-mail: sathiah@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Haren, Steven van, E-mail: vanharen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Komen, Ed, E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Roekaerts, Dirk, E-mail: d.j.e.m.roekaerts@tudelft.nl [Department of Multi-Scale Physics, Delft University of Technology, P.O. Box 5, 2600 AA Delft (Netherlands)

    2012-11-15

    Highlights: • A CFD-based method is proposed for the simulation of hydrogen deflagration. • A dynamic grid adaptation method is proposed to resolve the turbulent flame brush thickness. • The predictions obtained using this method are in good agreement with the static grid method. • TFC model results are in good agreement with large-scale homogeneous hydrogen-air experiments. - Abstract: During a severe accident in a PWR, large quantities of hydrogen can be generated and released into the containment. The generated hydrogen, when mixed with air, can lead to hydrogen combustion. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. Therefore, accurate prediction of these pressure loads is an important safety issue. In a previous article, we presented a CFD-based method to determine these pressure loads. This CFD method is based on the application of a turbulent flame speed closure combustion model. The validation analyses in our previous paper demonstrated that it is of utmost importance to apply successive mesh and time step refinement in order to get reliable results. In this article, we first determined to what extent the computational effort required for our CFD approach can be reduced by the application of adaptive mesh refinement, while maintaining the accuracy requirements. Experiments performed within a small fan-stirred explosion bomb were used for this purpose. It could be concluded that adaptive grid adaptation is a reliable and efficient method for use in hydrogen deflagration analyses. For the two-dimensional validation analyses, the application of dynamic grid adaptation resulted in a reduction of the required computational effort by about one order of magnitude. In a second step, the considered CFD approach including adaptive

  1. Validation of coastal oceanographic models at Forsmark. Site descriptive modelling SDM-Site Forsmark

    Energy Technology Data Exchange (ETDEWEB)

    Engqvist, Anders (A och I Engqvist Konsult HB, Vaxholm (SE)); Andrejev, Oleg (Finnish Inst. of Marine Research, Helsinki (FI))

    2008-01-15

    few months; (ii) both 3D-models miss some rapid up- and down-welling episodes that were clearly registered on all salinity and temperature meters near the northern interface; (iii) the velocity profiles measured at the interface between the two nested models display a low but mainly positive correlation; (iv) the salinity dynamics in the interior station is fully acceptably simulated, with improved correlation coefficients towards the surface; (v) the temperature profiles also generally display a high correlation between measurements and simulated data, certifying that the heat transfer through the surface is simulated well enough to render salinity the dominating factor determining the density, yet leaving room for further improvements. It seems safe to conclude that the validation of velocity components has confirmed what has been found in many instances previously, namely that this is a challenge demanding considerably more measuring effort than it has been possible to muster in this study, in order to average out sub-grid eddies that the model grid does not resolve. Of the scalar fields, temperature is acceptably well captured by the models, but this is judged to be more an effect of the seasonal variation than an expression of the virtue of the actual models. The internal salinity dynamics is the strong point of the model. Its temporal development at the inner station is convincingly well reproduced by this model approach. This means that the overall computed water exchange of the Oeregrundsgrepen can continue to be invested with due confidence

  2. Validation of coastal oceanographic models at Forsmark. Site descriptive modelling SDM-Site Forsmark

    International Nuclear Information System (INIS)

    Engqvist, Anders; Andrejev, Oleg

    2008-01-01

    few months; (ii) both 3D-models miss some rapid up- and down-welling episodes that were clearly registered on all salinity and temperature meters near the northern interface; (iii) the velocity profiles measured at the interface between the two nested models display a low but mainly positive correlation; (iv) the salinity dynamics in the interior station is fully acceptably simulated, with improved correlation coefficients towards the surface; (v) the temperature profiles also generally display a high correlation between measurements and simulated data, certifying that the heat transfer through the surface is simulated well enough to render salinity the dominating factor determining the density, yet leaving room for further improvements. It seems safe to conclude that the validation of velocity components has confirmed what has been found in many instances previously, namely that this is a challenge demanding considerably more measuring effort than it has been possible to muster in this study, in order to average out sub-grid eddies that the model grid does not resolve. Of the scalar fields, temperature is acceptably well captured by the models, but this is judged to be more an effect of the seasonal variation than an expression of the virtue of the actual models. The internal salinity dynamics is the strong point of the model. Its temporal development at the inner station is convincingly well reproduced by this model approach. This means that the overall computed water exchange of the Oeregrundsgrepen can continue to be invested with due confidence

  3. Validation of mentorship model for newly qualified professional ...

    African Journals Online (AJOL)

    Newly qualified professional nurses (NQPNs) allocated to community health care services require the use of a validated model to practice independently. Validation was done to adapt the model and to assess whether it is understood and could be implemented by NQPNs and mentors employed in community health care services.

  4. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight into the scalability and performance of future deployed networks. Because validated models of key Cisco equipment

  5. Validating the Copenhagen Psychosocial Questionnaire (COPSOQ-II) Using Set-ESEM: Identifying Psychosocial Risk Factors in a Sample of School Principals.

    Science.gov (United States)

    Dicke, Theresa; Marsh, Herbert W; Riley, Philip; Parker, Philip D; Guo, Jiesi; Horwood, Marcus

    2018-01-01

    School principals world-wide report high levels of strain and attrition, resulting in a shortage of qualified principals. It is thus crucial to identify psychosocial risk factors that reflect principals' occupational wellbeing. For this purpose, we used the Copenhagen Psychosocial Questionnaire (COPSOQ-II), a widely used self-report measure covering multiple psychosocial factors identified by leading occupational stress theories. We evaluated the COPSOQ-II regarding factor structure and longitudinal, discriminant, and convergent validity using latent structural equation modeling in a large sample of Australian school principals (N = 2,049). Results reveal that confirmatory factor analysis produced marginally acceptable model fit. A novel approach we call set exploratory structural equation modeling (set-ESEM), where cross-loadings were only allowed within a priori defined sets of factors, fit well and was more parsimonious than a full ESEM. Further multitrait-multimethod models based on the set-ESEM confirm the importance of a principal's psychosocial risk factors: stressors and depression were related to demands and ill-being, while confidence and autonomy were related to wellbeing. We also show that working in the private sector was beneficial for showing a low psychosocial risk, while other demographics had little effect. Finally, we identify five latent risk profiles (high risk to no risk) of school principals based on all psychosocial factors. Overall, the research presented here closes the theory-application gap of a strong multi-dimensional measure of psychosocial risk factors.

  6. Validation of SAGE II ozone measurements

    Science.gov (United States)

    Cunnold, D. M.; Chu, W. P.; Mccormick, M. P.; Veiga, R. E.; Barnes, R. A.

    1989-01-01

    Five ozone profiles from the Stratospheric Aerosol and Gas Experiment (SAGE) II are compared with coincident ozonesonde measurements obtained at Natal, Brazil, and Wallops Island, Virginia. It is shown that the mean difference between all of the measurements is about 1 percent and that the agreement is within 7 percent at altitudes between 20 and 53 km. Good agreement is also found for ozone mixing ratios on pressure surfaces. It is concluded that the SAGE II profiles provide useful ozone information up to about 60 km altitude.
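
    The comparison described here reduces to per-level percent differences between coincident profiles on a common altitude grid. A minimal sketch with synthetic stand-ins (not actual SAGE II or ozonesonde data):

    ```python
    import numpy as np

    alt = np.arange(20, 54)  # altitude grid, km (the 20-53 km comparison range)
    rng = np.random.default_rng(0)
    sonde = 5e12 * np.exp(-0.5 * ((alt - 26) / 8.0) ** 2)    # mock sonde profile
    sage = sonde * (1.0 + rng.normal(0.01, 0.03, alt.size))  # ~1% bias, 3% scatter

    pct_diff = 100.0 * (sage - sonde) / sonde
    print(f"mean difference: {pct_diff.mean():+.1f}%")
    print(f"largest single-level difference: {np.abs(pct_diff).max():.1f}%")
    ```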

  7. Methods and Model Development for Coupled RELAP5/PARCS Analysis of the Atucha-II Nuclear Power Plant

    Directory of Open Access Journals (Sweden)

    Andrew M. Ward

    2011-01-01

    Full Text Available In order to analyze the steady state and transient behavior of CNA-II, several tasks were required. Methods and models were developed in several areas. HELIOS lattice models were developed and benchmarked against WIMS/MCNP5 results generated by NA-SA. Cross-sections for the coupled RELAP5/PARCS calculation were extracted from HELIOS within the GenPMAXS framework. The validation of both HELIOS and PARCS was performed primarily by comparisons to WIMS/PUMA and MCNP for idealized models. Special methods were developed to model the control rods and boron injection systems of CNA-II. The insertion of the rods is oblique, and a special routine was added to PARCS to treat this effect. CFD results combined with specialized mapping routines were used to model the boron injection system. In all cases there was good agreement in the results, which provided confidence in the neutronics methods and modeling. A coupled code benchmark between U of M and U of Pisa is ongoing, and results are still preliminary. Under a LOCA transient, the best-estimate behavior of the core appears to be acceptable.

  8. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982 Development of a Conservative Model Validation Approach for Reliable ... obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the ... In Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account ...

  9. Predicting the success of IVF: external validation of the van Loendersloot's model.

    Science.gov (United States)

    Sarais, Veronica; Reschini, Marco; Busnelli, Andrea; Biancardi, Rossella; Paffoni, Alessio; Somigliana, Edgardo

    2016-06-01

    Is the predictive model for IVF success proposed by van Loendersloot et al. valid in a different geographical and cultural context? The model discriminates well but was less accurate than in the original context where it was developed. Several independent groups have developed models that combine different variables with the aim of estimating the chance of pregnancy with IVF, but only four of them have been externally validated. One of these four, the van Loendersloot model, deserves particular attention and further investigation for at least three reasons: (i) the reported area under the receiver operating characteristic curve (c-statistic) in the temporal validation setting was the highest reported to date (0.68); (ii) the perspective of the model is clinically sensible, since it includes variables obtained from previous failed cycles, if any, so it can be applied to any woman entering an IVF cycle; (iii) the model lacks external validation in a geographically different center. Retrospective cohort study of women undergoing oocyte retrieval for IVF between January 2013 and December 2013 at the infertility unit of the Fondazione Ca' Granda, Ospedale Maggiore Policlinico of Milan, Italy. Only the first oocyte retrieval cycle performed during the study period was included in the study. Women with previous IVF cycles were excluded if the last one before the study cycle was in another center. The main outcome was the cumulative live birth rate per oocyte retrieval. Seven hundred and seventy-two women were selected. The variables included in the van Loendersloot model and their relative weights (beta) were used. The variable resulting from this combination (Y) was transformed into a probability. The discriminatory capacity was assessed using the c-statistic. Calibration was performed using a logistic regression that included Y as the unique variable and live birth as the outcome. Data are presented using both the original and the calibrated models. Performance was evaluated
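
    The validation recipe described in this record (recompute the published linear predictor Y, transform it to a probability, assess discrimination with the c-statistic, and recalibrate by regressing the outcome on Y alone) can be outlined as follows. The weights, intercept and data are hypothetical placeholders, not the van Loendersloot model.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    n = 772                                    # cohort size from the record
    X = rng.normal(size=(n, 4))                # placeholder predictor matrix
    beta = np.array([0.5, -0.3, 0.2, 0.4])     # stand-ins for published weights
    intercept = -1.2                           # stand-in intercept

    y_lin = intercept + X @ beta               # linear predictor Y
    p_orig = 1.0 / (1.0 + np.exp(-y_lin))      # original-model probability
    live_birth = rng.binomial(1, p_orig)       # synthetic outcomes

    # Discrimination: the c-statistic is unchanged by monotone recalibration
    print(f"c-statistic: {roc_auc_score(live_birth, y_lin):.2f}")

    # Calibration: refit slope/intercept with Y as the only covariate
    recal = LogisticRegression().fit(y_lin.reshape(-1, 1), live_birth)
    print(f"calibration slope: {recal.coef_[0][0]:.2f}, "
          f"intercept: {recal.intercept_[0]:.2f}")
    ```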

  10. Capitalizing on Citizen Science Data for Validating Models and Generating Hypotheses Describing Meteorological Drivers of Mosquito-Borne Disease Risk

    Science.gov (United States)

    Boger, R. A.; Low, R.; Paull, S.; Anyamba, A.; Soebiyanto, R. P.

    2017-12-01

    Temperature and precipitation are important drivers of mosquito population dynamics, and a growing set of models have been proposed to characterize these relationships. Validation of these models, and development of broader theories across mosquito species and regions could nonetheless be improved by comparing observations from a global dataset of mosquito larvae with satellite-based measurements of meteorological variables. Citizen science data can be particularly useful for two such aspects of research into the meteorological drivers of mosquito populations: i) Broad-scale validation of mosquito distribution models and ii) Generation of quantitative hypotheses regarding changes to mosquito abundance and phenology across scales. The recently released GLOBE Observer Mosquito Habitat Mapper (GO-MHM) app engages citizen scientists in identifying vector taxa, mapping breeding sites and decommissioning non-natural habitats, and provides a potentially useful new tool for validating mosquito ubiquity projections based on the analysis of remotely sensed environmental data. Our early work with GO-MHM data focuses on two objectives: validating citizen science reports of Aedes aegypti distribution through comparison with accepted scientific data sources, and exploring the relationship between extreme temperature and precipitation events and subsequent observations of mosquito larvae. Ultimately the goal is to develop testable hypotheses regarding the shape and character of this relationship between mosquito species and regions.

  11. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    Many statistical models in ecology follow the state space paradigm. For such models, the important step of model validation rarely receives as much attention as estimation or hypothesis testing, perhaps due to lack of available algorithms and software. Model validation is often based on a naive...... for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and whether observations...... useful directions in which the model could be improved....
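
    The one-step prediction idea used here can be illustrated outside Template Model Builder with a linear-Gaussian example: a Kalman filter yields standardized one-step innovations, which should be approximately i.i.d. N(0,1) when the model is adequate. The AR(1) model and its parameters below are assumptions for illustration only.

    ```python
    import numpy as np
    from scipy import stats

    def one_step_residuals(y, a, q, r, x0=0.0, p0=1.0):
        """Standardized innovations for x_t = a*x_{t-1} + e_t, y_t = x_t + v_t."""
        x, p, resid = x0, p0, []
        for obs in y:
            x_pred, p_pred = a * x, a * a * p + q      # predict
            s = p_pred + r                             # innovation variance
            resid.append((obs - x_pred) / np.sqrt(s))  # standardized innovation
            k = p_pred / s                             # Kalman update
            x, p = x_pred + k * (obs - x_pred), (1 - k) * p_pred
        return np.array(resid)

    rng = np.random.default_rng(2)
    x, y = 0.0, []
    for _ in range(500):                               # simulate AR(1) + noise
        x = 0.8 * x + rng.normal(0, 0.5)
        y.append(x + rng.normal(0, 0.3))

    z = one_step_residuals(np.array(y), a=0.8, q=0.25, r=0.09)
    print(stats.kstest(z, "norm"))  # should not reject N(0,1) for a correct model
    ```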

  12. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real time using the coupled code system THOR/S3R. This code system models all fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced three-dimensional nodal method and cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  13. PISC II: Parametric studies. Monitoring of PISC-II parametric studies in ultrasonic NDT for PWR

    International Nuclear Information System (INIS)

    Toft, M.W.

    1989-09-01

    The CEGB NDT Applications Centre is participating in the EEC-funded international Programme for the Inspection of Steel Components (PISC) on account of its relevance to the inspection of Sizewell B and future PWRs. This report describes an inspection monitoring exercise undertaken by NDTAC under partial funding from JRC Ispra, at the initiative of the PISC-III Ultrasonic Modelling Group. Experimental studies have been carried out under PISC-II to investigate ultrasonic defect response as a function of various parameters which characterise the inspection situation. Some of these parametric studies are potentially useful for the validation of theoretical models of ultrasonic inspection and are consequently relevant to the work of the PISC-III Modelling Group. The aim of the present exercise was to ensure that data obtained by the various contract organizations participating in the PISC-II Parametric Studies were of high quality, were a complete record of the inspection, and would yield valid comparisons with the predictions of theoretical models. The exercise entailed visits by a nominated CEGB observer to four European NDT laboratories at which the parametric studies were in progress: CISE (Milan), UKAEA (Harwell), UKAEA (Risley) and Vincotte (Brussels). This report presents the findings of those visits.

  14. Identification of age-dependent motor and neuropsychological behavioural abnormalities in a mouse model of Mucopolysaccharidosis Type II

    Science.gov (United States)

    Gleitz, Hélène F. E.; O’Leary, Claire; Holley, Rebecca J.

    2017-01-01

    Severe mucopolysaccharidosis type II (MPS II) is a progressive lysosomal storage disease caused by mutations in the IDS gene, leading to a deficiency in the iduronate-2-sulfatase enzyme that is involved in heparan sulphate and dermatan sulphate catabolism. In its constitutive form, MPS II is a multi-system disease characterised by progressive neurocognitive decline, severe skeletal abnormalities and hepatosplenomegaly. Although enzyme replacement therapy has been approved for treatment of peripheral organs, no therapy effectively treats the cognitive symptoms of the disease, and novel therapies are in development to remediate this. Therapeutic efficacy and subsequent validation can be assessed using a variety of outcome measures that are translatable to clinical practice, such as behavioural measures. We sought to consolidate current knowledge of the cognitive, skeletal and motor abnormalities present in the MPS II mouse model by performing time-course behavioural examinations of working memory, anxiety, activity levels, sociability, and coordination and balance, up to 8 months of age. Cognitive decline associated with alterations in spatial working memory is detectable at 8 months of age in MPS II mice using spontaneous alternation, together with an altered response to novel environments and anxiolytic behaviour in the open-field. Coordination and balance on the accelerating rotarod were also significantly worse at 8 months, and may be associated with skeletal changes seen in MPS II mice. We demonstrate that the progressive nature of MPS II disease is also seen in the mouse model, and that cognitive and motor differences are detectable at 8 months of age using spontaneous alternation, the accelerating rotarod and the open-field tests. This study establishes neurological, motor and skeletal measures for use in pre-clinical studies to develop therapeutic approaches in MPS II. PMID:28207863

  15. Imidazole derivatives as angiotensin II AT1 receptor blockers: Benchmarks, drug-like calculations and quantitative structure-activity relationships modeling

    Science.gov (United States)

    Alloui, Mebarka; Belaidi, Salah; Othmani, Hasna; Jaidane, Nejm-Eddine; Hochlaf, Majdi

    2018-03-01

    We performed benchmark studies on the molecular geometry, electronic properties and vibrational analysis of imidazole using semi-empirical, density functional theory and post-Hartree-Fock methods. These studies validated the use of AM1 for the treatment of larger systems. We then examined the structural, physical and chemical relationships for a series of imidazole derivatives acting as angiotensin II AT1 receptor blockers using AM1. QSAR studies were carried out for these imidazole derivatives using a combination of various physicochemical descriptors. A multiple linear regression procedure was used to derive the relationships between the molecular descriptors and the activity of the imidazole derivatives. The results validate the derived QSAR model.

  16. ARI3SG: Aerosol retention in the secondary side of a steam generator. Part II: Model validation and uncertainty analysis

    International Nuclear Information System (INIS)

    Lopez, Claudia; Herranz, Luis E.

    2012-01-01

    Highlights: ► Validation of a model (ARI3SG) for the aerosol retention in the break stage of a steam generator under SGTR conditions. ► Interpretation of the experimental SGTR and CAAT data by using the ARI3SG model. ► Assessment of the effect of epistemic and stochastic uncertainties on the ARI3SG results. - Abstract: A large body of data has been gathered in the last decade through the EU-SGTR, ARTIST and ARTIST 2 projects for aerosol retention in the steam generator during SGTR severe accident sequences. At the same time, the attempt to extend the analytical capability has resulted in models that need to be validated. ARI3SG is one such development, built to estimate the aerosol retention in the break stage of a "dry" steam generator. This paper assesses the ARI3SG predictability by comparing its estimates to open data and by analyzing the effect of the associated uncertainties. The data-model comparison has been shown to be satisfactory and highlights the potential use of an ARI3SG-like formulation in system codes.

  17. Validation of the TIARA code to tritium inventory data

    International Nuclear Information System (INIS)

    Billone, M.C.

    1994-03-01

    The TIARA code has been developed to predict tritium inventory in Li2O breeder ceramic and to predict purge exit flow rate and composition. Inventory predictions are based on models for bulk diffusion, surface desorption, solubility and precipitation. Parameters for these models are determined from the results of laboratory annealing studies on unirradiated and irradiated Li2O. Inventory data from in-reactor purge flow tests are used for model improvement, fine-tuning of model parameters and validation. In the current work, the inventory measurement near the purge inlet from the BEATRIX-II thin-ring sample is used to fine-tune the surface desorption model parameters for T > 470 degrees C, and the inventory measurement near the midplane from VOM-15H is used to fine-tune the moisture solubility model parameters. Predictions are then validated against the remaining inventory data from EXOTIC-2 (1 point), SIBELIUS (3 axial points), VOM-15H (2 axial points), CRITIC-1 (4 axial points), BEATRIX-II thin ring (3 axial points) and BEATRIX-II thick pellet (5 radial points). Thus, of the 20 data points, two were used for fine-tuning model parameters and 18 were used for validation. The inventory data span the range 0.05-1.44 wppm with an average of 0.48 wppm. The data pertain to samples whose end-of-life temperatures were in the range 490-1000 degrees C. On average, the TIARA predictions agree quite well with the data (< 0.02 wppm difference). However, the root-mean-square deviation is 0.44 wppm, mostly due to over-predictions for the SIBELIUS samples and the higher-temperature radial samples from the BEATRIX-II thick pellet

  18. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    Science.gov (United States)

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than its reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohort for one of the three tested models [area under the receiver operating curve (AUC) of the cohort differences model: 0.85], signaling that the validation leans towards transferability. Two out of three models had a lower AUC at validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
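
    A minimal sketch of the cohort differences model as described: pool both cohorts, label each patient by cohort membership, and fit a classifier. An AUC near 0.5 indicates similar case-mix (the validation tests reproducibility), while a high AUC, such as the 0.85 reported here, indicates the validation targets transferability. The features and cohort sizes below are placeholders.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    # Placeholder case-mix features for training (0) and validation (1) cohorts
    X_train = rng.normal(loc=0.0, size=(400, 5))
    X_valid = rng.normal(loc=0.6, size=(154, 5))   # deliberately shifted case-mix
    X = np.vstack([X_train, X_valid])
    cohort = np.r_[np.zeros(400), np.ones(154)]

    auc = cross_val_score(LogisticRegression(max_iter=1000), X, cohort,
                          cv=5, scoring="roc_auc").mean()
    print(f"cohort-differences AUC: {auc:.2f}")  # ~0.5 similar; high -> different
    ```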

  19. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs.

  20. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    Directory of Open Access Journals (Sweden)

    Daan Nieboer

    Full Text Available External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: (1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and (2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore, we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development sets were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
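
    A simplified sketch of such a permutation test: compare the observed development-validation gap in c-statistic against a null distribution generated by refitting and re-evaluating after randomly permuting cohort membership. This is an illustrative reduction of the published procedure, run on synthetic data.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    def c_stat_gap(X_dev, y_dev, X_val, y_val):
        m = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
        c_dev = roc_auc_score(y_dev, m.predict_proba(X_dev)[:, 1])
        c_val = roc_auc_score(y_val, m.predict_proba(X_val)[:, 1])
        return c_dev - c_val

    rng = np.random.default_rng(4)
    X = rng.normal(size=(600, 4))
    y = rng.binomial(1, 1 / (1 + np.exp(-(X @ np.array([1.0, 0.5, 0.0, 0.0])))))
    X_dev, y_dev, X_val, y_val = X[:300], y[:300], X[300:], y[300:]

    observed = c_stat_gap(X_dev, y_dev, X_val, y_val)
    null = []
    for _ in range(200):                      # permute cohort membership
        idx = rng.permutation(600)
        null.append(c_stat_gap(X[idx[:300]], y[idx[:300]],
                               X[idx[300:]], y[idx[300:]]))
    p = np.mean(np.abs(null) >= abs(observed))
    print(f"observed gap: {observed:.3f}, permutation p-value: {p:.2f}")
    ```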

  1. Testing of a one dimensional model for Field II calibration

    DEFF Research Database (Denmark)

    Bæk, David; Jensen, Jørgen Arendt; Willatzen, Morten

    2008-01-01

    Field II is a program for simulating ultrasound transducer fields. It is capable of calculating the emitted and pulse-echoed fields for both pulsed and continuous wave transducers. To make it fully calibrated, a model of the transducer's electro-mechanical impulse response must be included. We...... examine an adapted one-dimensional transducer model originally proposed by Willatzen [9] to calibrate Field II. This model is modified to calculate the required impulse responses needed by Field II for a calibrated field pressure and external circuit current calculation. The testing has been performed...... to the calibrated Field II program for 1, 4, and 10 cycle excitations. Two parameter sets were applied for modeling, one real-valued Pz27 parameter set, manufacturer supplied, and one complex-valued parameter set found in the literature, Algueró et al. [11]. The latter implicitly accounts for attenuation. Results show

  2. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
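
    The two assessment layers described (repeated grid-search V-fold cross-validation for parameter tuning, and repeated nested cross-validation for estimating prediction error) map directly onto scikit-learn primitives. A minimal sketch, with an arbitrary SVM estimator and parameter grid standing in for a QSAR model:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import (GridSearchCV, RepeatedStratifiedKFold,
                                         cross_val_score)
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=20, random_state=0)
    grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01]}

    # Inner loop: repeated grid-search V-fold CV for parameter tuning
    inner = RepeatedStratifiedKFold(n_splits=5, n_repeats=3, random_state=1)
    tuner = GridSearchCV(SVC(), grid, cv=inner, scoring="roc_auc")

    # Outer loop: repeated nested CV for an honest assessment of prediction error
    outer = RepeatedStratifiedKFold(n_splits=5, n_repeats=3, random_state=2)
    scores = cross_val_score(tuner, X, y, cv=outer, scoring="roc_auc")
    print(f"nested CV AUC: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```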

  3. Comorbidity predicts poor prognosis in nasopharyngeal carcinoma: Development and validation of a predictive score model

    International Nuclear Information System (INIS)

    Guo, Rui; Chen, Xiao-Zhong; Chen, Lei; Jiang, Feng; Tang, Ling-Long; Mao, Yan-Ping; Zhou, Guan-Qun; Li, Wen-Fei; Liu, Li-Zhi; Tian, Li; Lin, Ai-Hua; Ma, Jun

    2015-01-01

    Background and purpose: The impact of comorbidity on prognosis in nasopharyngeal carcinoma (NPC) is poorly characterized. Material and methods: Using the Adult Comorbidity Evaluation-27 (ACE-27) system, we assessed the prognostic value of comorbidity and developed, validated and confirmed a predictive score model in a training set (n = 658), an internal validation set (n = 658) and an independent set (n = 652) using area under the receiver operating characteristic curve analysis. Results: Comorbidity was present in 40.4% of 1968 patients (mild, 30.1%; moderate, 9.1%; severe, 1.2%). Compared to an ACE-27 score ⩽1, patients with an ACE-27 score >1 in the training set had shorter overall survival (OS) and disease-free survival (DFS) (both P < 0.001); similar results were obtained in the other sets (P < 0.05). In multivariate analysis, ACE-27 score was a significant independent prognostic factor for OS and DFS. The combined risk score model including ACE-27 had superior prognostic value to TNM stage alone in the internal validation set (0.70 vs. 0.66; P = 0.02), the independent set (0.73 vs. 0.67; P = 0.002) and all patients (0.71 vs. 0.67; P < 0.001). Conclusions: Comorbidity significantly affects prognosis, especially in stages II and III, and should be incorporated into the TNM staging system for NPC. Assessment of comorbidity may improve outcome prediction and help tailor individualized treatment

  4. Developing a model for validation and prediction of bank customer ...

    African Journals Online (AJOL)

    Credit risk is the most important risk faced by banks. The main approaches available to a bank for reducing credit risk are correct validation using the final status and appropriate validation of model parameters. High volumes of bank reserves and lost or outstanding facilities indicate the lack of appropriate validation models in the banking network.

  5. Validation of heat transfer models for gap cooling

    International Nuclear Information System (INIS)

    Okano, Yukimitsu; Nagae, Takashi; Murase, Michio

    2004-01-01

    For severe accident assessment of a light water reactor, models of heat transfer in a narrow annular gap between overheated core debris and a reactor pressure vessel are important for evaluating vessel integrity and accident management. The authors developed and improved the models of heat transfer. However, validation was not sufficient for applicability of the gap heat flux correlation to the debris cooling in the vessel lower head and applicability of the local boiling heat flux correlations to high-pressure conditions. Therefore, in this paper, we evaluated the validity of the heat transfer models and correlations by analyses of the ALPHA and LAVA experiments, in which molten aluminum oxide (Al2O3) at about 2700 K was poured into a high-pressure water pool in a small-scale simulated vessel lower head. In the heating process of the vessel wall, the calculated heating rate and peak temperature agreed well with the measured values, and the validity of the heat transfer models and gap heat flux correlation was confirmed. In the cooling process of the vessel wall, the calculated cooling rate was compared with the measured value, and the validity of the nucleate boiling heat flux correlation was confirmed. The peak temperatures of the vessel wall in the ALPHA and LAVA experiments were lower than the temperature at the minimum heat flux point between film boiling and transition boiling, so the minimum heat flux correlation could not be validated. (author)

  6. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficiently to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  7. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the 'philosophy' behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  8. Experimental Validation of Flow Force Models for Fast Switching Valves

    DEFF Research Database (Denmark)

    Bender, Niels Christian; Pedersen, Henrik Clemmensen; Nørgård, Christian

    2017-01-01

    This paper comprises a detailed study of the forces acting on a Fast Switching Valve (FSV) plunger. The objective is to investigate to what extent different models are valid to be used for design purposes. These models depend on the geometry of the moving plunger and the properties of the surrounding...... to compare and validate different models, where an effort is directed towards capturing the fluid squeeze effect just before material-on-material contact. The test data is compared with simulation data relying solely on analytic formulations. The general dynamics of the plunger is validated

  9. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage

  10. Preliminary validation of a Monte Carlo model for IMRT fields

    International Nuclear Information System (INIS)

    Wright, Tracy; Lye, Jessica; Mohammadi, Mohammad

    2011-01-01

    Full text: A Monte Carlo model of an Elekta linac, validated for medium to large (10-30 cm) symmetric fields, has been investigated for small, irregular and asymmetric fields suitable for IMRT treatments. The model has been validated against field segments using radiochromic film in solid water. The modelled positions of the multileaf collimator (MLC) leaves have been validated using EBT film. In the model, electrons with a narrow energy spectrum are incident on the target, and all components of the linac head are included. The MLC is modelled using the EGSnrc MLCE component module. For the validation, a number of single complex IMRT segments with dimensions of approximately 1-8 cm were delivered to film in solid water (see Fig. 1). The same segments were modelled using EGSnrc by adjusting the MLC leaf positions in the model validated for 10 cm symmetric fields. Dose distributions along the centre of each MLC leaf as determined by both methods were compared. A picket fence test was also performed to confirm the MLC leaf positions. 95% of the points in the modelled dose distribution along the leaf axis agree with the film measurement to within 1%/1 mm for dose difference and distance to agreement. Areas of most deviation occur in the penumbra region. A system has been developed to calculate the MLC leaf positions in the model for any planned field size.
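
    A crude one-dimensional sketch of the 1%/1 mm criterion used above: a point passes on dose difference if the calculated dose is within 1% (of the profile maximum), or on distance-to-agreement if an agreeing calculated value lies within 1 mm. The profiles below are synthetic, and production gamma-index tools are considerably more elaborate.

    ```python
    import numpy as np

    def passes_1pct_1mm(x_mm, d_meas, d_calc, dd=0.01, dta=1.0):
        """Fraction of points meeting 1% dose difference OR 1 mm DTA (1D sketch)."""
        tol = dd * d_meas.max()                      # 1% of global maximum dose
        ok = np.zeros(x_mm.size, dtype=bool)
        for i, (xi, di) in enumerate(zip(x_mm, d_meas)):
            if abs(d_calc[i] - di) <= tol:           # dose-difference criterion
                ok[i] = True
                continue
            near = np.abs(x_mm - xi) <= dta          # calculated points within 1 mm
            ok[i] = np.any(np.abs(d_calc[near] - di) <= tol)
        return ok.mean()

    x = np.arange(0.0, 80.0, 0.5)                    # mm, along an MLC leaf axis
    meas = np.where((x > 10) & (x < 70), 1.0, 0.05)  # idealized segment profile
    calc = meas + np.random.default_rng(5).normal(0, 0.003, x.size)
    print(f"pass rate: {100 * passes_1pct_1mm(x, meas, calc):.1f}%")
    ```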

  11. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. But a key challenge of ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  12. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. But a key challenge of ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  13. Integrating Seasonal Oscillations into Basel II Behavioural Scoring Models

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-09-01

    Full Text Available The article introduces a new methodology for measuring temporal influence (seasonal oscillations, temporal patterns) for behavioural scoring development purposes. The paper shows how significant temporal variables can be recognised and then integrated into behavioural scoring models in order to improve model performance. Behavioural scoring models are integral parts of the Basel II standard on Internal Ratings-Based Approaches (IRB). The IRB approach reflects a bank's individual risk profile much more precisely. The paper shows how the problem of analyzing and integrating macroeconomic and microeconomic factors represented in time series into behavioural scorecard models can be solved by using the REF II model.
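
    The general idea of integrating seasonal oscillations into a behavioural scorecard can be illustrated by adding cyclic month terms (sine/cosine) to a logistic scoring model and comparing discrimination with and without them. This is a generic sketch on simulated data, not the REF II algorithm itself.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(6)
    n = 5000
    month = rng.integers(1, 13, n)                  # month of each observation
    utilization = rng.uniform(0, 1, n)              # a behavioural variable
    season = np.sin(2 * np.pi * month / 12)         # hidden seasonal effect
    p = 1 / (1 + np.exp(-(-2 + 2 * utilization + 0.8 * season)))
    default = rng.binomial(1, p)                    # simulated outcome

    base = utilization.reshape(-1, 1)
    seasonal = np.column_stack([utilization,
                                np.sin(2 * np.pi * month / 12),
                                np.cos(2 * np.pi * month / 12)])
    for name, X in [("baseline", base), ("with seasonal terms", seasonal)]:
        model = LogisticRegression(max_iter=1000).fit(X, default)
        auc = roc_auc_score(default, model.predict_proba(X)[:, 1])
        print(f"{name}: AUC = {auc:.3f}")
    ```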

  14. Kursk Operation Simulation and Validation Exercise - Phase II (KOSAVE II)

    National Research Council Canada - National Science Library

    Bauman, Walter

    1998-01-01

    ... (KOSAVE) Study (KOSAVE II) documents, in this report, a statistical record of the Kursk battle, as represented in the KDB, for use both as a standalone descriptive record for historians and as a baseline for a subsequent Phase...

  15. BIOMOVS: an international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1988-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (author)

  16. BIOMOVS: An international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1987-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (orig.)

  17. A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables. Part II - Validation and localization analysis

    Science.gov (United States)

    Das, Arghya; Tengattini, Alessandro; Nguyen, Giang D.; Viggiani, Gioacchino; Hall, Stephen A.; Einav, Itai

    2014-10-01

    We study the mechanical failure of cemented granular materials (e.g., sandstones) using a constitutive model based on breakage mechanics for grain crushing and damage mechanics for cement fracture. The theoretical aspects of this model are presented in Part I: Tengattini et al. (2014), A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables, Part I - Theory (Journal of the Mechanics and Physics of Solids, 10.1016/j.jmps.2014.05.021). In this Part II we investigate the constitutive and structural responses of cemented granular materials through analyses of Boundary Value Problems (BVPs). The multiple failure mechanisms captured by the proposed model enable the behavior of cemented granular rocks to be well reproduced for a wide range of confining pressures. Furthermore, through comparison of the model predictions and experimental data, the micromechanical basis of the model provides improved understanding of the failure mechanisms of cemented granular materials. In particular, we show that grain crushing is the predominant inelastic deformation mechanism under high pressures, while cement failure is the relevant mechanism at low pressures. Over an intermediate pressure regime a mixed mode of failure mechanisms is observed. Furthermore, the micromechanical roots of the model allow the effects of various initial microstructures on localized deformation modes to be studied. The results obtained from both the constitutive responses and the BVP solutions indicate that the proposed approach and model provide a promising basis for future theoretical studies on cemented granular materials.

  18. External Validation Study of First Trimester Obstetric Prediction Models (Expect Study I): Research Protocol and Population Characteristics.

    Science.gov (United States)

    Meertens, Linda Jacqueline Elisabeth; Scheepers, Hubertina Cj; De Vries, Raymond G; Dirksen, Carmen D; Korstjens, Irene; Mulder, Antonius Lm; Nieuwenhuijze, Marianne J; Nijhuis, Jan G; Spaanderman, Marc Ea; Smits, Luc Jm

    2017-10-26

    A number of first-trimester prediction models addressing important obstetric outcomes have been published. However, most models have not been externally validated. External validation is essential before implementing a prediction model in clinical practice. The objective of this paper is to describe the design of a study to externally validate existing first-trimester obstetric prediction models, based upon maternal characteristics and standard measurements (e.g., blood pressure), for the risk of pre-eclampsia (PE), gestational diabetes mellitus (GDM), spontaneous preterm birth (PTB), small-for-gestational-age (SGA) infants, and large-for-gestational-age (LGA) infants among Dutch pregnant women (Expect Study I). The results of a pilot study on the feasibility and acceptability of the recruitment process and the comprehensibility of the Pregnancy Questionnaire 1 are also reported. A multicenter prospective cohort study was performed in The Netherlands between July 1, 2013 and December 31, 2015. First-trimester obstetric prediction models were systematically selected from the literature. Predictor variables were measured by the Web-based Pregnancy Questionnaire 1, and pregnancy outcomes were established using the Postpartum Questionnaire 1 and medical records. Information about maternal health-related quality of life, costs, and satisfaction with Dutch obstetric care was collected from a subsample of women. A pilot study was carried out before the official start of inclusion. External validity of the models will be evaluated by assessing discrimination and calibration. Based on the pilot study, minor improvements were made to the recruitment process and the online Pregnancy Questionnaire 1. The validation cohort consists of 2614 women. Data analysis of the external validation study is in progress. This study will offer insight into the generalizability of existing, non-invasive first-trimester prediction models for various obstetric outcomes in a Dutch obstetric population

  19. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  20. The validity of METHUSELAH II in water moderated lattice calculations

    International Nuclear Information System (INIS)

    Hicks, D.; Hopkins, D.R.

    1964-09-01

    An improved version of the METHUSELAH code has been developed, which embodies some refinements in the treatment of the thermal spectrum, improved cross-section data, and a neutron balance output. The changes in nuclear data and physical models are summarised in this report; a detailed description of the programme modifications will be published separately. In this report METHUSELAH II predictions are compared with published lattice reactivity and reaction rate data. The systems examined include British S.G.H.W. type lattices (with H2O and D2O moderation), Canadian natural uranium/D2O experiments, U.S. low enrichment H2O systems, and the Hanford Pu-Al/H2O experiments. In general the agreement is sufficiently good to demonstrate the value of METHUSELAH II as an assessment tool and to indicate clear improvements over METHUSELAH I. A number of discrepancies are, however, observed and are the subject of comment. (author)

  1. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and their experimental validation was organized by the 'convection' section of the French society of thermal engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machineries or other energy-related applications, and have been selected for ETDE. (J.S.)

  2. Equilibrium modeling of mono and binary sorption of Cu(II and Zn(II onto chitosan gel beads

    Directory of Open Access Journals (Sweden)

    Nastaj Józef

    2016-12-01

    Full Text Available The objective of this work is an in-depth experimental study of Cu(II) and Zn(II) ion removal on chitosan gel beads from both one- and two-component water solutions at a temperature of 303 K. The optimal process conditions, such as pH value, dose of sorbent and contact time, were determined. Based on the optimal process conditions, equilibrium and kinetic studies were carried out. The maximum sorption capacities equaled 191.25 mg/g and 142.88 mg/g for Cu(II) and Zn(II) ions, respectively, when the sorbent dose was 10 g/L and the pH of the solution was 5.0 for both heavy metal ions. One-component sorption equilibrium data were successfully described by six of the most useful three-parameter equilibrium models: Langmuir-Freundlich, Redlich-Peterson, Sips, Koble-Corrigan, Hill and Toth. Extended forms of the Langmuir-Freundlich, Koble-Corrigan and Sips models were also well fitted to the two-component equilibrium data obtained for different ratios of concentrations of Cu(II) and Zn(II) ions (1:1, 1:2, 2:1). Experimental sorption data were described by two kinetic models, of pseudo-first and pseudo-second order. Furthermore, an attempt to explain the mechanisms of the divalent metal ion sorption process on chitosan gel beads was undertaken.
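
    Fitting one of the cited three-parameter isotherms, for example the Sips model q = q_max*(K_s*C)^n / (1 + (K_s*C)^n), is a small curve-fitting exercise. The sketch below uses synthetic Cu(II)-like data chosen to saturate near the reported 191 mg/g; the fitted constants are illustrative, not the paper's values.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def sips(c, q_max, k_s, n):
        """Three-parameter Sips isotherm."""
        return q_max * (k_s * c) ** n / (1.0 + (k_s * c) ** n)

    # Synthetic Cu(II)-like equilibrium data: C_eq in mg/L, q_eq in mg/g
    c_eq = np.array([5, 10, 25, 50, 100, 200, 400], dtype=float)
    q_eq = np.array([35, 62, 105, 140, 165, 182, 188], dtype=float)

    popt, _ = curve_fit(sips, c_eq, q_eq, p0=[190.0, 0.02, 1.0])
    q_max, k_s, n = popt
    print(f"q_max = {q_max:.1f} mg/g, K_s = {k_s:.4f} L/mg, n = {n:.2f}")
    ```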

  3. Carbonate-mediated Fe(II) oxidation in the air-cathode fuel cell: a kinetic model in terms of Fe(II) speciation.

    Science.gov (United States)

    Song, Wei; Zhai, Lin-Feng; Cui, Yu-Zhi; Sun, Min; Jiang, Yuan

    2013-06-06

    Due to the high redox activity of Fe(II) and its abundance in natural waters, the electro-oxidation of Fe(II) can be found in many air-cathode fuel cell systems, such as acid mine drainage fuel cells and sediment microbial fuel cells. To understand these iron-related systems in depth, it is essential to elucidate the kinetics and mechanisms involved in the electro-oxidation of Fe(II). This work aims to develop a kinetic model that adequately describes the electro-oxidation process of Fe(II) in air-cathode fuel cells. The speciation of Fe(II) is incorporated into the model, and the contributions of individual Fe(II) species to the overall Fe(II) oxidation rate are quantitatively evaluated. The results show that the kinetic model can accurately predict the electro-oxidation rate of Fe(II) in air-cathode fuel cells. FeCO3, Fe(OH)2, and Fe(CO3)2(2-) are the most important species determining the electro-oxidation kinetics of Fe(II). The Fe(II) oxidation rate is primarily controlled by the oxidation of the FeCO3 species at low pH, whereas at high pH Fe(OH)2 and Fe(CO3)2(2-) are the dominant species. Solution pH, carbonate concentration, and solution salinity are able to influence the electro-oxidation kinetics of Fe(II) by changing both the distribution and the kinetic activity of Fe(II) species.
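
    The structure of such a speciation-based rate law, overall rate = sum_i k_i * alpha_i * [Fe(II)]_total, can be sketched as follows. The species fractions and rate constants below are illustrative placeholders, not the fitted values of this work.

    ```python
    # Speciation-weighted pseudo-first-order rate; all values illustrative only
    species_fraction = {          # alpha_i at some assumed pH/carbonate level
        "Fe2+":        0.30,
        "FeCO3":       0.40,
        "Fe(OH)2":     0.05,
        "Fe(CO3)2-2":  0.25,
    }
    rate_constant = {             # k_i in 1/min (placeholders, not fitted values)
        "Fe2+":        1e-4,
        "FeCO3":       5e-2,
        "Fe(OH)2":     8e-1,
        "Fe(CO3)2-2":  3e-1,
    }

    fe_total = 0.5                # total Fe(II), mmol/L
    rate = sum(rate_constant[s] * species_fraction[s]
               for s in species_fraction) * fe_total
    for s in species_fraction:    # contribution of each species to the total
        share = rate_constant[s] * species_fraction[s] * fe_total / rate
        print(f"{s:>11}: {100 * share:4.1f}% of overall oxidation rate")
    print(f"overall rate: {rate:.4f} mmol/(L*min)")
    ```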

  4. Modeling Fe II Emission and Revised Fe II (UV) Empirical Templates for the Seyfert 1 Galaxy I Zw 1

    Science.gov (United States)

    Bruhweiler, F.; Verner, E.

    2008-03-01

    We use the narrow-lined broad-line region (BLR) of the Seyfert 1 galaxy I Zw 1 as a laboratory for modeling the ultraviolet (UV) Fe II 2100-3050 Å emission complex. We calculate a grid of Fe II emission spectra representative of BLR clouds and compare them with the observed I Zw 1 spectrum. Our predicted spectrum for log[n_H/(cm^-3)] = 11.0, log[Φ_H/(cm^-2 s^-1)] = 20.5, and ξ/(1 km s^-1) = 20, using Cloudy and an 830-level model atom for Fe II with energies up to 14.06 eV, gives a better fit to the UV Fe II emission than models with fewer levels. Our analysis indicates that (1) the observed UV Fe II emission must be corrected for an underlying Fe II pseudocontinuum; (2) Fe II emission peaks can be misidentified as those of other ions in active galactic nuclei (AGNs) with narrow-lined BLRs, possibly affecting deduced physical parameters; (3) the shape of the 4200-4700 Å Fe II emission in I Zw 1 and other AGNs is a relative indicator of narrow-line region (NLR) and BLR Fe II emission; (4) predicted ratios of Lyα, C III], and Fe II emission relative to Mg II λ2800 agree with extinction-corrected observed I Zw 1 fluxes, except for C IV λ1549; and (5) the sensitivity of Fe II emission strength to microturbulence ξ casts doubt on existing relative Fe/Mg abundances derived from Fe II (UV)/Mg II flux ratios. Our calculated Fe II emission spectra, suitable for BLRs in AGNs, are available at http://iacs.cua.edu/people/verner/FeII. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555.

  5. CRAB-II: a computer program to predict hydraulics and scram dynamics of LMFBR control assemblies and its validation

    International Nuclear Information System (INIS)

    Carelli, M.D.; Baker, L.A.; Willis, J.M.; Engel, F.C.; Nee, D.Y.

    1982-01-01

    This paper presents an analytical method, the computer code CRAB-II, which calculates the hydraulics and scram dynamics of LMFBR control assemblies of the rod bundle type, and its validation against prototypic data obtained for the Clinch River Breeder Reactor (CRBR) primary control assemblies. The physical-mathematical model of the code is presented, followed by a description of the testing of prototypic CRBR control assemblies in water and sodium to characterize, respectively, their hydraulic and scram dynamics behavior. Comparisons of code predictions against the experimental data are presented in detail; excellent agreement was found. Also reported are experimental data and empirical correlations for the friction factor of the absorber bundle in the entire flow range (laminar to turbulent), which represent an extension of the state of the art, since friction factor correlations were previously reported in the open literature only for fuel and blanket assemblies.
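
    Flow-range-spanning friction factor correlations of the kind mentioned above typically combine a laminar 1/Re branch and a Blasius-type turbulent branch with a blend across the transition region; the sketch below illustrates that generic form. The constants and transition bounds are illustrative, not CRAB-II's fitted values.

    ```python
    # Generic rod-bundle friction-factor correlation spanning laminar to
    # turbulent flow; constants and transition bounds are illustrative only.
    def friction_factor(Re, C_lam=96.0, C_turb=0.316, Re_lo=2000.0, Re_hi=4000.0):
        f_lam = C_lam / Re               # laminar branch, f ~ 1/Re
        f_turb = C_turb / Re**0.25       # Blasius-type turbulent branch
        if Re <= Re_lo:
            return f_lam
        if Re >= Re_hi:
            return f_turb
        w = (Re - Re_lo) / (Re_hi - Re_lo)   # linear blend across transition
        return (1.0 - w) * f_lam + w * f_turb

    for Re in (500.0, 3000.0, 1.0e5):
        print(f"Re = {Re:>8.0f}  f = {friction_factor(Re):.4f}")
    ```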

  6. Dynamic modeling and simulation of EBR-II steam generator system

    International Nuclear Information System (INIS)

    Berkan, R.C.; Upadhyaya, B.R.

    1989-01-01

    This paper presents a low-order dynamic model of the Experimental Breeder Reactor-II (EBR-II) steam generator system. The model development includes the application of energy, mass and momentum balance equations in state-space form. The model also includes a three-element controller for the drum water level control problem. The simulation results for low-level perturbations exhibit the inherently stable characteristics of the steam generator. The predictions of test transients also verify the consistency of this low-order model.
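
    A minimal sketch of a drum mass balance under a three-element control law (steam-flow feedforward plus PI feedback on level), illustrating the kind of low-order model described; all coefficients are hypothetical, not the EBR-II model's parameters.

    ```python
    # Toy drum mass balance with a three-element control law: feedforward on
    # measured steam flow plus PI feedback on drum level. Variables are
    # deviations from nominal; coefficients are illustrative only.
    import numpy as np

    dt, A = 0.1, 20.0        # time step (s), effective drum area (m^2)
    Kp, Ki = 2.0, 0.05       # PI gains on level error
    level, integ = 0.0, 0.0  # level deviation (m), integrator state

    for t in np.arange(0.0, 300.0, dt):
        w_st = 50.0 + (5.0 if t > 50.0 else 0.0)  # steam demand step (kg/s)
        err = -level                               # setpoint is zero deviation
        integ += err * dt
        w_fw = w_st + Kp * err + Ki * integ        # three-element feedwater demand
        level += (w_fw - w_st) / A * dt            # drum mass balance

    print(f"final level deviation = {level:.4f} m")
    ```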

  7. Developing rural palliative care: validating a conceptual model.

    Science.gov (United States)

    Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary

    2011-01-01

    The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services according to four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to: lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in 7 rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model as the model was able to theoretically predict and explain the experiences of the 7 rural communities that participated in the study. New emerging themes from the data elaborated existing elements in the model and informed the requirement for minor revisions. The model was validated and slightly revised, as suggested by the data. The model was confirmed as being a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.

  8. Validation of the 12-gene colon cancer recurrence score as a predictor of recurrence risk in stage II and III rectal cancer patients.

    Science.gov (United States)

    Reimers, Marlies S; Kuppen, Peter J K; Lee, Mark; Lopatin, Margarita; Tezcan, Haluk; Putter, Hein; Clark-Langone, Kim; Liefers, Gerrit Jan; Shak, Steve; van de Velde, Cornelis J H

    2014-11-01

    The 12-gene Recurrence Score assay is a validated predictor of recurrence risk in stage II and III colon cancer patients. We conducted a prospectively designed study to validate this assay for prediction of recurrence risk in stage II and III rectal cancer patients from the Dutch Total Mesorectal Excision (TME) trial. RNA was extracted from fixed paraffin-embedded primary rectal tumor tissue from stage II and III patients randomized to TME surgery alone, without (neo)adjuvant treatment. Recurrence Score was assessed by quantitative real-time polymerase chain reaction using previously validated colon cancer genes and algorithm. Data were analysed by Cox proportional hazards regression, adjusting for stage and resection margin status. All statistical tests were two-sided. Recurrence Score predicted risk of recurrence (hazard ratio [HR] = 1.57, 95% confidence interval [CI] = 1.11 to 2.21, P = .01), risk of distant recurrence (HR = 1.50, 95% CI = 1.04 to 2.17, P = .03), and rectal cancer-specific survival (HR = 1.64, 95% CI = 1.15 to 2.34, P = .007). The effect of Recurrence Score was most prominent in stage II patients and attenuated with more advanced stage (P(interaction) ≤ .007 for each endpoint). In stage II, five-year cumulative incidence of recurrence ranged from 11.1% in the predefined low Recurrence Score group (48.5% of patients) to 43.3% in the high Recurrence Score group (23.1% of patients). The 12-gene Recurrence Score is a predictor of recurrence risk and cancer-specific survival in rectal cancer patients treated with surgery alone, suggesting a similar underlying biology in colon and rectal cancers.
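
    The kind of Cox proportional-hazards analysis described above can be sketched with the lifelines package; the data frame below uses hypothetical stand-in columns for recurrence score, stage, and resection margin status.

    ```python
    # Sketch of a Cox proportional-hazards fit with lifelines; all values
    # are hypothetical stand-ins, not the trial's data.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "time":   [12, 30, 45, 60, 24, 55, 18, 40, 28, 36],  # months to event/censoring
        "event":  [1,  0,  1,  0,  1,  0,  0,  1,  1,  0],   # 1 = recurrence observed
        "rs":     [40, 12, 55, 20, 22, 15, 60, 44, 35, 28],  # recurrence score
        "stage3": [1,  0,  1,  0,  0,  1,  1,  0,  1,  0],   # stage III indicator
        "margin": [0,  0,  1,  0,  1,  1,  0,  0,  0,  0],   # positive margin
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()   # hazard ratios with confidence intervals per covariate
    ```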

  9. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program is currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  10. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated, respectively. Also, appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing benchmark problems, and by analysing experiments. In particular, the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  11. Validation study of a quantitative multigene reverse transcriptase-polymerase chain reaction assay for assessment of recurrence risk in patients with stage II colon cancer.

    Science.gov (United States)

    Gray, Richard G; Quirke, Philip; Handley, Kelly; Lopatin, Margarita; Magill, Laura; Baehner, Frederick L; Beaumont, Claire; Clark-Langone, Kim M; Yoshizawa, Carl N; Lee, Mark; Watson, Drew; Shak, Steven; Kerr, David J

    2011-12-10

    We developed quantitative gene expression assays to assess recurrence risk and benefits from chemotherapy in patients with stage II colon cancer. We sought validation by using RNA extracted from fixed paraffin-embedded primary colon tumor blocks from 1,436 patients with stage II colon cancer in the QUASAR (Quick and Simple and Reliable) study of adjuvant fluoropyrimidine chemotherapy versus surgery alone. A recurrence score (RS) and a treatment score (TS) were calculated from gene expression levels of 13 cancer-related genes (n = 7 recurrence genes and n = 6 treatment benefit genes) and from five reference genes with prespecified algorithms. Cox proportional hazards regression models and log-rank methods were used to analyze the relationship between the RS and risk of recurrence in patients treated with surgery alone and between TS and benefits of chemotherapy. Risk of recurrence was significantly associated with RS (hazard ratio [HR] per interquartile range, 1.38; 95% CI, 1.11 to 1.74; P = .004). Recurrence risks at 3 years were 12%, 18%, and 22% for predefined low, intermediate, and high recurrence risk groups, respectively. T stage (HR, 1.94; P < .001) and mismatch repair (MMR) status (HR, 0.31; P < .001) were the strongest histopathologic prognostic factors. The continuous RS was associated with risk of recurrence (P = .006) beyond these and other covariates. There was no trend for increased benefit from chemotherapy at higher TS (P = .95). The continuous 12-gene RS has been validated in a prospective study for assessment of recurrence risk in patients with stage II colon cancer after surgery and provides prognostic value that complements T stage and MMR. The TS was not predictive of chemotherapy benefit.

  12. Absolute, pressure-dependent validation of a calibration-free, airborne laser hygrometer transfer standard (SEALDH-II) from 5 to 1200 ppmv using a metrological humidity generator

    Directory of Open Access Journals (Sweden)

    B. Buchholz

    2018-01-01

    Highly accurate water vapor measurements are indispensable for understanding a variety of scientific questions as well as industrial processes. While in metrology water vapor concentrations can be defined, generated, and measured with relative uncertainties in the single-percent range, field-deployable airborne instruments deviate by up to 10-20% even under quasi-static laboratory conditions. The novel SEALDH-II hygrometer, a calibration-free tuneable diode laser spectrometer, bridges this gap by implementing a new holistic concept to achieve higher accuracy levels in the field. We present in this paper the absolute validation of SEALDH-II at a traceable humidity generator during 23 days of continuous operation at 15 different H2O mole fraction levels between 5 and 1200 ppmv. At each mole fraction level, we studied the pressure dependence at six different gas pressures between 65 and 950 hPa. Further, we describe the setup for this metrological validation, the challenges to overcome when assessing water vapor measurements at a high accuracy level, and the comparison results. With this validation, SEALDH-II is the first airborne, metrologically validated humidity transfer standard which links several scientific airborne and laboratory measurement campaigns to the international metrological water vapor scale.

  13. Refinement, Validation and Benchmarking of a Model for E-Government Service Quality

    Science.gov (United States)

    Magoutas, Babis; Mentzas, Gregoris

    This paper presents the refinement and validation of a model for Quality of e-Government Services (QeGS). We build upon our previous work, in which a conceptual model was identified, and focus on the confirmatory phase of the model development process in order to arrive at a valid and reliable QeGS model. The validated model, which was benchmarked with very positive results against similar models found in the literature, can be used for measuring QeGS in a reliable and valid manner. This will form the basis for a continuous quality improvement process, unleashing the full potential of e-government services for both citizens and public administrations.

  14. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  15. Geochemistry Model Validation Report: Material Degradation and Release Model

    International Nuclear Information System (INIS)

    Stockman, H.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  16. Factors associated with therapeutic inertia in hypertension: validation of a predictive model.

    Science.gov (United States)

    Redón, Josep; Coca, Antonio; Lázaro, Pablo; Aguilar, Ma Dolores; Cabañas, Mercedes; Gil, Natividad; Sánchez-Zamorano, Miguel Angel; Aranda, Pedro

    2010-08-01

    To study factors associated with therapeutic inertia in treating hypertension and to develop a predictive model to estimate the probability of therapeutic inertia in a given medical consultation, based on variables related to the consultation, patient, physician, clinical characteristics, and level of care. National, multicentre, observational, cross-sectional study in primary care and specialist (hospital) physicians who each completed a questionnaire on therapeutic inertia, provided professional data and collected clinical data on four patients. Therapeutic inertia was defined as a consultation in which treatment change was indicated (i.e., SBP ≥ 140 or DBP ≥ 90 mmHg in all patients; SBP ≥ 130 or DBP ≥ 80 in patients with diabetes or stroke), but did not occur. A predictive model was constructed and validated according to the factors associated with therapeutic inertia. Data were collected on 2595 patients and 13,792 visits. Therapeutic inertia occurred in 7546 (75%) of the 10,041 consultations in which treatment change was indicated. Factors associated with therapeutic inertia were primary care setting, male sex, older age, SBP and/or DBP values close to normal, treatment with more than one antihypertensive drug, treatment with an angiotensin II receptor blocker, and more than six visits/year. Physician characteristics did not weigh heavily in the association. The predictive model was valid internally and externally, with acceptable calibration, discrimination and reproducibility, and explained one-third of the variability in therapeutic inertia. Although therapeutic inertia is frequent in the management of hypertension, the factors explaining it are not completely clear. Whereas some aspects of the consultations were associated with therapeutic inertia, physician characteristics were not a decisive factor.
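
    A minimal sketch of a per-consultation predictive model of this kind: a logistic function mapping consultation and patient indicators to a probability of therapeutic inertia. The features and coefficients are hypothetical, not the study's fitted model.

    ```python
    # Toy per-consultation probability of therapeutic inertia via a logistic
    # model; features and coefficients are hypothetical placeholders.
    import numpy as np

    def inertia_probability(x, beta0=-0.5, betas=(0.8, 0.6, 0.3, 0.4)):
        """x: 0/1 flags (primary care, BP near goal, multi-drug, >6 visits/yr)."""
        z = beta0 + np.dot(betas, x)
        return 1.0 / (1.0 + np.exp(-z))

    print(f"{inertia_probability((1, 1, 0, 1)):.2f}")  # ~0.79 for this profile
    ```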

  17. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and that it is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
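
    The core hydrostatic-column calculation can be sketched as wellhead pressure equal to cavern pressure minus the weight of the fluid columns in the well; the densities and column heights below are illustrative assumptions, not SPR well data.

    ```python
    # Toy hydrostatic column: wellhead pressure = cavern pressure minus the
    # weight of the stacked fluid columns. All values are illustrative.
    G = 9.81  # m/s^2

    def wellhead_pressure(p_cavern_pa, columns):
        """columns: list of (density kg/m^3, vertical extent m) in the well."""
        return p_cavern_pa - sum(rho * G * h for rho, h in columns)

    cols = [(200.0, 300.0),    # compressed nitrogen (hypothetical mean density)
            (850.0, 400.0),    # crude oil
            (1200.0, 100.0)]   # brine
    print(f"{wellhead_pressure(12.0e6, cols) / 1e6:.2f} MPa")  # -> 6.90 MPa
    ```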

  18. CIPS Validation Data Plan

    International Nuclear Information System (INIS)

    Dinh, Nam

    2012-01-01

    This report documents the analysis, findings and recommendations resulting from the task 'CIPS Validation Data Plan (VDP)', formulated as a POR4 activity in the CASL VUQ Focus Area (FA), to develop a Validation Data Plan (VDP) for the Crud-Induced Power Shift (CIPS) challenge problem and to provide guidance for the CIPS VDP implementation. The main reason and motivation for this task to be carried out at this time in the VUQ FA is to bring together (i) knowledge of the modern view and capability in VUQ, (ii) knowledge of the physical processes that govern CIPS, and (iii) knowledge of codes, models, and data available, used, potentially accessible, and/or being developed in CASL for CIPS prediction, to devise a practical VDP that effectively supports CASL's mission in CIPS applications.

  19. Effect of Cu(II), Cd(II) and Zn(II) on Pb(II) biosorption by algae Gelidium-derived materials.

    Science.gov (United States)

    Vilar, Vítor J P; Botelho, Cidália M S; Boaventura, Rui A R

    2008-06-15

    Biosorption of Pb(II), Cu(II), Cd(II) and Zn(II) from binary metal solutions onto the algae Gelidium sesquipedale, an algal industrial waste and a waste-based composite material was investigated at pH 5.3, in a batch system. Binary Pb(II)/Cu(II), Pb(II)/Cd(II) and Pb(II)/Zn(II) solutions have been tested. For the same equilibrium concentrations of both metal ions (1 mmol L-1), approximately 66, 85 and 86% of the total uptake capacity of the biosorbents is taken by lead ions in the systems Pb(II)/Cu(II), Pb(II)/Cd(II) and Pb(II)/Zn(II), respectively. Two-metal results were fitted to a discrete and a continuous model, showing the inhibition of the primary metal biosorption by the co-cation. The model parameters suggest that Cd(II) and Zn(II) have the same decreasing effect on the Pb(II) uptake capacity. The uptake of Pb(II) was highly sensitive to the presence of Cu(II). From the discrete model it was possible to obtain the Langmuir affinity constant for Pb(II) biosorption. The presence of the co-cations decreases the apparent affinity of Pb(II). The experimental results were successfully fitted by the continuous model, at different pH values, for each biosorbent. The following sequence for the equilibrium affinity constants was found: Pb > Cu > Cd ≈ Zn.
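
    The discrete competitive model mentioned above is commonly written as an extended Langmuir isotherm, in which the co-cation term in the denominator depresses the primary metal's uptake; the parameter values below are hypothetical placeholders.

    ```python
    # Extended (competitive) Langmuir sketch for binary sorption: the
    # co-cation term in the denominator depresses Pb(II) uptake.
    def extended_langmuir(C1, C2, qmax, K1, K2):
        """Uptake of metal 1 (mmol/g) in the presence of competing metal 2."""
        return qmax * K1 * C1 / (1.0 + K1 * C1 + K2 * C2)

    qmax, K_pb, K_cu = 0.4, 8.0, 3.0        # mmol/g and L/mmol (hypothetical)
    for C_cu in (0.0, 0.5, 1.0):            # competing Cu(II), mmol/L
        q_pb = extended_langmuir(1.0, C_cu, qmax, K_pb, K_cu)
        print(f"Cu(II) = {C_cu:.1f} mmol/L -> q_Pb = {q_pb:.3f} mmol/g")
    ```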

  20. Development and validation of a mass casualty conceptual model.

    Science.gov (United States)

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
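
    The consensus criteria quoted above (interquartile range of at most 1 scale point, change in the response distribution below 15% between rounds, and at least 70% agreement) can be checked per item as in the sketch below; the responses and the total-variation measure of distributional shift are illustrative assumptions.

    ```python
    # Per-item Delphi consensus check on a 7-point scale: IQR <= 1 point,
    # distribution shift < 15% between rounds (total-variation measure, an
    # assumption here), and >= 70% agreement (scores of 6-7). Data hypothetical.
    import numpy as np

    def consensus(round1, round2, agree_at=6):
        r1, r2 = np.asarray(round1), np.asarray(round2)
        iqr = np.percentile(r2, 75) - np.percentile(r2, 25)
        edges = np.arange(1, 9)                      # bins for scores 1..7
        p1 = np.histogram(r1, bins=edges)[0] / len(r1)
        p2 = np.histogram(r2, bins=edges)[0] / len(r2)
        shift = 100 * 0.5 * np.abs(p1 - p2).sum()    # % change in distribution
        agreement = 100 * (r2 >= agree_at).mean()
        return iqr <= 1 and shift < 15 and agreement >= 70

    r1 = [6, 6, 7, 6, 5, 7, 6, 6, 5, 7, 6, 6, 6, 7, 6, 6, 6, 6]
    r2 = [6, 6, 7, 6, 6, 7, 6, 6, 5, 7, 6, 6, 6, 7, 6, 6, 6, 6]
    print(consensus(r1, r2))   # True -> stop polling this item
    ```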

  1. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  2. Model-based verification and validation of the SMAP uplink processes

    Science.gov (United States)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  3. Solid-phase extraction of Mn(II), Co(II), Ni(II), Cu(II), Cd(II) and Pb(II) ions from environmental samples by flame atomic absorption spectrometry (FAAS)

    Energy Technology Data Exchange (ETDEWEB)

    Duran, Celal [Department of Chemistry, Faculty of Art and Science, Karadeniz Technical University, 61080 Trabzon (Turkey); Gundogdu, Ali [Department of Chemistry, Faculty of Art and Science, Karadeniz Technical University, 61080 Trabzon (Turkey); Bulut, Volkan Numan [Department of Chemistry, Giresun Faculty of Art and Science, Karadeniz Technical University, 28049 Giresun (Turkey); Soylak, Mustafa [Department of Chemistry, Faculty of Art and Science, Erciyes University, 38039 Kayseri (Turkey)]. E-mail: soylak@erciyes.edu.tr; Elci, Latif [Department of Chemistry, Faculty of Art and Science, Pamukkale University, 20020 Denizli (Turkey); Sentuerk, Hasan Basri [Department of Chemistry, Faculty of Art and Science, Karadeniz Technical University, 61080 Trabzon (Turkey); Tuefekci, Mehmet [Department of Chemistry, Faculty of Art and Science, Karadeniz Technical University, 61080 Trabzon (Turkey)

    2007-07-19

    A new method using a column packed with Amberlite XAD-2010 resin as a solid-phase extractant has been developed for the multi-element preconcentration of Mn(II), Co(II), Ni(II), Cu(II), Cd(II), and Pb(II) ions based on their complex formation with sodium diethyldithiocarbamate (Na-DDTC) prior to flame atomic absorption spectrometric (FAAS) determinations. Metal complexes sorbed on the resin were eluted by 1 mol L-1 HNO3 in acetone. Effects of the analytical conditions on the preconcentration yields of the metal ions, such as pH, quantity of Na-DDTC, eluent type, sample volume and flow rate, foreign ions, etc., have been investigated. The limits of detection (LOD) of the analytes were found in the range 0.08-0.26 µg L-1. The method was validated by analyzing three certified reference materials. The method has been applied for the determination of trace elements in some environmental samples.

  4. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    D.M. Jolley

    2001-12-18

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the "In-Drift Microbial Communities" model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the "In-Drift Microbial Communities" AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities", and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the "Technical Work Plan For EBS Department Modeling FY 01 Work Activities" (BSC 2001), which includes controls for the management of electronic data.

  5. In-Drift Microbial Communities Model Validation Calculations

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-09-24

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the "In-Drift Microbial Communities" model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the "In-Drift Microbial Communities" AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities", and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the "Technical Work Plan For EBS Department Modeling FY 01 Work Activities" (BSC 2001), which includes controls for the management of electronic data.

  6. In-Drift Microbial Communities Model Validation Calculation

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-10-31

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the "In-Drift Microbial Communities" model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the "In-Drift Microbial Communities" AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities", and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the "Technical Work Plan For EBS Department Modeling FY 01 Work Activities" (BSC 2001), which includes controls for the management of electronic data.

  7. In-Drift Microbial Communities Model Validation Calculations

    International Nuclear Information System (INIS)

    Jolley, D.M.

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the "In-Drift Microbial Communities" model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the "In-Drift Microbial Communities" AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities", and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the "Technical Work Plan For EBS Department Modeling FY 01 Work Activities" (BSC 2001), which includes controls for the management of electronic data.

  8. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    International Nuclear Information System (INIS)

    D.M. Jolley

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the "In-Drift Microbial Communities" model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the "In-Drift Microbial Communities" AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities", and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the "Technical Work Plan For EBS Department Modeling FY 01 Work Activities" (BSC 2001), which includes controls for the management of electronic data.

  9. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources

  10. A multi-source satellite data approach for modelling Lake Turkana water level: calibration and validation using satellite altimetry data

    Directory of Open Access Journals (Sweden)

    N. M. Velpuri

    2012-01-01

    satellite-driven water balance model for (i) quantitative assessment of the impact of basin developmental activities on lake levels and for (ii) forecasting lake level changes and their impact on fisheries. From this study, we suggest that globally available satellite altimetry data provide a unique opportunity for calibration and validation of hydrologic models in ungauged basins.

  11. A multi-source satellite data approach for modelling Lake Turkana water level: Calibration and validation using satellite altimetry data

    Science.gov (United States)

    Velpuri, N.M.; Senay, G.B.; Asante, K.O.

    2012-01-01

    model for (i) quantitative assessment of the impact of basin developmental activities on lake levels and for (ii) forecasting lake level changes and their impact on fisheries. From this study, we suggest that globally available satellite altimetry data provide a unique opportunity for calibration and validation of hydrologic models in ungauged basins. © Author(s) 2012.
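
    A minimal sketch of the satellite-driven water balance concept behind these two records: lake level is updated from river inflow, over-lake rainfall and evaporation. The lake area and monthly forcing values are hypothetical placeholders for the satellite-derived products.

    ```python
    # Toy lake water balance: level updated from inflow, over-lake rain and
    # evaporation. All values are hypothetical placeholders.
    AREA = 7.0e9         # lake surface area, m^2 (hypothetical)
    level = 360.0        # starting level, m above an arbitrary datum

    monthly = [          # (inflow m^3, rainfall m, evaporation m), hypothetical
        (2.0e9, 0.02, 0.19),
        (3.5e9, 0.05, 0.18),
        (1.2e9, 0.01, 0.20),
    ]
    for q_in, rain, evap in monthly:
        level += q_in / AREA + rain - evap
        print(f"level = {level:.3f} m")
    ```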

  12. FISPACT-II: An Advanced Simulation System for Activation, Transmutation and Material Modelling

    Energy Technology Data Exchange (ETDEWEB)

    Sublet, J.-Ch., E-mail: jean-christophe.sublet@ukaea.uk [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Eastwood, J.W.; Morgan, J.G. [Culham Electromagnetics Ltd, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Gilbert, M.R.; Fleming, M.; Arter, W. [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2017-01-15

    Fispact-II is a code system and library database for modelling activation-transmutation processes, depletion-burn-up, time dependent inventory and radiation damage source terms caused by nuclear reactions and decays. The Fispact-II code, written in object-style Fortran, follows the evolution of material irradiated by neutrons, alphas, gammas, protons, or deuterons, and provides a wide range of derived radiological output quantities to satisfy most needs for nuclear applications. It can be used with any ENDF-compliant group library data for nuclear reactions, particle-induced and spontaneous fission yields, and radioactive decay (including but not limited to TENDL-2015, ENDF/B-VII.1, JEFF-3.2, JENDL-4.0u, CENDL-3.1 processed into fine-group-structure files, GEFY-5.2 and UKDD-16), as well as resolved and unresolved resonance range probability tables for self-shielding corrections and updated radiological hazard indices. The code has many novel features including: extension of the energy range up to 1 GeV; additional neutron physics including self-shielding effects, temperature dependence, thin and thick target yields; pathway analysis; and sensitivity and uncertainty quantification and propagation using full covariance data. The latest ENDF libraries such as TENDL encompass thousands of target isotopes. Nuclear data libraries for Fispact-II are prepared from these using processing codes PREPRO, NJOY and CALENDF. These data include resonance parameters, cross sections with covariances, probability tables in the resonance ranges, PKA spectra, kerma, dpa, gas and radionuclide production and energy-dependent fission yields, supplemented with all 27 decay types. All such data for the five most important incident particles are provided in evaluated data tables. The Fispact-II simulation software is described in detail in this paper, together with the nuclear data libraries. The Fispact-II system also includes several utility programs for code-use optimisation
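
    The inventory evolution such codes solve is a linear system dN/dt = AN over nuclides coupled by reactions and decays. The toy three-nuclide chain below, solved with a matrix exponential, only illustrates that mathematical form; FISPACT-II's actual solver, chains and data are far more extensive, and the rates shown are hypothetical.

    ```python
    # Toy activation/decay inventory: dN/dt = A N for a 3-nuclide chain
    # (target -> product 1 -> product 2) under constant flux, solved with a
    # matrix exponential. All rates are hypothetical.
    import numpy as np
    from scipy.linalg import expm

    sig_phi = 1e-9            # transmutation rate of the target (1/s)
    lam1, lam2 = 1e-6, 1e-7   # decay constants of the products (1/s)

    A = np.array([[-sig_phi,  0.0,   0.0],
                  [ sig_phi, -lam1,  0.0],
                  [ 0.0,      lam1, -lam2]])

    N0 = np.array([1e24, 0.0, 0.0])   # initial atom numbers
    t = 30 * 24 * 3600.0              # 30 days of irradiation
    print(expm(A * t) @ N0)           # inventory after irradiation
    ```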

  13. Validation of detailed thermal hydraulic models used for LMR safety and for improvement of technical specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, F.E.

    1995-12-31

    Detailed steady-state and transient coolant temperatures and flow rates from an operating reactor have been used to validate the multiple-pin model in the SASSYS-1 liquid metal reactor systems analysis code. This multiple-pin capability can be used for explicit calculations of axial and lateral temperature distributions within individual subassemblies. Thermocouples at a number of axial locations and in a number of different coolant sub-channels in the XX09 instrumented subassembly in the EBR-II reactor provided temperature data from the Shutdown Heat Removal Test (SHRT) series. Flow meter data for XX09 and for the overall system are also available from these tests. Results of consistent SASSYS-1 multiple-pin analyses for both the SHRT-45 loss-of-flow-without-scram test and the SHRT-17 protected loss-of-flow test agree well with the experimental data, providing validation of the SASSYS-1 code over a wide range of conditions.

  14. Models, validation, and applied geochemistry: Issues in science, communication, and philosophy

    International Nuclear Information System (INIS)

    Kirk Nordstrom, D.

    2012-01-01

    Models have become so fashionable that many scientists and engineers cannot imagine working without them. The predominant use of computer codes to execute model calculations has blurred the distinction between code and model. The recent controversy regarding model validation has brought into question what we mean by a ‘model’ and by ‘validation.’ It has become apparent that the usual meaning of validation may be common in engineering practice and seems useful in legal practice but it is contrary to scientific practice and brings into question our understanding of science and how it can best be applied to such problems as hazardous waste characterization, remediation, and aqueous geochemistry in general. This review summarizes arguments against using the phrase model validation and examines efforts to validate models for high-level radioactive waste management and for permitting and monitoring open-pit mines. Part of the controversy comes from a misunderstanding of ‘prediction’ and the need to distinguish logical from temporal prediction. Another problem stems from the difference in the engineering approach contrasted with the scientific approach. The reductionist influence on the way we approach environmental investigations also limits our ability to model the interconnected nature of reality. Guidelines are proposed to improve our perceptions and proper utilization of models. Use of the word ‘validation’ is strongly discouraged when discussing model reliability.

  15. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examinations of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater-level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not previously been reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  16. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    NARCIS (Netherlands)

    P.C. Austin (Peter); D. van Klaveren (David); Y. Vergouwe (Yvonne); D. Nieboer (Daan); D.S. Lee (Douglas); E.W. Steyerberg (Ewout)

    2016-01-01

    Objective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We

  17. [Low grade renal trauma (Part II): diagnostic validity of ultrasonography].

    Science.gov (United States)

    Grill, R; Báca, V; Otcenásek, M; Zátura, F

    2010-04-01

    The aim of the study was to verify whether ultrasonography can be considered a reliable method for the diagnosis of low-grade renal trauma. The group investigated included patients with grade I or grade II blunt renal trauma, as classified by the AAST grading system, in whom ultrasonography alone or in conjunction with computed tomography was used as a primary diagnostic method. B-mode ultrasound with a transabdominal probe working at frequencies of 2.5 to 5.0 MHz was used. Every finding of post-traumatic changes in the renal tissues, i.e., post-contusion hypotonic infiltration of the renal parenchyma or subcapsular haematoma, was included. The results were statistically evaluated by the chi-square test with the level of significance set at 5%, using Epi Info Version 6 CZ software. The group comprised 112 patients (43 women, 69 men) aged between 17 and 82 years (average, 38 years). It was possible to diagnose grade I or grade II renal injury by ultrasonography in only 60 (54%) of them. The statistical significance of ultrasonography as the only imaging method for the diagnosis of low-grade renal injury was not confirmed (p = 0.543). Low-grade renal trauma is a problem from the diagnostic point of view. It usually does not require revision surgery and, if found during repeat surgery for a more serious injury of another organ, it usually does not receive attention. Therefore, the macroscopic presentation of grade I and grade II renal injury is poorly understood, nor are the microscopic findings known, because such traumatised kidneys are not usually removed during revision surgery, and their injuries are not recorded at autopsy of patients who died of multiple trauma either. The results of this study demonstrated that the validity of ultrasonography for the diagnosis of low-grade renal injury is not significant, because this examination can reveal only some renal injuries, such as perirenal haematoma. An injury to the renal parenchyma is also indicated by
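
    The chi-square evaluation mentioned above can be sketched with SciPy on a contingency table; the 2x2 counts below are hypothetical, not the study's raw data.

    ```python
    # Chi-square sketch: association between imaging approach and diagnostic
    # success, with hypothetical 2x2 counts.
    from scipy.stats import chi2_contingency

    table = [[60, 45],    # diagnosed: ultrasonography alone / with CT
             [52, 10]]    # not diagnosed

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}  (significant at 5% if p < 0.05)")
    ```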

  18. Validation of the Colorado Retinopathy of Prematurity Screening Model.

    Science.gov (United States)

    McCourt, Emily A; Ying, Gui-Shuang; Lynch, Anne M; Palestine, Alan G; Wagner, Brandie D; Wymore, Erica; Tomlinson, Lauren A; Binenbaum, Gil

    2018-04-01

    The Colorado Retinopathy of Prematurity (CO-ROP) model uses birth weight, gestational age, and weight gain at the first month of life (WG-28) to predict risk of severe retinopathy of prematurity (ROP). In previous validation studies, the model performed very well, predicting virtually all cases of severe ROP and potentially reducing the number of infants who need ROP examinations, warranting validation in a larger, more diverse population. To validate the performance of the CO-ROP model in a large multicenter cohort. This study is a secondary analysis of data from the Postnatal Growth and Retinopathy of Prematurity (G-ROP) Study, a retrospective multicenter cohort study conducted in 29 hospitals in the United States and Canada between January 2006 and June 2012 of 6351 premature infants who received ROP examinations. Sensitivity and specificity for severe (early treatment of ROP [ETROP] type 1 or 2) ROP, and reduction in infants receiving examinations. The CO-ROP model was applied to the infants in the G-ROP data set with all 3 data points; infants would have received examinations if they met all 3 criteria (birth weight, gestational age, and WG-28 below the model cutoffs) in this large validation cohort. The model requires all 3 criteria to be met to signal a need for examinations, but some infants with a birth weight or gestational age above the thresholds developed severe ROP. Most of these infants who were not detected by the CO-ROP model had obvious deviation in expected weight trajectories or nonphysiologic weight gain. These findings suggest that the CO-ROP model needs to be revised before considering implementation into clinical practice.
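
    A hedged sketch of a three-criterion screening rule of the CO-ROP type: an infant is flagged for examinations only if all three criteria are met. The threshold values below are placeholders, not the published cutoffs.

    ```python
    # Three-criterion screening rule of the CO-ROP type: flag for exams only
    # if all criteria are met. Thresholds are hypothetical placeholders.
    def needs_rop_exam(birth_weight_g, gest_age_wk, wg28_g,
                       bw_max=1500, ga_max=30, wg_max=650):  # hypothetical cutoffs
        return (birth_weight_g <= bw_max
                and gest_age_wk <= ga_max
                and wg28_g <= wg_max)

    print(needs_rop_exam(1100, 28, 500))   # True: all three criteria met
    print(needs_rop_exam(1700, 28, 500))   # False: birth weight above cutoff
    ```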

  19. Validation of limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring - is a separate validation group required?

    NARCIS (Netherlands)

    Proost, J. H.

    Objective: Limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring are usually validated in a separate group of patients, according to published guidelines. The aim of this study is to evaluate the validation of LSM by comparing independent validation with cross-validation.

  20. Comparison of Reliability and Validity of the Depression Anxiety Stress Scales (DASS-21) with the Beck Depression Inventory (BDI-II) and Hospital Anxiety and Depression Scale (HADS) in Breast Cancer

    OpenAIRE

    Bener A; Alsulaiman R; Doodson LG; El Ayoubi HR

    2016-01-01

    Background: No study has been conducted to determine the reliability and validity of the Depression, Anxiety and Stress Scale (DASS-21), Hospital Anxiety and Depression [HADS] and Beck Depression Inventory (BDI-II) among the Arab Breast Cancer population. Aim: The aim of this study was to compare the reliability and validity of the Depression, Anxiety, and Stress scale (DASS-21), the Beck Depression Inventory-(BDI-II) and Hospital Anxiety and Depression Scale (HADS) among Breast Cancer women ...

  1. The generation, validation and testing of a coupled 219-group neutron 36-group gamma ray AMPX-II library

    International Nuclear Information System (INIS)

    Panini, G.C.; Siciliano, F.; Lioi, A.

    1987-01-01

    The main characteristics of a P3 coupled 219-group neutron, 36-group gamma-ray library in the AMPX-II Master Interface Format, obtained by processing ENDF/B-IV data with various AMPX-II System modules, are presented in this note, covering both the processing aspects and the features of the generated component files: neutron, photon, and secondary gamma-ray production cross sections. As far as the neutron data are concerned, 186 data sets are available for the most significant fission products. Results of an additional validation of the neutron data against eighteen benchmark experiments are also given. Some calculational tests on both the neutron and the coupled data emphasize the important role of the secondary gamma-ray data in nuclear criticality safety calculations.

  2. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹

  3. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios is presented.

  4. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

    Junho, choi

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on a ray-tracing algorithm, statistical analysis, tests of real-time system operation, and other technical evaluation processes...

  5. Comparing the dependability and associations with functioning of the DSM-5 Section III trait model of personality pathology and the DSM-5 Section II personality disorder model.

    Science.gov (United States)

    Chmielewski, Michael; Ruggero, Camilo J; Kotov, Roman; Liu, Keke; Krueger, Robert F

    2017-07-01

    Two competing models of personality psychopathology are included in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5; American Psychiatric Association, 2013): the traditional personality disorder (PD) model included in Section II and an alternative trait-based model included in Section III. Numerous studies have examined the validity of the alternative trait model and its official assessment instrument, the Personality Inventory for DSM-5 (PID-5; Krueger, Derringer, Markon, Watson, & Skodol, 2012). However, few studies have directly compared the trait-based model to the traditional PD model empirically in the same dataset. Moreover, to our knowledge, only a single study (Suzuki, Griffin, & Samuel, 2015) has examined the dependability of the PID-5, which is an essential component of construct validity for traits (Chmielewski & Watson, 2009; McCrae, Kurtz, Yamagata, & Terracciano, 2011). The current study directly compared the dependability of the DSM-5 traits, as assessed by the PID-5, and the traditional PD model, as assessed by the Personality Diagnostic Questionnaire-4 (PDQ-4+), in a large undergraduate sample. In addition, it evaluated and compared their associations with functioning, another essential component of personality pathology. In general, our findings indicate that most DSM-5 traits demonstrate high levels of dependability that are superior to the traditional PD model; however, some of the constructs assessed by the PID-5 may be more state-like. The models were roughly equivalent in terms of their associations with functioning. The current results provide additional support for the validity of the PID-5 and the DSM-5 Section III personality pathology model. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
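
    Dependability in this literature is typically quantified as the short-interval retest correlation of a trait scale. A minimal sketch with made-up scores for two administrations of a single (hypothetical) PID-5 trait facet:

```python
import numpy as np

# Hypothetical scores for one trait facet administered twice,
# two weeks apart, to the same respondents.
time1 = np.array([2.1, 3.4, 1.8, 2.9, 3.1, 2.5, 1.2, 3.8])
time2 = np.array([2.3, 3.2, 1.9, 3.0, 2.8, 2.6, 1.4, 3.6])

# Dependability coefficient: the retest correlation over a short interval.
r = np.corrcoef(time1, time2)[0, 1]
print(f"dependability (retest r) = {r:.2f}")
```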

  6. Cost model validation: a technical and cultural approach

    Science.gov (United States)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  7. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    Science.gov (United States)

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as a fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise using the existing intensive-care model in clinical practice to avoid
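
    MDPE and MDAPE, the accuracy metrics used above, can be computed directly from paired observed and predicted serum levels. A minimal sketch, assuming the prediction error is defined relative to the observed value; the concentrations are hypothetical:

```python
import numpy as np

def mdpe_mdape(observed, predicted):
    """Median (signed) and median absolute prediction error, in percent."""
    pe = 100.0 * (predicted - observed) / observed
    return np.median(pe), np.median(np.abs(pe))

# Hypothetical gentamicin serum levels (mg/L): measured vs. model-predicted.
obs  = np.array([8.2, 10.1, 6.5, 9.8, 7.3])
pred = np.array([7.9, 10.6, 6.3, 9.4, 7.6])

mdpe, mdape = mdpe_mdape(obs, pred)
print(f"MDPE = {mdpe:+.2f}%, MDAPE = {mdape:.2f}%")
```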

  8. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    Directory of Open Access Journals (Sweden)

    Anna Gomes

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as a fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise using the existing intensive-care model in clinical practice to

  9. Modelling Energy Loss Mechanisms and a Determination of the Electron Energy Scale for the CDF Run II W Mass Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Riddick, Thomas [Univ. College London, Bloomsbury (United Kingdom)

    2012-06-15

    The calibration of the calorimeter energy scale is vital to measuring the mass of the W boson at CDF Run II. For the second measurement of the W boson mass at CDF Run II, two independent simulations were developed. This thesis presents a detailed description of the modification and validation of the Bremsstrahlung and pair production modelling in one of these simulations, UCL Fast Simulation, comparing to both Geant4 and real data where appropriate. The total systematic uncertainty on the measurement of the W boson mass in the W → eν channel from residual inaccuracies in Bremsstrahlung modelling is estimated as 6.2 ± 3.2 MeV/c², and the total systematic uncertainty from residual inaccuracies in pair production modelling is estimated as 2.8 ± 2.7 MeV/c². Two independent methods are used to calibrate the calorimeter energy scale in UCL Fast Simulation; the results of these two methods are compared to produce a measurement of the Z boson mass as a cross-check on the accuracy of the simulation.

  10. Validation and calibration of structural models that combine information from multiple sources.

    Science.gov (United States)

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
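
    Viewed as estimation, calibration selects the parameter values that minimise a goodness-of-fit objective between model outputs and data. A minimal sketch, assuming a toy one-parameter structural model and hypothetical calibration targets:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical structural model output: predicted cumulative event counts
# as a function of a single unknown transition rate `theta`.
def model_output(theta, t):
    return 100.0 * (1.0 - np.exp(-theta * t))

t_obs = np.array([1.0, 2.0, 5.0, 10.0])
y_obs = np.array([18.0, 33.0, 61.0, 87.0])   # calibration targets

# Calibration as estimation: choose theta minimising squared deviations.
res = minimize(lambda th: np.sum((model_output(th[0], t_obs) - y_obs) ** 2),
               x0=[0.1], bounds=[(1e-6, None)])
print(f"calibrated theta = {res.x[0]:.4f}")
```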

  11. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested with the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residual analysis techniques. (author)

  12. Validation analysis of probabilistic models of dietary exposure to food additives.

    Science.gov (United States)

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group, and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent of brands or the per cent of eating occasions within a food group that contained the additive. Since the three model components each allowed two possible modes of input, the validity of eight (2³) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of the full conceptual models. While the distributions of intake estimates from the models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
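
    The model structure described above (food intake × probability of presence × concentration) lends itself to a direct Monte Carlo implementation. A minimal sketch with hypothetical distribution parameters, not the study's actual inputs:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # simulated individuals/eating occasions

# Hypothetical model components for one additive in one food group:
intake_g = rng.lognormal(mean=4.0, sigma=0.6, size=n)       # food intake, g/day
present = rng.random(n) < 0.35                               # P(additive present)
conc_mg_per_g = rng.lognormal(mean=-3.0, sigma=0.5, size=n)  # concentration

additive_mg = intake_g * present * conc_mg_per_g
print(f"P97.5 intake = {np.percentile(additive_mg, 97.5):.2f} mg/day")
```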

  13. Continuous validation of ASTEC containment models and regression testing

    International Nuclear Information System (INIS)

    Nowack, Holger; Reinke, Nils; Sonnenkalb, Martin

    2014-01-01

    The focus of the ASTEC (Accident Source Term Evaluation Code) development at GRS is primarily on the containment module CPA (Containment Part of ASTEC), whose modelling is to a large extent based on the GRS containment code COCOSYS (COntainment COde SYStem). Validation is usually understood as the approval of the modelling capabilities through calculations of appropriate experiments by external users different from the code developers. During the development process of ASTEC CPA, bugs and unintended side effects may occur, which lead to changes in the results of the initially conducted validation. Due to the involvement of a considerable number of developers in the coding of ASTEC modules, validation of the code alone, even if executed repeatedly, is not sufficient. Therefore, a regression testing procedure has been implemented in order to ensure that the initially obtained validation results remain valid in succeeding code versions. Within the regression testing procedure, calculations of experiments and plant sequences are performed with the same input deck but applying two different code versions. For every test-case the up-to-date code version is compared to the preceding one on the basis of physical parameters deemed to be characteristic for the test-case under consideration. In the case of post-calculations of experiments, a comparison to experimental data is also carried out. Three validation cases from the regression testing procedure are presented within this paper. The very good post-calculation of the HDR E11.1 experiment shows the high quality of the thermal-hydraulics modelling in ASTEC CPA. Aerosol behaviour is validated on the BMC VANAM M3 experiment, and the results also show a very good agreement with experimental data. Finally, iodine behaviour is checked in the validation test-case of the THAI IOD-11 experiment. Within this test-case, the comparison of the ASTEC versions V2.0r1 and V2.0r2 shows how an error was detected by the regression testing procedure.
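
    The regression-testing procedure described here amounts to running the same input deck through two code versions and comparing characteristic physical parameters within a tolerance. A minimal sketch; the parameter name and the 2% relative tolerance are illustrative assumptions:

```python
import numpy as np

def regression_check(reference_run, candidate_run, rel_tol=0.02):
    """Compare characteristic parameters from two code versions.

    Both arguments map parameter names to time-series arrays produced
    from the same input deck; returns the names that drifted.
    """
    failures = []
    for name, ref in reference_run.items():
        new = candidate_run[name]
        rel_diff = np.abs(new - ref) / np.maximum(np.abs(ref), 1e-12)
        if np.max(rel_diff) > rel_tol:
            failures.append(name)
    return failures

# Hypothetical results for one validation test-case:
v1 = {"containment_pressure": np.array([1.0, 1.5, 2.1])}
v2 = {"containment_pressure": np.array([1.0, 1.5, 2.4])}
print(regression_check(v1, v2))   # -> ['containment_pressure']
```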

  14. CIPS Validation Data Plan

    Energy Technology Data Exchange (ETDEWEB)

    Nam Dinh

    2012-03-01

    This report documents the analysis, findings and recommendations resulting from the task 'CIPS Validation Data Plan (VDP)', formulated as a POR4 activity in the CASL VUQ Focus Area (FA), to develop a Validation Data Plan (VDP) for the Crud-Induced Power Shift (CIPS) challenge problem and to provide guidance for the CIPS VDP implementation. The main motivation for carrying out this task at this time in the VUQ FA is to bring together (i) knowledge of the modern view of and capability in VUQ, (ii) knowledge of the physical processes that govern CIPS, and (iii) knowledge of the codes, models, and data available, used, potentially accessible, and/or being developed in CASL for CIPS prediction, in order to devise a practical VDP that effectively supports CASL's mission in CIPS applications.

  15. Prediction of the hardness profile of an AISI 4340 steel cylinder heat-treated by laser - 3D and artificial neural networks modelling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Hadhri, Mahdi; Ouafi, Abderazzak El; Barka, Noureddine [University of Quebec, Rimouski (Canada)

    2017-02-15

    This paper presents a comprehensive approach developed to design an effective prediction model for the hardness profile in the laser surface transformation hardening process. Based on the finite element method and artificial neural networks, the proposed approach is built progressively by (i) examining the laser hardening parameters and conditions known to have an influence on the hardened surface attributes through a structured experimental investigation, (ii) investigating the effects of the laser hardening parameters on the hardness profile through extensive 3D modeling and simulation efforts, and (iii) integrating the hardening process parameters via a neural network model for hardness profile prediction. The experimental validation, conducted on AISI 4340 steel using a commercial 3 kW Nd:YAG laser, confirms the feasibility and efficiency of the proposed approach, leading to an accurate and reliable hardness profile prediction model. With a maximum relative error of about 10% under various practical conditions, the predictive model can be considered effective, especially in the case of a relatively complex system such as the laser surface transformation hardening process.

  16. Prediction of the hardness profile of an AISI 4340 steel cylinder heat-treated by laser - 3D and artificial neural networks modelling and experimental validation

    International Nuclear Information System (INIS)

    Hadhri, Mahdi; Ouafi, Abderazzak El; Barka, Noureddine

    2017-01-01

    This paper presents a comprehensive approach developed to design an effective prediction model for the hardness profile in the laser surface transformation hardening process. Based on the finite element method and artificial neural networks, the proposed approach is built progressively by (i) examining the laser hardening parameters and conditions known to have an influence on the hardened surface attributes through a structured experimental investigation, (ii) investigating the effects of the laser hardening parameters on the hardness profile through extensive 3D modeling and simulation efforts, and (iii) integrating the hardening process parameters via a neural network model for hardness profile prediction. The experimental validation, conducted on AISI 4340 steel using a commercial 3 kW Nd:YAG laser, confirms the feasibility and efficiency of the proposed approach, leading to an accurate and reliable hardness profile prediction model. With a maximum relative error of about 10% under various practical conditions, the predictive model can be considered effective, especially in the case of a relatively complex system such as the laser surface transformation hardening process.

  17. Mortality in severe trauma patients attended by emergency services in Navarre, Spain: validation of a new prediction model and comparison with the Revised Injury Severity Classification Score II.

    Science.gov (United States)

    Ali Ali, Bismil; Lefering, Rolf; Fortún Moral, Mariano; Belzunegui Otano, Tomás

    2018-01-01

    To validate the Mortality Prediction Model of Navarre (MPMN) for predicting death after severe trauma and to compare it with the Revised Injury Severity Classification Score II (RISCII). Retrospective analysis of a cohort of severe trauma patients (New Injury Severity Score >15) who were attended by emergency services in the Spanish autonomous community of Navarre between 2013 and 2015. The outcome variable was 30-day all-cause mortality. Risk was calculated with the MPMN and the RISCII. The performance of each model was assessed with the area under the receiver operating characteristic (ROC) curve and with precision with respect to observed mortality. Calibration was assessed with the Hosmer-Lemeshow test. We included 516 patients. The mean (SD) age was 56 (23) years, and 363 (70%) were males. Ninety patients (17.4%) died within 30 days. The 30-day mortality rates predicted by the MPMN and RISCII were 16.4% and 15.4%, respectively. The areas under the ROC curves were 0.925 (95% CI, 0.902-0.952) for the MPMN and 0.941 (95% CI, 0.921-0.962) for the RISCII (P=0.269, DeLong test). Calibration statistics were 13.6 (P=.09) for the MPMN and 8.9 (P=.35) for the RISCII. Both the MPMN and the RISCII show good ability to discriminate risk and predict 30-day all-cause mortality in severe trauma patients.
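
    Discrimination and calibration as reported above correspond to the area under the ROC curve and the Hosmer-Lemeshow statistic. A minimal sketch on synthetic data (scikit-learn and scipy), with a simple decile-based Hosmer-Lemeshow implementation; the risk values are simulated, not the Navarre cohort:

```python
import numpy as np
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score

def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow statistic over risk deciles."""
    order = np.argsort(p)
    y, p = y[order], p[order]
    hl = 0.0
    for chunk_y, chunk_p in zip(np.array_split(y, groups),
                                np.array_split(p, groups)):
        exp, obs, n = chunk_p.sum(), chunk_y.sum(), len(chunk_p)
        hl += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-12)
    return hl, chi2.sf(hl, groups - 2)

rng = np.random.default_rng(0)
p_pred = rng.uniform(0.01, 0.9, 500)          # hypothetical predicted risks
y = (rng.random(500) < p_pred).astype(int)    # simulated 30-day deaths

print(f"AUC = {roc_auc_score(y, p_pred):.3f}")
hl, p_value = hosmer_lemeshow(y, p_pred)
print(f"HL = {hl:.1f}, p = {p_value:.2f}")
```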

  18. Benchmark validation of statistical models: Application to mediation analysis of imagery and memory.

    Science.gov (United States)

    MacKinnon, David P; Valente, Matthew J; Wurpts, Ingrid C

    2018-03-29

    This article describes benchmark validation, an approach to validating a statistical model. According to benchmark validation, a valid model generates estimates and research conclusions consistent with a known substantive effect. Three types of benchmark validation are described and illustrated with examples: (a) benchmark value, (b) benchmark estimate, and (c) benchmark effect. Benchmark validation methods are especially useful for statistical models with assumptions that are untestable or very difficult to test. Benchmark effect validation methods were applied to evaluate statistical mediation analysis in eight studies using the established effect that increasing mental imagery improves recall of words. Statistical mediation analysis led to conclusions about mediation that were consistent with established theory that increased imagery leads to increased word recall. Benchmark validation based on established substantive theory is discussed as a general way to investigate characteristics of statistical models and a complement to mathematical proof and statistical simulation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
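
    Benchmark effect validation here asks whether a mediation analysis recovers the established imagery-recall effect. A minimal sketch of the product-of-coefficients (a*b) estimator with a percentile bootstrap, on simulated data constructed to contain the benchmark effect (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Simulate the benchmark effect: imagery instructions (X) increase
# imagery use (M), which in turn increases word recall (Y).
x = rng.integers(0, 2, n).astype(float)   # 0 = control, 1 = imagery condition
m = 0.8 * x + rng.normal(size=n)          # mediator: reported imagery use
y = 0.6 * m + rng.normal(size=n)          # outcome: recall (standardised)

def ab_path(x, m, y):
    a = np.polyfit(x, m, 1)[0]            # path a: X -> M
    X = np.column_stack([m, x, np.ones(len(x))])
    b = np.linalg.lstsq(X, y, rcond=None)[0][0]  # path b: M -> Y given X
    return a * b

boot = []                                 # percentile bootstrap for a*b
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(ab_path(x[idx], m[idx], y[idx]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mediated effect a*b = {ab_path(x, m, y):.3f}, "
      f"95% CI [{lo:.3f}, {hi:.3f}]")   # CI excluding 0 recovers the effect
```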

  19. Cross-validation of an employee safety climate model in Malaysia.

    Science.gov (United States)

    Bahari, Siti Fatimah; Clarke, Sharon

    2013-06-01

    Whilst substantial research has investigated the nature of safety climate, and its importance as a leading indicator of organisational safety, much of this research has been conducted with Western industrial samples. The current study focuses on the cross-validation of a safety climate model in the non-Western industrial context of Malaysian manufacturing. The first-order factorial validity of Cheyne et al.'s (1998) [Cheyne, A., Cox, S., Oliver, A., Tomas, J.M., 1998. Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255-271] model was tested, using confirmatory factor analysis, in a Malaysian sample. Results showed that the model fit indices were below accepted levels, indicating that the original Cheyne et al. (1998) safety climate model was not supported. An alternative three-factor model was developed using exploratory factor analysis. Although these findings are not consistent with previously reported cross-validation studies, we argue that previous studies have focused on validation across Western samples, and that the current study demonstrates the need to take account of cultural factors in the development of safety climate models intended for use in non-Western contexts. The results have important implications for the transferability of existing safety climate models across cultures (for example, in global organisations) and highlight the need for future research to examine cross-cultural issues in relation to safety climate. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  20. Surface complexation modeling calculation of Pb(II) adsorption onto the calcined diatomite

    Science.gov (United States)

    Ma, Shu-Cui; Zhang, Ji-Lin; Sun, De-Hui; Liu, Gui-Xia

    2015-12-01

    Removal of noxious heavy metal ions (e.g. Pb(II)) by surface adsorption onto minerals (e.g. diatomite) is an important means of controlling aqueous pollution in the environment. It is therefore essential to understand the surface adsorptive behavior and mechanism. In this work, the apparent surface complexation reaction equilibrium constants of Pb(II) on calcined diatomite and the distributions of Pb(II) surface species were investigated through modeling calculations based on a diffuse double layer model (DLM) with three amphoteric sites. Batch experiments were used to study the adsorption of Pb(II) onto the calcined diatomite as a function of pH (3.0-7.0) and at different ionic strengths (0.05 and 0.1 mol L⁻¹ NaCl) under ambient atmosphere. Adsorption of Pb(II) can be well described by Freundlich isotherm models. The apparent surface complexation equilibrium constants (log K) were obtained by fitting the batch experimental data using the PEST 13.0 and PHREEQC 3.1.2 codes together, and there is good agreement between measured and predicted data. The distribution of Pb(II) surface species on the diatomite calculated by the PHREEQC 3.1.2 program indicates that the impurity cations (e.g. Al³⁺, Fe³⁺, etc.) in the diatomite play a leading role in the Pb(II) adsorption, and that complex formation together with additional electrostatic interaction is the main adsorption mechanism of Pb(II) on the diatomite under weakly acidic conditions.
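
    The Freundlich isotherm mentioned above, q = K_F · C^(1/n), can be fitted to batch adsorption data by non-linear least squares. A minimal sketch with scipy; the Pb(II) data points are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

def freundlich(c_eq, k_f, n):
    """Freundlich isotherm: q = K_F * C_eq**(1/n)."""
    return k_f * c_eq ** (1.0 / n)

# Hypothetical batch data: equilibrium Pb(II) concentration (mg/L)
# and amount adsorbed on calcined diatomite (mg/g).
c_eq = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
q = np.array([3.1, 4.9, 6.6, 8.9, 12.1])

(k_f, n), _ = curve_fit(freundlich, c_eq, q, p0=[1.0, 2.0])
print(f"K_F = {k_f:.2f}, n = {n:.2f}")
```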

  1. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  2. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  3. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland shows a clear layering. The observed layers from the radar data can be used as an in-situ validation ... correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from ...

  4. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels, and for temperature gradients with respect to space in the flow direction, that are investigated by direct infrared imaging, showing that even at low current operation such gradients are present in fuel cell operation, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited due to induced high pressure drops in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones with good qualitative and quantitative agreement. The combination of accuracy and low computational time allow for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)

  5. Some guidance on preparing validation plans for the DART Full System Models.

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy (Sandia National Laboratories, Albuquerque, NM)

    2009-03-01

    Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

  6. Validation of Self-Report Impairment Measures for Section III Obsessive-Compulsive and Avoidant Personality Disorders.

    Science.gov (United States)

    Liggett, Jacqueline; Carmichael, Kieran L C; Smith, Alexander; Sellbom, Martin

    2017-01-01

    This study examined the validity of newly developed disorder-specific impairment scales (IS), modeled on the Level of Personality Functioning Scale, for obsessive-compulsive (OCPD) and avoidant (AvPD) personality disorders. The IS focused on content validity (items directly reflected the disorder-specific impairments listed in DSM-5 Section III) and severity of impairment. A community sample of 313 adults completed personality inventories indexing the DSM-5 Sections II and III diagnostic criteria for OCPD and AvPD, as well as measures of impairment in the domains of self- and interpersonal functioning. Results indicated that both impairment measures (for AvPD in particular) showed promise in their ability to measure disorder-specific impairment, demonstrating convergent validity with their respective Section II counterparts and discriminant validity with their noncorresponding Section II disorder and with each other. The pattern of relationships between scores on the IS and scores on external measures of personality functioning, however, did not indicate that it is useful to maintain a distinction between impairment in the self- and interpersonal domains, at least for AvPD and OCPD.

  7. Studying Validity of Single-Fluid Model in Inertial Confinement Fusion

    International Nuclear Information System (INIS)

    Gu Jian-Fa; Fan Zheng-Feng; Dai Zhen-Sheng; Ye Wen-Hua; Pei Wen-Bing; Zhu Shao-Ping

    2014-01-01

    The validity of the single-fluid model in inertial confinement fusion simulations is studied by comparing the results of the multi- and single-fluid models. The multi-fluid model includes the effects of collision and interpenetration between fluid species. By simulating the collision of fluid species, steady-state shock propagation into thin DT gas, and the expansion of a hohlraum Au wall heated by lasers, the results show that the validity of the single-fluid model depends strongly on the ratio of the characteristic length of the simulated system to the particle mean free path. When the characteristic length L is one order of magnitude larger than the mean free path λ, the single-fluid model's results are found to be in good agreement with the multi-fluid model's simulations, and the single-fluid modelling remains valid. If the value of L/λ is lower than 10, the interpenetration between fluid species is significant, and the single-fluid simulations show some unphysical results, while the multi-fluid model can describe the interpenetration and mix phenomena well and gives more reasonable results. (physics of gases, plasmas, and electric discharges)

  8. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  9. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities, and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.

  10. Validating a continental-scale groundwater diffuse pollution model using regional datasets.

    Science.gov (United States)

    Ouedraogo, Issoufou; Defourny, Pierre; Vanclooster, Marnik

    2017-12-11

    In this study, we assess the validity of an African-scale groundwater pollution model for nitrates. In a previous study, we identified a statistical continental-scale groundwater pollution model for nitrate. The model was identified using a pan-African meta-analysis of available nitrate groundwater pollution studies. The model was implemented in both Random Forest (RF) and multiple regression formats. For both approaches, we collected as predictors a comprehensive GIS database of 13 spatial attributes, related to land use, soil type, hydrogeology, topography, climatology, region typology, nitrogen fertiliser application rate, and population density. In this paper, we validate the continental-scale model of groundwater contamination by using a nitrate measurement dataset from three African countries. We discuss the issues of data availability, quality, and scale as challenges in validation. Notwithstanding that the modelling procedure exhibited very good success using a continental-scale dataset (e.g. R² = 0.97 in the RF format using a cross-validation approach), the continental-scale model could not be used without recalibration to predict nitrate pollution at the country scale using regional data. In addition, when recalibrating the model using country-scale datasets, the order of model exploratory factors changes. This suggests that the structure and the parameters of a statistical spatially distributed groundwater degradation model for the African continent are strongly scale dependent.
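
    The cross-validated Random Forest performance quoted above can be estimated with standard tooling. A minimal sketch using scikit-learn on synthetic stand-ins for the spatial predictors (the feature set and data are hypothetical):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 250

# Synthetic stand-ins for spatial attributes (e.g. fertiliser rate,
# population density, recharge, soil class score).
X = rng.normal(size=(n, 4))
nitrate = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=n)

rf = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(rf, X, nitrate, cv=5, scoring="r2")
print(f"cross-validated R2 = {scores.mean():.2f} +/- {scores.std():.2f}")
```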

  11. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational fluid dynamics (CFD) spectral gas radiation models for air combustion are therefore outside their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review of fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models against high-resolution spectral measurements in a gas cell, with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example of a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and ranks the proposed models by their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4 to 5.4 μm of water vapor and carbon dioxide, in the temperature range from 727 °C to 1500 °C and at different concentrations, were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical narrow-band models and the exponential wide-band model. The two statistical narrow-band models EM2C and RADCAL showed good agreement, with a maximal band transmissivity deviation of 3%. The exponential wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as a reference model for the

  12. A Qualitative Study on the Content Validity of the Social Capital Scales in the Copenhagen Psychosocial Questionnaire (COPSOQ II

    Directory of Open Access Journals (Sweden)

    Hanne Berthelsen

    2016-06-01

    The Copenhagen Psychosocial Questionnaire (COPSOQ II) includes scales for measuring workplace social capital. The overall aim of this article is to evaluate the content validity of the following scales: horizontal trust, vertical trust, and justice, based on data from cognitive interviews using a think-aloud procedure. Informants were selected to achieve variation in gender, age, region of residence, and occupation. A predetermined coding scheme was used to identify: (1) perspective (reflection on behalf of oneself only, or abstraction to a broader perspective); (2) use of response options; (3) contexts challenging the process of answering; and (4) overall reflections included in the retrieval and judgement processes leading to an answer for each item. The results showed that (1) the intended shift from an individual to a broader perspective worked for eight out of eleven items; (2) the response option 'balancing in the middle' covered different meanings, and retrieval of the information needed to answer constituted a problem in four out of eleven items; (3) three contextually challenging situations were identified; and (4) for most items the reflections corresponded well with the intention of the scales, though the items asking about withheld information caused more problems in answering and showed lower content validity compared to the other items of the scales. In general, the findings supported the content validity of the COPSOQ II measurement of workplace social capital as a group construct. The study opens up new insights into how concepts and questions are understood and answered among people coming from different occupations and organizational settings.

  13. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    ... There are, however, many advantages that could be harvested from such knowledge, like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures...

  14. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    International Nuclear Information System (INIS)

    Apostolakis, J; Burkhardt, H; Ivanchenko, V N; Asai, M; Bagulya, A; Grichine, V; Brown, J M C; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Jacquemier, J; Guatelli, S; Incerti, S; Kadri, O; Maire, M; Urban, L; Pandola, L; Sawkey, D; Toshito, T; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed. (paper)

  15. Accounting for treatment use when validating a prognostic model: a simulation study.

    Science.gov (United States)

    Pajouheshnia, Romin; Peelen, Linda M; Moons, Karel G M; Reitsma, Johannes B; Groenwold, Rolf H H

    2017-07-14

    Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance and present the uses and limitations of available analytical methods to account for this using simulated data. We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and the use of inverse probability weighting (IPW) on the estimated model discrimination (c-index) and calibration (observed:expected ratio and calibration plots) in scenarios with different patterns and effects of treatment use. Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk when the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals, and should not be ignored. When treatment use is random, treated
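
    The IPW correction discussed above reweights untreated individuals by the inverse probability of remaining untreated, so that they represent the treatment-naive target population. A minimal sketch on simulated data in which treatment depends on risk and halves it; the propensity model and effect sizes are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 2000

risk_score = rng.normal(size=n)                      # model's linear predictor
p_treat = 1 / (1 + np.exp(-(risk_score - 0.5)))      # treatment depends on risk
treated = rng.random(n) < p_treat
p_event = 1 / (1 + np.exp(-(risk_score - 1.0)))
p_event = np.where(treated, 0.5 * p_event, p_event)  # treatment halves risk
y = (rng.random(n) < p_event).astype(int)

# Propensity of treatment; untreated get weight 1/(1 - ps) so they
# stand in for the whole (treatment-naive) target population.
Xp = risk_score.reshape(-1, 1)
ps = LogisticRegression().fit(Xp, treated).predict_proba(Xp)[:, 1]
w = 1.0 / (1.0 - ps[~treated])

print(f"naive AUC (all, treatment ignored): "
      f"{roc_auc_score(y, risk_score):.3f}")
print(f"IPW AUC (untreated, reweighted):    "
      f"{roc_auc_score(y[~treated], risk_score[~treated], sample_weight=w):.3f}")
```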

  16. Validation of dispersion model of RTARC-DSS based on ''KIT'' field experiments

    International Nuclear Information System (INIS)

    Duran, J.

    2000-01-01

    The aim of this study is to present the performance of the Gaussian dispersion model RTARC-DSS (Real Time Accident Release Consequences - Decision Support System) on the 'Kit' field experiments. The Model Validation Kit is a collection of experimental data sets from the Kincaid, Copenhagen and Lillestrom experimental campaigns, supplemented by the Indianapolis campaign, accompanied by software for model evaluation. The validation of the model has been performed on the basis of the maximum arc-wise concentrations, using the bootstrap resampling procedure to estimate the variation of the model residuals. Validation was performed for short-range distances (about 1-10 km; maximum for the Kincaid data set, 50 km from the source). The model evaluation procedure and the degree of relative over- or under-prediction of the model are discussed. (author)
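
    Bootstrap resampling of the model residuals, as used above, yields a confidence interval for the average over- or under-prediction. A minimal sketch on hypothetical paired arc-wise maximum concentrations, using log residuals as is common for dispersion-model evaluation:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical paired maximum arc-wise concentrations (observed, modelled).
c_obs = np.array([4.2, 2.9, 6.1, 3.3, 5.0, 2.2, 7.4, 3.9])
c_mod = np.array([3.8, 3.4, 5.2, 3.9, 4.1, 2.6, 6.3, 4.5])

residuals = np.log(c_mod / c_obs)   # > 0 means over-prediction

# Bootstrap the mean log residual to gauge over-/under-prediction.
boot = [rng.choice(residuals, residuals.size, replace=True).mean()
        for _ in range(5000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mean log residual = {residuals.mean():.3f}, "
      f"95% CI [{lo:.3f}, {hi:.3f}]")
```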

  17. A comprehensive model for the prediction of vibrations due to underground railway traffic: formulation and validation

    International Nuclear Information System (INIS)

    Costa, Pedro Alvares; Cardoso Silva, Antonio; Calçada, Rui; Lopes, Patricia; Fernandez, Jesus

    2016-01-01

    In this communication, a numerical approach for the prediction of vibrations induced in buildings by railway traffic in tunnels is presented. The numerical model is based on the concept of dynamic substructuring, being composed of three autonomous models that simulate the following main parts of the problem: i) generation of vibrations (train-track interaction); ii) propagation of vibrations (track-tunnel-ground system); iii) reception of vibrations (building coupled to the ground). The proposed methodology allows dealing with the three-dimensional characteristics of the problem with a reasonable computational effort [1, 2]. After a brief description of the model, its experimental validation is performed. For that purpose, a case study of vibrations inside a building close to a shallow railway tunnel in Madrid is simulated and the experimental data [3] are compared with the predicted results [4]. Finally, the communication finishes with some insights into the potentialities and challenges of this numerical modelling approach for predicting the behaviour of ancient structures subjected to vibrations induced by human sources (railway and road traffic, pile driving, etc.).

  18. Method for appraising model validity of randomised controlled trials of homeopathic treatment: multi-rater concordance study

    Science.gov (United States)

    2012-01-01

    Background A method for assessing the model validity of randomised controlled trials of homeopathy is needed. To date, only conventional standards for assessing intrinsic bias (internal validity) of trials have been invoked, with little recognition of the special characteristics of homeopathy. We aimed to identify relevant judgmental domains to use in assessing the model validity of homeopathic treatment (MVHT). We define MVHT as the extent to which a homeopathic intervention and the main measure of its outcome, as implemented in a randomised controlled trial (RCT), reflect 'state-of-the-art' homeopathic practice. Methods Using an iterative process, an international group of experts developed a set of six judgmental domains, with associated descriptive criteria. The domains address: (I) the rationale for the choice of the particular homeopathic intervention; (II) the homeopathic principles reflected in the intervention; (III) the extent of homeopathic practitioner input; (IV) the nature of the main outcome measure; (V) the capability of the main outcome measure to detect change; (VI) the length of follow-up to the endpoint of the study. Six papers reporting RCTs of homeopathy of varying design were randomly selected from the literature. A standard form was used to record each assessor's independent response per domain, using the optional verdicts 'Yes', 'Unclear', 'No'. Concordance among the eight verdicts per domain, across all six papers, was evaluated using the kappa (κ) statistic. Results The six judgmental domains enabled MVHT to be assessed with 'fair' to 'almost perfect' concordance in each case. For the six RCTs examined, the method allowed MVHT to be classified overall as 'acceptable' in three, 'unclear' in two, and 'inadequate' in one. Conclusion Future systematic reviews of RCTs in homeopathy should adopt the MVHT method as part of a complete appraisal of trial validity. PMID:22510227
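
    Concordance of assessors' verdicts per domain was evaluated with the kappa statistic. A minimal sketch for one pair of raters using Cohen's kappa (the study pooled verdicts from several raters, for which a multi-rater generalisation such as Fleiss' kappa would be needed; the verdicts below are hypothetical):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical verdicts from two assessors on domains I-VI of one RCT.
assessor_1 = ["Yes", "Yes", "Unclear", "No", "Yes", "Unclear"]
assessor_2 = ["Yes", "Yes", "Unclear", "Yes", "Yes", "No"]

kappa = cohen_kappa_score(assessor_1, assessor_2)
print(f"kappa = {kappa:.2f}")   # e.g. 0.61-0.80 reads as 'substantial'
```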

  19. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and the validity analysis of the simulation model is still lacking. The developed validation paradigm has a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, although they are aimed at real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, then the simulation is deemed 'not invalid'. If the simulation model fails to meet the criteria, the model is deemed invalid, and model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model and defined the limits of application. The tested simulation model is found to be acceptable, but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis, but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time- and cost-efficient simulation projects with

  20. The validation of evacuation simulation models through the analysis of behavioural uncertainty

    International Nuclear Information System (INIS)

    Lovreglio, Ruggiero; Ronchi, Enrico; Borri, Dino

    2014-01-01

    Both experimental and simulation data on fire evacuation are influenced by a component of uncertainty caused by the impact of the unexplained variance in human behaviour, namely behavioural uncertainty (BU). Evacuation model validation studies should include the study of this type of uncertainty during the comparison of experiments and simulation results. An evacuation model validation procedure is introduced in this paper to study the impact of BU. This methodology is presented through a case study for the comparison between repeated experimental data and simulation results produced by FDS+Evac, an evacuation model for the simulation of human behaviour in fire, which makes use of distribution laws. - Highlights: • Validation of evacuation models is investigated. • Quantitative evaluation of behavioural uncertainty is performed. • A validation procedure is presented through an evacuation case study

  1. Construct validity of the ovine model in endoscopic sinus surgery training.

    Science.gov (United States)

    Awad, Zaid; Taghi, Ali; Sethukumar, Priya; Tolley, Neil S

    2015-03-01

To demonstrate construct validity of the ovine model as a tool for training in endoscopic sinus surgery (ESS). Prospective, cross-sectional evaluation study. Over 18 consecutive months, trainees and experts were evaluated in their ability to perform a range of tasks (based on previous face validation and descriptive studies conducted by the same group) relating to ESS on the sheep-head model. Anonymized randomized video recordings of the above were assessed by two independent and blinded assessors. A validated assessment tool utilizing a five-point Likert scale was employed. Construct validity was calculated by comparing scores across training levels and experts using mean and interquartile range of global and task-specific scores. Subgroup analysis of the intermediate group ascertained previous experience. Nonparametric descriptive statistics were used, and analysis was carried out using SPSS version 21 (IBM, Armonk, NY). Reliability of the assessment tool was confirmed. The model discriminated well between different levels of expertise in global and task-specific scores. A positive correlation was noted between year in training and both global and task-specific scores. Within the intermediate group, previous experience was variable, and the number of ESS procedures performed under supervision had the highest impact on performance. This study describes an alternative model for ESS training and assessment. It is also the first to demonstrate construct validity of the sheep-head model for ESS training.
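
The nonparametric group comparison underlying this kind of construct-validity claim can be illustrated with a Kruskal-Wallis test in Python (the study used SPSS); the scores below are hypothetical stand-ins, not the study's data:

```python
from scipy import stats

# Hypothetical global scores (five-point Likert based) by experience level
novices       = [2.1, 2.4, 1.9, 2.6, 2.2]
intermediates = [3.0, 3.4, 2.8, 3.6, 3.1]
experts       = [4.5, 4.7, 4.2, 4.8, 4.6]

# Kruskal-Wallis tests whether score distributions differ across groups;
# a significant result supports the model's ability to discriminate expertise.
h, p = stats.kruskal(novices, intermediates, experts)
print(f"H = {h:.2f}, p = {p:.4f}")
```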

  2. Validating a Technology Enhanced Student-Centered Learning Model

    Science.gov (United States)

    Kang, Myunghee; Hahn, Jungsun; Chung, Warren

    2015-01-01

    The Technology Enhanced Student Centered Learning (TESCL) Model in this study presents the core factors that ensure the quality of learning in a technology-supported environment. Although the model was conceptually constructed using a student-centered learning framework and drawing upon previous studies, it should be validated through real-world…

  3. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    KAUST Repository

    Miki, Kenji

    2015-10-01

The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.
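
A sketch of the forward-propagation step described here, with a toy quantity-of-interest model and synthetic posterior samples standing in for the calibrated thermochemical parameters (none of the values below come from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def qoi_model(theta, scenario_T=8000.0):
    # Hypothetical stand-in for a radiative heat flux prediction; the real
    # QoI comes from a nonequilibrium flow solver, not this toy expression.
    a, b = theta
    return a * np.exp(-b * 1.0e4 / scenario_T)

# Stand-in for posterior samples obtained from Bayesian calibration
posterior = rng.multivariate_normal(mean=[2.0, 1.5],
                                    cov=[[0.04, 0.0], [0.0, 0.09]],
                                    size=5000)

# Propagate parameter uncertainty to the QoI and summarize the prediction;
# a wide credible interval here is what the paper flags as invalidating.
qoi = np.array([qoi_model(theta) for theta in posterior])
lo, hi = np.percentile(qoi, [2.5, 97.5])
print(f"QoI mean = {qoi.mean():.3f}, 95% credible interval = [{lo:.3f}, {hi:.3f}]")
```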

  4. Stochastic modeling of oligodendrocyte generation in cell culture: model validation with time-lapse data

    Directory of Open Access Journals (Sweden)

    Noble Mark

    2006-05-01

Background The purpose of this paper is two-fold. The first objective is to validate the assumptions behind a stochastic model developed earlier by these authors to describe oligodendrocyte generation in cell culture. The second is to generate time-lapse data that may help biomathematicians to build stochastic models of cell proliferation and differentiation under other experimental scenarios. Results Using time-lapse video recording it is possible to follow the individual evolutions of different cells within each clone. This experimental technique is very laborious and cannot replace model-based quantitative inference from clonal data. However, it is unrivalled in validating the structure of a stochastic model intended to describe cell proliferation and differentiation at the clonal level. In this paper, such data are reported and analyzed for oligodendrocyte precursor cells cultured in vitro. Conclusion The results strongly support the validity of the most basic assumptions underpinning the previously proposed model of oligodendrocyte development in cell culture. However, there are some discrepancies; the most important is that the contribution of progenitor cell death to cell kinetics in this experimental system has been underestimated.
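
The clonal structure such a stochastic model assumes can be pictured with a toy branching process; the division, differentiation and death probabilities below are purely illustrative, not the paper's fitted values:

```python
import random

def simulate_clone(p_divide=0.5, p_differentiate=0.35, max_steps=10):
    """Toy discrete-time branching process for one clone: at each step every
    progenitor divides, differentiates into an oligodendrocyte, or dies
    (death probability is the remainder, 0.15 here)."""
    progenitors, oligodendrocytes = 1, 0
    for _ in range(max_steps):
        next_gen = 0
        for _ in range(progenitors):
            u = random.random()
            if u < p_divide:
                next_gen += 2
            elif u < p_divide + p_differentiate:
                oligodendrocytes += 1
            # else: the cell dies
        progenitors = next_gen
    return progenitors, oligodendrocytes

random.seed(1)
clones = [simulate_clone() for _ in range(1000)]
mean_olig = sum(o for _, o in clones) / len(clones)
print(f"mean oligodendrocytes per clone after 10 steps: {mean_olig:.2f}")
```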

  5. Chemical speciation of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) binary complexes of L-methionine in 1,2-propanediol-water mixtures

    Directory of Open Access Journals (Sweden)

    M. Padma Latha

    2007-04-01

Chemical speciation of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) complexes of L-methionine in 0.0-60% v/v 1,2-propanediol-water mixtures maintaining an ionic strength of 0.16 M at 303 K has been studied pH-metrically. The active forms of the ligand are LH2+, LH and L-. The predominant species detected are ML, MLH, ML2, ML2H, ML2H2 and MLOH. Models containing different numbers of species were refined by using the computer program MINIQUAD 75. The best-fit chemical models were arrived at based on statistical parameters. The trend in variation of complex stability constants with change in the dielectric constant of the medium is explained on the basis of electrostatic and non-electrostatic forces.

  6. Athletes' Perceptions of Coaching Competency Scale II-High School Teams

    Science.gov (United States)

    Myers, Nicholas D.; Chase, Melissa A.; Beauchamp, Mark R.; Jackson, Ben

    2010-01-01

    The purpose of this validity study was to improve measurement of athletes' evaluations of their head coach's coaching competency, an important multidimensional construct in models of coaching effectiveness. A revised version of the Coaching Competency Scale (CCS) was developed for athletes of high school teams (APCCS II-HST). Data were collected…

  7. PIO I-II tendencies. Part 2. Improving the pilot modeling

    Directory of Open Access Journals (Sweden)

    Ioan URSU

    2011-03-01

The study is conceived in two parts and aims to make some contributions to the problem of PIO aircraft susceptibility analysis. Part I, previously published in this journal, highlighted the main steps of deriving a complex model of the human pilot. The current Part II of the paper considers a proper procedure for synthesis of the human pilot mathematical model in order to analyze PIO II type susceptibility of a VTOL-type aircraft, related to the presence of a position- and rate-limited actuator. The mathematical tools are those of semi-global stability theory developed in recent works.

  8. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made by using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain.

  9. Validation Hydrodynamic Models of Three Topological Models of Secondary Facultative Ponds

    OpenAIRE

    Aponte-Reyes Alxander

    2014-01-01

A methodology was developed to analyze the boundary conditions, mesh size and turbulence of a CFD mathematical model that could explain the hydrodynamic behavior of facultative stabilization ponds, FSP, built at pilot scale: conventional pond, CP, baffled pond, BP, and baffled-mesh pond, BMP. Dispersion studies were performed in the field for model validation, taking samples into and out of the FSP; the information was used to carry out CFD model simulations of the three topologies. ...

  10. Validating a perceptual distraction model in a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren

    2017-01-01

    This paper focuses on validating a perceptual distraction model, which aims to predict user’s perceived distraction caused by audio-on-audio interference, e.g., two competing audio sources within the same listening space. Originally, the distraction model was trained with music-on-music stimuli...... using a simple loudspeaker setup, consisting of only two loudspeakers, one for the target sound source and the other for the interfering sound source. Recently, the model was successfully validated in a complex personal sound-zone system with speech-on-music stimuli. Second round of validations were...... conducted by physically altering the sound-zone system and running a set of new listening experiments utilizing two sound zones within the sound-zone system. Thus, validating the model using a different sound-zone system with both speech-on-music and music-on-speech stimuli sets. Preliminary results show...

  11. Modelling Zn(II) sorption onto clayey sediments using a multi-site ion-exchange model

    International Nuclear Information System (INIS)

    Tertre, E.; Beaucaire, C.; Coreau, N.; Juery, A.

    2009-01-01

In environmental studies, it is necessary to be able to predict the behaviour of contaminants in more or less complex physico-chemical contexts. The improvement of this prediction partly depends on establishing thermodynamic models that can describe the behaviour of these contaminants and, in particular, the sorption reactions on mineral surfaces. In this way, based on the mass action law, it is possible to use surface complexation models and ion exchange models. Therefore, the aim of this study is (i) to develop an ion-exchange model able to describe the sorption of transition metals onto pure clay minerals and (ii) to test the ability of this approach to predict the sorption of these elements onto natural materials containing clay minerals (i.e. soils/sediments) under various chemical conditions. This study is focused on the behaviour of Zn(II) in the presence of clayey sediments. Considering that clay minerals are cation exchangers containing multiple sorption sites, it is possible to interpret the sorption of Zn(II), as well as competitor cations, by ion-exchange equilibria with the clay minerals. This approach is applied with success to interpret the experimental data obtained previously in the Zn(II)-H+-Na+-montmorillonite system. The authors' research team has already studied the behaviour of Na+, K+, Ca2+ and Mg2+ versus pH in terms of ion exchange onto pure montmorillonite, leading to the development of a thermodynamic database including the exchange site concentrations associated with montmorillonite and the selectivity coefficients of Na+, K+, Ca2+, Mg2+, and Zn2+ versus H+. In the present study, experimental isotherms of Zn(II) on two different sediments in batch reactors at different pH and ionic strengths, using NaCl and CaSO4 as electrolytes, are reported. Assuming clay minerals are the main ion-exchanging phases, it is possible to predict Zn(II) sorption onto sediments under different experimental conditions, using the previously
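
As a minimal illustration of the ion-exchange formalism (reduced to a single site, with a hypothetical Gaines-Thomas selectivity coefficient and activity corrections ignored, so deliberately simpler than the paper's multi-site model), the exchanger composition follows from the mass action law:

```python
from scipy.optimize import brentq

def zn_equivalent_fraction(K, c_na, c_zn):
    """Equivalent fraction of Zn on the exchanger for the exchange
    2 NaX + Zn2+ <-> ZnX2 + 2 Na+ with selectivity coefficient K,
    assuming fixed solution concentrations (large solution reservoir)."""
    f = lambda e_zn: e_zn * c_na**2 - K * (1.0 - e_zn)**2 * c_zn
    return brentq(f, 1e-12, 1.0 - 1e-12)

# Hypothetical selectivity and trace-Zn solution compositions (mol/L);
# raising the background Na+ suppresses Zn uptake, as in the isotherms.
for c_na in (0.01, 0.1, 1.0):
    e = zn_equivalent_fraction(K=10.0, c_na=c_na, c_zn=1.0e-5)
    print(f"[Na+] = {c_na:5.2f} M -> E_Zn = {e:.4f}")
```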

  12. The Edinburgh Postnatal Depression Scale: translation and validation for a Greek sample

    Directory of Open Access Journals (Sweden)

    Kogevinas Manolis

    2009-09-01

Background Edinburgh Postnatal Depression Scale (EPDS) is an important screening instrument that is used routinely with mothers during the postpartum period for early identification of postnatal depression. The purpose of this study was to validate the Greek version of EPDS along with sensitivity, specificity and predictive values. Methods 120 mothers within 12 weeks postpartum were recruited from the perinatal care registers of the Maternity Departments of 4 Hospitals of Heraklion municipality, Greece. EPDS and Beck Depression Inventory-II (BDI-II) surveys were administered in random order to the mothers. Each mother was diagnosed with depression according to the validated Greek version of BDI-II. The psychometric measurements that were performed included: two independent samples t-tests, one-way analysis of variance (ANOVA), reliability coefficients, exploratory factor analysis using a Varimax rotation and Principal Components Method. Confirmatory analysis of principal components -known as structural equation modelling- was conducted by LISREL (Linear Structural Relations). A receiver operating characteristic (ROC) analysis was carried out to evaluate the global functioning of the scale. Results 8 (6.7%) of the mothers were diagnosed with major postnatal depression, 14 (11.7%) with moderate and 38 (31.7%) with mild depression on the basis of BDI-II scores. The internal consistency of the EPDS Greek version -using Cronbach's alpha coefficient- was found to be 0.804 and that of the Guttman split-half coefficient 0.742. Our findings confirm the multidimensionality of EPDS, demonstrating a two-factor structure which contained subscales reflecting depressive symptoms and anxiety. The confirmatory factor analysis demonstrated that the two-factor model offered a very good fit to our data. The area under the ROC curve (AUC) was found to be 0.7470, and the logistic estimate for the threshold score of 8/9 fitted the model sensitivity at 76.7% and model specificity at 68.3%.

  13. The Edinburgh Postnatal Depression Scale: translation and validation for a Greek sample.

    Science.gov (United States)

    Vivilaki, Victoria G; Dafermos, Vassilis; Kogevinas, Manolis; Bitsios, Panos; Lionis, Christos

    2009-09-09

Edinburgh Postnatal Depression Scale (EPDS) is an important screening instrument that is used routinely with mothers during the postpartum period for early identification of postnatal depression. The purpose of this study was to validate the Greek version of EPDS along with sensitivity, specificity and predictive values. 120 mothers within 12 weeks postpartum were recruited from the perinatal care registers of the Maternity Departments of 4 Hospitals of Heraklion municipality, Greece. EPDS and Beck Depression Inventory-II (BDI-II) surveys were administered in random order to the mothers. Each mother was diagnosed with depression according to the validated Greek version of BDI-II. The psychometric measurements that were performed included: two independent samples t-tests, one-way analysis of variance (ANOVA), reliability coefficients, exploratory factor analysis using a Varimax rotation and Principal Components Method. Confirmatory analysis -known as structural equation modelling- of principal components was conducted by LISREL (Linear Structural Relations). A receiver operating characteristic (ROC) analysis was carried out to evaluate the global functioning of the scale. 8 (6.7%) of the mothers were diagnosed with major postnatal depression, 14 (11.7%) with moderate and 38 (31.7%) with mild depression on the basis of BDI-II scores. The internal consistency of the EPDS Greek version -using Cronbach's alpha coefficient- was found to be 0.804 and that of the Guttman split-half coefficient 0.742. Our findings confirm the multidimensionality of EPDS, demonstrating a two-factor structure which contained subscales reflecting depressive symptoms and anxiety. The confirmatory factor analysis demonstrated that the two-factor model offered a very good fit to our data. The area under the ROC curve (AUC) was found to be 0.7470, and the logistic estimate for the threshold score of 8/9 fitted the model sensitivity at 76.7% and model specificity at 68.3%. Our data confirm the validity of the Greek
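
A sketch of the ROC analysis and threshold evaluation described in the two records above, using scikit-learn and hypothetical score data rather than the study's:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical EPDS total scores for diagnosed and non-diagnosed mothers
depressed = rng.normal(12, 3, 60).clip(0, 30)
healthy = rng.normal(6, 3, 60).clip(0, 30)
scores = np.concatenate([depressed, healthy])
truth = np.array([1] * 60 + [0] * 60)      # BDI-II-based diagnosis as reference

print(f"AUC = {roc_auc_score(truth, scores):.3f}")

# Sensitivity/specificity at a candidate threshold of 8/9 (positive if >= 9)
pred = scores >= 9
sens = pred[truth == 1].mean()
spec = (~pred[truth == 0]).mean()
print(f"threshold 8/9: sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```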

  14. Accounting for treatment use when validating a prognostic model: a simulation study

    Directory of Open Access Journals (Sweden)

    Romin Pajouheshnia

    2017-07-01

Background Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance and present the uses and limitations of available analytical methods to account for this using simulated data. Methods We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and the use of inverse probability weighting (IPW) on the estimated model discrimination (c-index) and calibration (observed:expected ratio and calibration plots) in scenarios with different patterns and effects of treatment use. Results Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk when the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. Conclusions When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals, and
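
A compact simulation of the IPW correction described above, under assumed data-generating and treatment-assignment mechanisms (all values are synthetic; this is not the paper's simulation design):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 2000
x = rng.normal(size=(n, 2))                           # covariates
base_risk = 1 / (1 + np.exp(-(-1.0 + x[:, 0])))       # untreated ("model") risk
p_treat = 1 / (1 + np.exp(-(-0.5 + x[:, 0])))         # treatment tracks a risk factor
treated = rng.random(n) < p_treat
risk = np.where(treated, 0.6 * base_risk, base_risk)  # treatment lowers risk
outcome = rng.random(n) < risk

# Fit a treatment-propensity model and weight the untreated by 1/(1 - p),
# so they stand in for the full treatment-naive target population.
prop = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]
untreated = ~treated
w = 1.0 / (1.0 - prop[untreated])

# Weighted observed:expected ratio for the prognostic model's predictions;
# a value near 1 indicates the bias from treatment use has been removed.
observed = np.average(outcome[untreated], weights=w)
expected = np.average(base_risk[untreated], weights=w)
print(f"IPW-corrected O:E ratio = {observed / expected:.3f}")
```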

  15. Paleoclimate validation of a numerical climate model

    International Nuclear Information System (INIS)

    Schelling, F.J.; Church, H.W.; Zak, B.D.; Thompson, S.L.

    1994-01-01

    An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the RegCM2 regional climate model, is part of a larger study for DOE's Yucca Mountain Site Characterization Project that is evaluating the impacts of long term future climate change on performance of the potential high level nuclear waste repository at Yucca Mountain. The planned analysis and anticipated results are presented

  16. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg-1 TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs

  17. Making Validated Educational Models Central in Preschool Standards.

    Science.gov (United States)

    Schweinhart, Lawrence J.

    This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…

  18. Validation of the newborn larynx modeling with aerodynamical experimental data.

    Science.gov (United States)

    Nicollas, R; Giordano, J; Garrel, R; Medale, M; Caminat, P; Giovanni, A; Ouaknine, M; Triglia, J M

    2009-06-01

Many authors have studied modeling of the adult larynx, but the mechanisms of newborn voice production have very rarely been investigated. After validating a numerical model with acoustic data, studies were performed on larynges of human fetuses in order to validate this model with aerodynamical experiments. Anatomical measurements were performed and a simplified numerical model was built using Fluent(R) with the vocal folds in phonatory position. The results obtained are in good agreement with those obtained by laser Doppler velocimetry (LDV) and high-frame-rate particle image velocimetry (HFR-PIV) on an experimental bench with excised human fetus larynges. It appears that computing with first-cry physiological parameters leads to a model which is close to those obtained in experiments with real organs.

  19. PEP-II vacuum system pressure profile modeling using EXCEL

    International Nuclear Information System (INIS)

    Nordby, M.; Perkins, C.

    1994-06-01

    A generic, adaptable Microsoft EXCEL program to simulate molecular flow in beam line vacuum systems is introduced. Modeling using finite-element approximation of the governing differential equation is discussed, as well as error estimation and program capabilities. The ease of use and flexibility of the spreadsheet-based program is demonstrated. PEP-II vacuum system models are reviewed and compared with analytical models
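
The finite-element approximation of the governing molecular-flow balance can be sketched with a simple one-dimensional finite-difference analogue; the conductance, outgassing and pumping values below are illustrative placeholders, not PEP-II parameters:

```python
import numpy as np

# One-dimensional steady molecular-flow balance along a beam pipe:
#   c * d2P/dx2 - s * P + q = 0
# c: specific conductance [l*m/s], s: distributed pumping [l/(s*m)],
# q: specific outgassing [Torr*l/(s*m)]. All values are illustrative.
L, n = 10.0, 201
x = np.linspace(0.0, L, n)
h = x[1] - x[0]
c, q = 20.0, 1.0e-9
s = np.where((x > 4.0) & (x < 6.0), 5.0, 0.0)   # a lumped pump near midspan

# Assemble the finite-difference system A @ P = b
A = np.zeros((n, n))
b = np.full(n, -q)
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = c / h**2
    A[i, i] = -2.0 * c / h**2 - s[i]
A[0, 0] = A[-1, -1] = 1.0                        # fixed end pressures
b[0] = b[-1] = 1.0e-9

P = np.linalg.solve(A, b)
print(f"peak pressure: {P.max():.3e} Torr at x = {x[P.argmax()]:.2f} m")
```

A spreadsheet implements the same tridiagonal balance cell by cell, which is why EXCEL suffices for this class of model.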

  20. Validation of computational model ALDERSON/EGSnrc for chest radiography

    International Nuclear Information System (INIS)

    Muniz, Bianca C.; Santos, André L. dos; Menezes, Claudio J.M.

    2017-01-01

To perform dose studies in situations of exposure to radiation without exposing individuals, numerical dosimetry uses Computational Exposure Models (ECMs). Composed essentially of a radioactive source simulation algorithm, a voxel phantom representing the human anatomy and a Monte Carlo code, ECMs must be validated to determine the reliability of the representation of the physical arrangement. The objective of this work is to validate the ALDERSON/EGSnrc ECM through comparisons between the experimental measurements obtained with an ionization chamber and virtual simulations using the Monte Carlo method to determine the ratio of the input and output radiation doses. Preliminary results of these comparisons showed that the ECM reproduced the results of the experimental measurements performed with the physical phantom with a relative error of less than 10%, validating the use of this model for simulations of chest radiographs and estimates of radiation doses in tissues in the irradiated structures.

  1. Development and Validation of Methodology to Model Flow in Ventilation Systems Commonly Found in Nuclear Facilities - Phase II

    Energy Technology Data Exchange (ETDEWEB)

    Strons, Philip [Argonne National Lab. (ANL), Argonne, IL (United States); Bailey, James L. [Argonne National Lab. (ANL), Argonne, IL (United States); Davis, John [Argonne National Lab. (ANL), Argonne, IL (United States); Grudzinski, James [Argonne National Lab. (ANL), Argonne, IL (United States); Hlotke, John [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-03-01

    In this report we present the results of the Phase II analysis and testing of the flow patterns encountered in the Alpha Gamma Hot Cell Facility (AGHCF), as well as the results from an opportunity to expand upon field test work from Phase I by the use of a Class IIIb laser. The addition to the Phase I work is covered before proceeding to the results of the Phase II work, followed by a summary of findings.

  2. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    International Nuclear Information System (INIS)

    Baeverstam, U.; Davis, P.; Garcia-Olivares, A.; Henrich, E.; Koch, J.

    1993-07-01

This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations.

  3. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    Energy Technology Data Exchange (ETDEWEB)

    Baeverstam, U; Davis, P; Garcia-Olivares, A; Henrich, E; Koch, J

    1993-07-01

This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations.

  4. Predictive validation of an influenza spread model.

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

    Full Text Available BACKGROUND: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. METHODS AND FINDINGS: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999. Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type. Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability and depended on the method of forecasting-static or dynamic. CONCLUSIONS: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve

  5. Predictive Validation of an Influenza Spread Model

    Science.gov (United States)

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability, depending on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive
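
Forecast deviation of the kind estimated in these two records can be quantified with simple curve metrics; a sketch with synthetic epidemic curves (not the study's data):

```python
import numpy as np

def forecast_errors(observed, predicted):
    """Simple deviation metrics between an observed weekly epidemic curve and
    a model forecast: peak-week offset, relative error in peak intensity,
    and normalized RMSE over the season."""
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    peak_shift = int(predicted.argmax()) - int(observed.argmax())
    intensity_err = (predicted.max() - observed.max()) / observed.max()
    nrmse = np.sqrt(np.mean((predicted - observed) ** 2)) / observed.max()
    return peak_shift, intensity_err, nrmse

weeks = np.arange(20)
observed = 100 * np.exp(-0.5 * ((weeks - 9) / 2.5) ** 2)    # hypothetical season
predicted = 90 * np.exp(-0.5 * ((weeks - 10) / 3.0) ** 2)   # hypothetical forecast
shift, ierr, nrmse = forecast_errors(observed, predicted)
print(f"peak shift = {shift} weeks, intensity error = {ierr:+.1%}, NRMSE = {nrmse:.3f}")
```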

  6. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the ... This model generally states the numerical value of knowledge .... procedures found in the field of software engineering should be ...

  7. Assessment calculation of MARS-LMR using EBR-II SHRT-45R

    Energy Technology Data Exchange (ETDEWEB)

    Choi, C.; Ha, K.S.

    2016-10-15

Highlights: • Neutronic and thermal-hydraulic behavior predicted by MARS-LMR is validated with EBR-II SHRT-45R test data. • The ANS-94 decay heat model gives a better prediction of the fission power. • The core power is well predicted by reactivity feedback during the initial transient; however, the predicted power after approximately 200 s is over-estimated. Study of the EBR-II reactivity feedback model is necessary for a better calculation of the power. • Heat transfer between subassemblies is the most important parameter, especially for a low-flow, low-power subassembly such as a non-fueled subassembly. - Abstract: KAERI has designed a prototype Gen-IV SFR (PGSFR) with metallic fuel. The safety analysis code for the PGSFR, MARS-LMR, is based on the MARS code and supplemented with various liquid-metal-related features, including sodium properties, heat transfer, pressure drop, and reactivity feedback models. In order to validate the newly developed MARS-LMR, KAERI has joined the International Atomic Energy Agency (IAEA) coordinated research project (CRP) on “Benchmark Analysis of an EBR-II Shutdown Heat Removal Test (SHRT)”. Argonne National Laboratory (ANL) has technically supported and participated in this program. One of the benchmark analysis tests is SHRT-45R, an unprotected loss-of-flow test in EBR-II, so sodium natural circulation and reactivity feedbacks are the major phenomena of interest. A benchmark analysis was conducted using MARS-LMR with original input data provided by ANL. MARS-LMR predicts well the core flow and power change caused by reactivity feedbacks in the core. Except for the results of XX10, the temperature and flow in XX09 agreed well with the experiments. Moreover, sensitivity tests were carried out for the decay heat model, reactivity feedback model, inter-subassembly heat transfer, internal heat structures and so on, to evaluate their sensitivity and get a better prediction. The decay heat model of ANS-94 shows

  8. Assessment calculation of MARS-LMR using EBR-II SHRT-45R

    International Nuclear Information System (INIS)

    Choi, C.; Ha, K.S.

    2016-01-01

Highlights: • Neutronic and thermal-hydraulic behavior predicted by MARS-LMR is validated with EBR-II SHRT-45R test data. • The ANS-94 decay heat model gives a better prediction of the fission power. • The core power is well predicted by reactivity feedback during the initial transient; however, the predicted power after approximately 200 s is over-estimated. Study of the EBR-II reactivity feedback model is necessary for a better calculation of the power. • Heat transfer between subassemblies is the most important parameter, especially for a low-flow, low-power subassembly such as a non-fueled subassembly. - Abstract: KAERI has designed a prototype Gen-IV SFR (PGSFR) with metallic fuel. The safety analysis code for the PGSFR, MARS-LMR, is based on the MARS code and supplemented with various liquid-metal-related features, including sodium properties, heat transfer, pressure drop, and reactivity feedback models. In order to validate the newly developed MARS-LMR, KAERI has joined the International Atomic Energy Agency (IAEA) coordinated research project (CRP) on “Benchmark Analysis of an EBR-II Shutdown Heat Removal Test (SHRT)”. Argonne National Laboratory (ANL) has technically supported and participated in this program. One of the benchmark analysis tests is SHRT-45R, an unprotected loss-of-flow test in EBR-II, so sodium natural circulation and reactivity feedbacks are the major phenomena of interest. A benchmark analysis was conducted using MARS-LMR with original input data provided by ANL. MARS-LMR predicts well the core flow and power change caused by reactivity feedbacks in the core. Except for the results of XX10, the temperature and flow in XX09 agreed well with the experiments. Moreover, sensitivity tests were carried out for the decay heat model, reactivity feedback model, inter-subassembly heat transfer, internal heat structures and so on, to evaluate their sensitivity and get a better prediction. The decay heat model of ANS-94 shows

  9. Measurement and Model Validation of Nanofluid Specific Heat Capacity with Differential Scanning Calorimetry

    Directory of Open Access Journals (Sweden)

    Harry O'Hanley

    2012-01-01

Nanofluids are being considered for heat transfer applications; therefore it is important to know their thermophysical properties accurately. In this paper we focused on nanofluid specific heat capacity. Currently, there exist two models to predict a nanofluid specific heat capacity as a function of nanoparticle concentration and material. Model I is a straight volume-weighted average; Model II is based on the assumption of thermal equilibrium between the particles and the surrounding fluid. These two models give significantly different predictions for a given system. Using differential scanning calorimetry (DSC), a robust experimental methodology for measuring the heat capacity of fluids, the specific heat capacities of water-based silica, alumina, and copper oxide nanofluids were measured. Nanoparticle concentrations were varied between 5 wt% and 50 wt%. Test results were found to be in excellent agreement with Model II, while the predictions of Model I deviated very significantly from the data. Therefore, Model II is recommended for nanofluids.
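
The two competing specific-heat models have simple closed forms. The sketch below writes them in terms of particle volume fraction with nominal literature property values (the study reports mass fractions, so this is a schematic comparison only):

```python
def cp_model_one(phi, cp_p, cp_f):
    """Model I: straight volume-weighted average of specific heats."""
    return phi * cp_p + (1.0 - phi) * cp_f

def cp_model_two(phi, cp_p, cp_f, rho_p, rho_f):
    """Model II: particle-fluid thermal equilibrium, i.e. a mass-weighted
    average of the volumetric heat capacities."""
    return ((phi * rho_p * cp_p + (1.0 - phi) * rho_f * cp_f)
            / (phi * rho_p + (1.0 - phi) * rho_f))

# Water-based alumina nanofluid at 10% particle volume fraction
# (cp in J/kg-K, rho in kg/m3; nominal property values)
phi, cp_p, cp_f, rho_p, rho_f = 0.10, 880.0, 4186.0, 3970.0, 998.0
print(f"Model I:  {cp_model_one(phi, cp_p, cp_f):.0f} J/kg-K")
print(f"Model II: {cp_model_two(phi, cp_p, cp_f, rho_p, rho_f):.0f} J/kg-K")
```

Because the dense particles carry far more mass per unit volume than the water they displace, Model II sits several hundred J/kg-K below Model I at this loading, illustrating how strongly the two averaging choices can disagree.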

  10. Bayesian Calibration, Validation and Uncertainty Quantification for Predictive Modelling of Tumour Growth: A Tutorial.

    Science.gov (United States)

    Collis, Joe; Connor, Anthony J; Paczkowski, Marcin; Kannan, Pavitra; Pitt-Francis, Joe; Byrne, Helen M; Hubbard, Matthew E

    2017-04-01

    In this work, we present a pedagogical tumour growth example, in which we apply calibration and validation techniques to an uncertain, Gompertzian model of tumour spheroid growth. The key contribution of this article is the discussion and application of these methods (that are not commonly employed in the field of cancer modelling) in the context of a simple model, whose deterministic analogue is widely known within the community. In the course of the example, we calibrate the model against experimental data that are subject to measurement errors, and then validate the resulting uncertain model predictions. We then analyse the sensitivity of the model predictions to the underlying measurement model. Finally, we propose an elementary learning approach for tuning a threshold parameter in the validation procedure in order to maximize predictive accuracy of our validated model.
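
A stripped-down version of the calibrate-then-validate loop for a Gompertzian growth model, with synthetic noisy data and least-squares fitting in place of the paper's Bayesian machinery:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, v0, a, b):
    """Gompertz growth in a common parameterization:
    V(t) = v0 * exp((a/b) * (1 - exp(-b*t)))."""
    return v0 * np.exp((a / b) * (1.0 - np.exp(-b * t)))

rng = np.random.default_rng(3)
t = np.linspace(0.0, 20.0, 21)
true = gompertz(t, 1.0, 0.6, 0.15)
volume = true * rng.normal(1.0, 0.05, t.size)   # multiplicative measurement noise

# Calibrate on the first half of the series, validate on the held-out half
n_cal = 11
popt, _ = curve_fit(gompertz, t[:n_cal], volume[:n_cal], p0=[1.0, 0.5, 0.1])
pred = gompertz(t[n_cal:], *popt)
rel_err = np.abs(pred - volume[n_cal:]) / volume[n_cal:]
print(f"fitted (v0, a, b) = {np.round(popt, 3)}, "
      f"max relative validation error = {rel_err.max():.1%}")
```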

  11. Validation of nuclear models used in space radiation shielding applications

    International Nuclear Information System (INIS)

    Norman, Ryan B.; Blattnig, Steve R.

    2013-01-01

    A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.
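
One way to realize an interval-based validation metric of the kind described (the paper's exact definitions may differ) is to score each prediction by its relative distance to the experimental uncertainty interval, then aggregate by sum or median:

```python
import numpy as np

def interval_miss(model, lower, upper):
    """Relative distance from a model value to the experimental uncertainty
    interval [lower, upper]; zero when the prediction falls inside it."""
    below, above = lower - model, model - upper
    return np.maximum(0.0, np.maximum(below, above)) / (0.5 * (lower + upper))

# Hypothetical cross sections (mb): model predictions vs measurement intervals
model = np.array([102.0, 88.0, 130.0, 61.0])
lower = np.array([95.0, 90.0, 120.0, 55.0])
upper = np.array([105.0, 100.0, 128.0, 65.0])

miss = interval_miss(model, lower, upper)
print(f"cumulative metric: {miss.sum():.4f}")       # overall-accuracy view
print(f"median metric:     {np.median(miss):.4f}")  # typical-case view
```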

  12. Diet History Questionnaire II FAQs | EGRP/DCCPS/NCI/NIH

    Science.gov (United States)

    Answers to general questions about the Diet History Questionnaire II (DHQ II), as well as those related to DHQ II administration, validation, scanning, nutrient estimates, calculations, DHQ II modification, data quality, and more.

  13. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    Van Emmerik, T.H.M.; Mulder, G.; Eilander, D.; Piet, M.; Savenije, H.H.G.

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  14. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  15. Validation of Inhibition Effect in the Cellulose Hydrolysis: a Dynamic Modelling Approach

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Tsai, Chien-Tai; Meyer, Anne S.

    2011-01-01

    Enzymatic hydrolysis is one of the main steps in the processing of bioethanol from lignocellulosic raw materials. However, complete understanding of the underlying phenomena is still under development. Hence, this study has focused on validation of the inhibition effects in the cellulosic biomass...... for parameter estimation (calibration) and validation purposes. The model predictions using calibrated parameters have shown good agreement with the validation data sets, which provides credibility to the model structure and the parameter values....

  16. Standardization of radioimmunoassay for dosage of angiotensin II (ang-II) and its methodological evaluation; Padronizacao do radioimunoensaio para dosagem de angiotensina II (ang-II) e sua validacao metodologica

    Energy Technology Data Exchange (ETDEWEB)

    Mantovani, Milene; Mecawi, Andre S.; Elias, Lucila L.K.; Antunes-Rodrigues, Jose, E-mail: llelias@fmrp.usp.b, E-mail: antunes@fmrp.usp.b [Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Medicina

    2011-10-26

This paper standardizes the radioimmunoassay (RIA) for measurement of ANG-II in rats under the experimental conditions of hypertonic saline (2%), treatment with losartan (an ANG-II antagonist), water deprivation, and acute hemorrhage (25%). Plasma ANG-II was then extracted for RIA measurement; the assay sensitivity was 1.95 pg/mL, with a detection range of 1.95 to 1000 pg/mL. Treatment with saline reduced the ANG-II concentration, while administration of losartan, water deprivation and hemorrhage increased the values relative to the control group. These results indicate variations in the plasma ANG-II concentration according to the experimental protocols, validating the method for evaluation of renin-angiotensin system activity.
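
Competitive RIA readouts are conventionally reduced to concentration through a four-parameter logistic standard curve; a sketch with hypothetical ANG-II standards spanning the stated 1.95-1000 pg/mL range (the binding values are invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, top, bottom, ec50, slope):
    """Four-parameter logistic, the usual form for competitive RIA curves
    (percent bound vs. analyte concentration)."""
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** slope)

# Hypothetical ANG-II standards (pg/mL) and measured B/B0 (%)
conc = np.array([1.95, 7.8, 31.25, 125.0, 500.0, 1000.0])
bound = np.array([97.0, 88.0, 64.0, 32.0, 12.0, 7.0])

popt, _ = curve_fit(four_pl, conc, bound, p0=[100.0, 5.0, 60.0, 1.0])

def unknown_conc(b, top, bottom, ec50, slope):
    """Invert the fitted curve to read a sample concentration from its signal."""
    return ec50 * ((top - bottom) / (b - bottom) - 1.0) ** (1.0 / slope)

print(f"sample at B/B0 = 50%: {unknown_conc(50.0, *popt):.1f} pg/mL")
```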

  17. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  18. Thermal hydraulic model validation for HOR mixed core fuel management

    International Nuclear Information System (INIS)

    Gibcus, H.P.M.; Vries, J.W. de; Leege, P.F.A. de

    1997-01-01

A thermal-hydraulic core management model has been developed for the Hoger Onderwijsreactor (HOR), a 2 MW pool-type university research reactor. The model was adopted for safety analysis purposes in the framework of HEU/LEU core conversion studies. It is applied in the thermal-hydraulic computer code SHORT (Steady-state HOR Thermal-hydraulics) which is presently in use in designing core configurations and for in-core fuel management. An elaborate measurement program was performed for establishing the core hydraulic characteristics for a variety of conditions. The hydraulic data were obtained with a dummy fuel element with special equipment allowing, among other things, direct measurement of the true core flow rate. Using these data the thermal-hydraulic model was validated experimentally. The model, experimental tests, and model validation are discussed. (author)

  19. PowerShades II. Optimisation and validation of highly transparent photovoltaic. Final report

    Energy Technology Data Exchange (ETDEWEB)

    2010-07-15

The objective of the project is continued development and validation of a novel Danish photovoltaic product with the working title "PowerShade". The PowerShade insulating glazing unit (IGU) is a combination of a strong solar shading device and a power producing photovoltaic coating. The core technology in the PowerShade IGU is a thin film silicon photovoltaic generator applied to a micro structured substrate. The geometry of the substrate provides the unique combination of properties that characterizes the PowerShade module - strong progressive shading, high transparency, and higher electrical output than other semitransparent photovoltaic products with similar transparencies. The project activities fall in two categories, namely development of the processing/product and validation of the product properties. The development part of the project is focussed on increasing the efficiency of the photovoltaic generator by changing from a single-stack type cell to a tandem-stack type cell. The inclusion of PowerShade cells in insulating glazing (IG) units is also addressed in this project. The validation part of the project aims at validation of stability, thermal and optical properties as well as validation of the electrical yield of the product. The validation of thermal and optical properties has been done using full size modules installed in a test facility built during the 2006-08 "PowerShades" project. The achieved results will be vital in the coming realisation of a commercial product. Initial processing steps have been automated, and more efficient tandem-type solar cells have been developed. A damp heat test of an IGU has been carried out without any degradation of the solar cell. The PowerShade module assembly concept has been further developed and discussed with different automation equipment vendors and a pick-and-place tool developed. PowerShade's influence on the indoor climate has been modelled and verified by

  20. Validated TRNSYS Model for Solar Assisted Space Heating System

    International Nuclear Information System (INIS)

    Abdalla, Nedal

    2014-01-01

The present study involves a validated TRNSYS model for a solar assisted space heating system as applied to a residential building in Jordan, using the new detailed radiation models of TRNSYS 17.1 and the geometric building model Trnsys3d for the Google SketchUp 3D drawing program. The annual heating load for a building (Solar House) which is located at the Royal Scientific Society (RSS) in Jordan is estimated under the climatological conditions of Amman. The aim of this paper is to compare the measured thermal performance of the Solar House with that modeled using TRNSYS. The results showed that the annual measured space heating load for the building was 6,188 kWh while the heating load for the modeled building was 6,391 kWh. Moreover, the measured solar fraction for the solar system was 50% while the modeled solar fraction was 55%. A comparison of modeled and measured data resulted in percentage mean absolute errors for solar energy for space heating, auxiliary heating and solar fraction of 13%, 7% and 10%, respectively. The validated model will be useful for long-term performance simulation under different weather and operating conditions. (author)

  1. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: • The paper discusses the validation of creep rupture models derived from statistical analysis. • It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. • The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. • The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).
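
A schematic of the proposed parametric graphical validation, plotting a candidate model against both rupture data and rupture gradient data; all values are synthetic and the quadratic fit is only a stand-in for a real creep-rupture model:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Hypothetical rupture data at one temperature: log10 time (h) vs stress (MPa)
log_t = np.linspace(2.0, 5.0, 13)
stress = 400.0 * np.exp(-0.35 * (log_t - 2.0)) * rng.normal(1.0, 0.02, log_t.size)

# Candidate model: a quadratic fit in log time (stand-in for the model under test)
model = np.exp(np.polyval(np.polyfit(log_t, np.log(stress), 2), log_t))

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9.0, 3.5))
ax1.semilogy(log_t, stress, "o", label="rupture data")
ax1.semilogy(log_t, model, "-", label="model")
ax1.set(xlabel="log10 rupture time [h]", ylabel="stress [MPa]")
ax1.legend()

# Second parametric view: the rupture gradient, which magnifies divergence
# that the stress plot alone can hide
ax2.plot(log_t[:-1], np.diff(np.log(stress)) / np.diff(log_t), "o",
         label="data gradient")
ax2.plot(log_t, np.gradient(np.log(model), log_t), "-", label="model gradient")
ax2.set(xlabel="log10 rupture time [h]", ylabel="d ln(stress) / d(log10 t)")
ax2.legend()
fig.tight_layout()
plt.show()
```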

  2. Validation of a two-fluid model used for the simulation of dense fluidized beds; Validation d'un modele a deux fluides applique a la simulation des lits fluidises denses

    Energy Technology Data Exchange (ETDEWEB)

    Boelle, A.

    1997-02-17

A two-fluid model applied to the simulation of gas-solid dense fluidized beds is validated on the micro scale and on the macro scale. Phase coupling is carried out in the momentum and energy transport equations of both phases. The modeling is built on the kinetic theory of granular media in which the gas action has been taken into account in order to get correct expressions of transport coefficients. A description of hydrodynamic interactions between particles in high Stokes number flow is also incorporated in the model. The micro-scale validation uses Lagrangian numerical simulations viewed as numerical experiments. The first validation case refers to a gas-particle simple shear flow. It allows validation of the competition between two dissipation mechanisms: drag and particle collisions. The second validation case is concerned with sedimenting particles in high Stokes number flow. It allows validation of our approach to hydrodynamic interactions. This last case led us to develop an original Lagrangian simulation with a two-way coupling between the fluid and the particles. The macro-scale validation uses the results of Eulerian simulations of a dense fluidized bed. Bed height, particle circulation and spontaneously created bubble characteristics are studied and compared to experimental measurements, with respect to both physical and numerical parameters. (author) 159 refs.

  3. Basic Modelling principles and Validation of Software for Prediction of Collision Damage

    DEFF Research Database (Denmark)

    Simonsen, Bo Cerup

    2000-01-01

This report describes basic modelling principles, the theoretical background and validation examples for the collision damage prediction module in the ISESO stand-alone software.

  4. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, a thermal hydraulic code for the safety analysis of the advanced integral reactor. Appropriate work to validate the applicability of the thermal hydraulic models within the code is required. Among the models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, calculated results to validate the gap conductance model are demonstrated by comparing with the results of the MARS code for the test case.
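
A toy conduction-only gap-conductance relation shows why gap width dominates the fuel-clad temperature drop; real fuel-performance models add radiation and contact terms omitted here, and all values below are illustrative rather than TASS/SMR-S parameters:

```python
def gap_conductance(gap_m, k_gas=0.25, jump_m=5.0e-6):
    """Simplified gap conductance h = k / (d + g), where k is the gas-mixture
    thermal conductivity (W/m-K) and g a temperature-jump distance (m)."""
    return k_gas / (gap_m + jump_m)

# How the fuel-clad temperature drop responds as the gap closes with power
heat_flux = 1.0e6  # W/m2, illustrative
for gap_um in (80.0, 40.0, 10.0):
    h = gap_conductance(gap_um * 1.0e-6)
    print(f"gap = {gap_um:5.1f} um: h = {h:8.0f} W/m2-K, dT = {heat_flux / h:6.1f} K")
```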

  5. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO2 fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code have been described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A large range of experimental tests, with various temperatures and conditions for the fuel surrounding atmosphere (oxidising and reducing), is thus simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory comparison between ELSA calculations and experimental measurements demonstrates the efficiency of the analytical models to describe fission product release in severe accident conditions

  6. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately for use in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to model flow and radionuclide transport in the near field and far field of a deep repository, including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far-field code FARF31 is kept relatively simple and calculates transport using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among its advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spreads in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual-porosity continuum approach, where it is assumed that the rock can be divided into two distinct domains with different types of porosity
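
    The stream-tube concept above reduces, in its simplest limit (a single decaying species, no matrix diffusion or chain decay), to one-dimensional advection-dispersion with first-order decay, for which a pulse injection has a closed-form solution. A minimal sketch of that limiting case, with all parameter values hypothetical:

```python
import numpy as np

def stream_tube_pulse(x, t, v, D, lam, m0=1.0):
    """Concentration of an instantaneous pulse in a 1D stream tube:
    advection (v), longitudinal dispersion (D) and first-order decay (lam)."""
    gauss = np.exp(-(x - v * t) ** 2 / (4.0 * D * t)) / np.sqrt(4.0 * np.pi * D * t)
    return m0 * gauss * np.exp(-lam * t)

# Hypothetical values: profile 10 years after injection of a 30-year nuclide.
year = 3.156e7                                   # seconds per year
x = np.linspace(0.0, 500.0, 6)                   # distance along the tube [m]
print(stream_tube_pulse(x, 10 * year, v=1e-6, D=1e-5, lam=np.log(2) / (30 * year)))
```

    FARF31 itself layers matrix diffusion and chain decay on top of this advective-dispersive backbone.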

  7. Validation Assessment of a Glass-to-Metal Seal Finite-Element Model

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, Ryan Dale [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Buchheit, Thomas E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Emery, John M [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stavig, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Newton, Clay S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brown, Arthur [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Sealing glasses are ubiquitous in high-pressure and high-temperature engineering applications, such as hermetic feed-through electrical connectors. A common connector technology is the glass-to-metal seal, in which a metal shell compresses a sealing glass to create a hermetic seal. Though finite-element analysis has been used to understand and design glass-to-metal seals for many years, there has been little validation of these models. An indentation technique was employed to measure the residual stress on the surface of a simple glass-to-metal seal. Recently developed rate-dependent material models of both Schott 8061 and 304L VAR stainless steel have been applied to a finite-element model of the simple glass-to-metal seal. Model predictions of residual stress based on the evolution of the material models are shown. These model predictions are compared to measured data. The validity of the finite-element predictions is discussed. It is shown that the finite-element model of the glass-to-metal seal accurately predicts the mean residual stress in the glass near the glass-to-metal interface and is valid for this quantity of interest.

  8. Validation of an employee satisfaction model: A structural equation model approach

    OpenAIRE

    Ophillia Ledimo; Nico Martins

    2015-01-01

    The purpose of this study was to validate an employee satisfaction model and to determine the relationships between the different dimensions of the concept, using the structural equation modelling approach (SEM). A cross-sectional quantitative survey design was used to collect data from a random sample of (n=759) permanent employees of a parastatal organisation. Data was collected using the Employee Satisfaction Survey (ESS) to measure employee satisfaction dimensions. Following the steps of ...

  9. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden); Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)

    2001-12-01

    The present document addresses validation of the stochastic continuum model HYDRASTAR, designed for Monte Carlo simulations of groundwater flow in fractured rocks. Here, validation is defined as a process to demonstrate that a model concept is fit for its purpose. Preferably, validation is carried out by comparison of model predictions with independent field observations and experimental measurements. In addition, other sources can be used to confirm that the model concept gives acceptable results. One method is to compare results with those achieved using other model concepts for the same set of input data. Another method is to compare model results with analytical solutions. The model concept HYDRASTAR has been used in several studies, including performance assessments of hypothetical repositories for spent nuclear fuel. In the performance assessments, the main tasks for HYDRASTAR have been to calculate groundwater travel time distributions, repository flux distributions, path lines and their exit locations. The results have then been used by other model concepts to calculate the near-field release and far-field transport. The aim and framework of the validation process include describing the applicability of the model concept for its purpose in order to build confidence in the concept. Preferably, this is done by comparison of simulation results with the corresponding field experiments or field measurements. Here, two comparisons with experimental results are reported; in both cases the agreement was reasonably fair. In the broader and more general context of the validation process, HYDRASTAR results have been compared with other models and analytical solutions. Commonly, the approximate calculations agree well with the medians of the model ensemble results. Additional indications that HYDRASTAR is suitable for its purpose were obtained from the comparisons with results from other model concepts. Several verification studies have been made for

  10. Evaluation factors for verification and validation of low-level waste disposal site models

    International Nuclear Information System (INIS)

    Moran, M.S.; Mezga, L.J.

    1982-01-01

    The purpose of this paper is to identify general evaluation factors to be used to verify and validate LLW disposal site performance models in order to assess their site-specific applicability and to determine their accuracy and sensitivity. It is intended that the information contained in this paper be employed by model users involved with LLW site performance model verification and validation. It should not be construed as providing protocols, but rather as providing a framework for the preparation of specific protocols or procedures. A brief description of each evaluation factor is provided. The factors have been categorized according to recommended use during either the model verification or the model validation process. The general responsibilities of the developer and user are provided. In many cases it is difficult to separate the responsibilities of the developer and user, but the user is ultimately accountable for both verification and validation processes. 4 refs

  11. SDSS-II: Determination of shape and color parameter coefficients for SALT-II fit model

    Energy Technology Data Exchange (ETDEWEB)

    Dojcsak, L.; Marriner, J.; /Fermilab

    2010-08-01

    In this study we look at the SALT-II model of Type Ia supernova analysis, which determines the distance moduli based on the known absolute standard-candle magnitude of Type Ia supernovae. We examine the determination of the shape and color parameter coefficients, α and β respectively, in the SALT-II model with the intrinsic error that is determined from the data. Using the SNANA software package provided for the analysis of Type Ia supernovae, we use a standard Monte Carlo simulation to generate data with known parameters as a tool for analyzing trends in the model under certain assumptions about the intrinsic error. In order to find the best standard-candle model, we try to minimize the residuals on the Hubble diagram by calculating the correct shape and color parameter coefficients. We can estimate the magnitude of the intrinsic errors required to obtain results with χ²/degree of freedom = 1. We can use the simulation to estimate the amount of color smearing indicated by the data for our model. We find that the color smearing model works as a general estimate of the color smearing, and that we are able to use the RMS distribution in the variables as one method of estimating the correct intrinsic errors needed by the data to obtain the correct results for α and β. We then apply the resultant intrinsic error matrix to the real data and show our results.
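
    The χ²/dof = 1 tuning step mentioned above can be illustrated with a minimal sketch: assume a single scalar intrinsic scatter added in quadrature to Gaussian measurement errors, and solve for it by bisection (an illustration only, not the SNANA implementation):

```python
import numpy as np

def fit_intrinsic_scatter(residuals, sigma_meas, n_params=2):
    """Find sigma_int such that chi^2 per degree of freedom equals one."""
    dof = len(residuals) - n_params

    def chi2_red(sig_int):
        return np.sum(residuals**2 / (sigma_meas**2 + sig_int**2)) / dof

    lo, hi = 0.0, 10.0 * residuals.std()   # bracket: chi2_red decreases with sig_int
    if chi2_red(lo) <= 1.0:
        return 0.0                         # measurement errors already explain the scatter
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if chi2_red(mid) > 1.0 else (lo, mid)
    return 0.5 * (lo + hi)

rng = np.random.default_rng(0)
sig = np.full(300, 0.10)                    # hypothetical measurement errors [mag]
res = rng.normal(0.0, np.hypot(sig, 0.12))  # true intrinsic scatter of 0.12 mag
print(fit_intrinsic_scatter(res, sig))      # should recover roughly 0.12
```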

  12. Validation of elk resource selection models with spatially independent data

    Science.gov (United States)

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  13. Japanese proposal and contribution for IAEA/CRP on RIPL-II

    International Nuclear Information System (INIS)

    Fukahori, T.

    1999-01-01

    The Japanese Nuclear Data Committee (JNDC) organized the Evaluating Calculation Support System Working Group (ECSS-WG) to investigate subjects related to the RIPL-II project. The activities of this working group in validating model parameters and in developing the integrated nuclear data evaluation system (INDES) and its parameter database are briefly presented in this report

  14. Neutrinoless double beta decay in type I+II seesaw models

    Energy Technology Data Exchange (ETDEWEB)

    Borah, Debasish [Department of Physics, Tezpur University,Tezpur-784028 (India); Dasgupta, Arnab [Institute of Physics, Sachivalaya Marg,Bhubaneshwar-751005 (India)

    2015-11-30

    We study neutrinoless double beta decay in left-right symmetric extension of the standard model with type I and type II seesaw origin of neutrino masses. Due to the enhanced gauge symmetry as well as extended scalar sector, there are several new physics sources of neutrinoless double beta decay in this model. Ignoring the left-right gauge boson mixing and heavy-light neutrino mixing, we first compute the contributions to neutrinoless double beta decay for type I and type II dominant seesaw separately and compare with the standard light neutrino contributions. We then repeat the exercise by considering the presence of both type I and type II seesaw, having non-negligible contributions to light neutrino masses and show the difference in results from individual seesaw cases. Assuming the new gauge bosons and scalars to be around a TeV, we constrain different parameters of the model including both heavy and light neutrino masses from the requirement of keeping the new physics contribution to neutrinoless double beta decay amplitude below the upper limit set by the GERDA experiment and also satisfying bounds from lepton flavor violation, cosmology and colliders.

  15. User's Guide To CHEAP0 II-Economic Analysis of Stand Prognosis Model Outputs

    Science.gov (United States)

    Joseph E. Horn; E. Lee Medema; Ervin G. Schuster

    1986-01-01

    CHEAP0 II provides supplemental economic analysis capability for users of version 5.1 of the Stand Prognosis Model, including recent regeneration and insect outbreak extensions. Although patterned after the old CHEAP0 model, CHEAP0 II has more features and analytic capabilities, especially for analysis of existing and uneven-aged stands....

  16. Assessment of perioperative mortality risk in patients with infective endocarditis undergoing cardiac surgery: performance of the EuroSCORE I and II logistic models.

    Science.gov (United States)

    Madeira, Sérgio; Rodrigues, Ricardo; Tralhão, António; Santos, Miguel; Almeida, Carla; Marques, Marta; Ferreira, Jorge; Raposo, Luís; Neves, José; Mendes, Miguel

    2016-02-01

    The European System for Cardiac Operative Risk Evaluation (EuroSCORE) has been established as a tool for assisting decision-making in surgical patients and as a benchmark for quality assessment. Infective endocarditis often requires surgical treatment and is associated with high mortality. This study was undertaken to (i) validate both versions of the EuroSCORE, the older logistic EuroSCORE I and the recently developed EuroSCORE II, and to compare their performances; and (ii) identify predictors other than those included in the EuroSCORE models that might further improve their performance. We retrospectively studied 128 patients from a single-centre registry who underwent heart surgery for active infective endocarditis between January 2007 and November 2014. Binary logistic regression was used to find independent predictors of mortality and to create a new prediction model. Discrimination and calibration of the models were assessed by receiver-operating characteristic curve analysis, calibration curves and the Hosmer-Lemeshow test. The observed perioperative mortality was 16.4% (n = 21). The median EuroSCORE I and EuroSCORE II were 13.9% (interquartile range [IQR], 7.0-35.0) and 6.6% (IQR, 3.5-18.2), respectively. Discriminative power was numerically higher for EuroSCORE II {area under the curve (AUC) of 0.83 [95% confidence interval (CI), 0.75-0.91]} than for EuroSCORE I [0.75 (95% CI, 0.66-0.85), P = 0.09]. The Hosmer-Lemeshow test showed good calibration for EuroSCORE II (P = 0.08) but not for EuroSCORE I (P = 0.04). EuroSCORE I tended to over-predict and EuroSCORE II to under-predict mortality. Among the variables known to be associated with greater infective endocarditis severity, only prosthetic valve infective endocarditis remained an independent predictor of mortality [odds ratio (OR) 6.6; 95% CI, 1.1-39.5; P = 0.04]. The new model including the EuroSCORE II variables and variables known to be associated with greater infective endocarditis severity showed an AUC of 0
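
    For reference, the Hosmer-Lemeshow statistic used above groups patients by predicted risk (typically deciles) and compares observed with expected event counts; small p-values indicate poor calibration. A generic sketch, not taken from the study:

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow goodness-of-fit test for a binary risk model.
    y: 0/1 outcomes; p: predicted probabilities. df = groups - 2."""
    y, p = np.asarray(y), np.asarray(p)
    order = np.argsort(p)                        # sort patients by predicted risk
    y, p = y[order], p[order]
    stat = 0.0
    for idx in np.array_split(np.arange(len(p)), groups):
        obs, exp, n = y[idx].sum(), p[idx].sum(), len(idx)
        stat += (obs - exp) ** 2 / (exp * (1.0 - exp / n) + 1e-12)
    return stat, chi2.sf(stat, groups - 2)
```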

  17. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  18. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the issue of application of security policy models, we are concerned with the implementation of a certain policy, taking into consideration the standards required, through attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy in an information technology environment is a condition sine qua non for the strategic management process of any organization. Regarding this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer. The Chief Security Officer, in doing so, strives to keep technologies up to date in order to meet all-inclusive business continuity planning policies. Therefore, for such a policy to be effective, it has to be entirely embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo City. This sampling was based on the representativeness of such industries, particularly with regard to each one's implementation of information technology in the region. The current study concludes by presenting evidence of the discriminant validity of four key dimensions of the security policy, namely: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. Analysis of the Cronbach's alpha structure of these security items shows not only that the capacity of those industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.

  19. Validation and selection of ODE based systems biology models: how to arrive at more reliable decisions.

    Science.gov (United States)

    Hasdemir, Dicle; Hoefsloot, Huub C J; Smilde, Age K

    2015-07-08

    Most ordinary differential equation (ODE) based modeling studies in systems biology involve a hold-out validation step for model validation. In this framework a pre-determined part of the data is used as validation data and, therefore it is not used for estimating the parameters of the model. The model is assumed to be validated if the model predictions on the validation dataset show good agreement with the data. Model selection between alternative model structures can also be performed in the same setting, based on the predictive power of the model structures on the validation dataset. However, drawbacks associated with this approach are usually under-estimated. We have carried out simulations by using a recently published High Osmolarity Glycerol (HOG) pathway from S.cerevisiae to demonstrate these drawbacks. We have shown that it is very important how the data is partitioned and which part of the data is used for validation purposes. The hold-out validation strategy leads to biased conclusions, since it can lead to different validation and selection decisions when different partitioning schemes are used. Furthermore, finding sensible partitioning schemes that would lead to reliable decisions are heavily dependent on the biology and unknown model parameters which turns the problem into a paradox. This brings the need for alternative validation approaches that offer flexible partitioning of the data. For this purpose, we have introduced a stratified random cross-validation (SRCV) approach that successfully overcomes these limitations. SRCV leads to more stable decisions for both validation and selection which are not biased by underlying biological phenomena. Furthermore, it is less dependent on the specific noise realization in the data. Therefore, it proves to be a promising alternative to the standard hold-out validation strategy.
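
    A generic sketch of the stratified random cross-validation idea, assigning data points to folds so that every fold samples each stratum; the stratum labels and fold count below are hypothetical, and the authors' SRCV may differ in detail:

```python
import numpy as np

def stratified_random_folds(strata, n_folds=5, seed=0):
    """Assign each data point to a fold so that every fold samples all strata.
    strata: array of stratum labels (e.g., segments of a measured time course).
    Returns an array of fold indices, one per data point."""
    rng = np.random.default_rng(seed)
    folds = np.empty(len(strata), dtype=int)
    for s in np.unique(strata):
        idx = np.flatnonzero(strata == s)
        rng.shuffle(idx)
        folds[idx] = np.arange(len(idx)) % n_folds   # spread the stratum across folds
    return folds

# Usage: hold out fold k as validation data, fit the ODE model on the rest.
strata = np.repeat(np.arange(4), 10)   # e.g., four phases of a HOG-pathway time course
print(np.bincount(stratified_random_folds(strata)))
```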

  20. Understanding variability of the Southern Ocean overturning circulation in CORE-II models

    Science.gov (United States)

    Downes, S. M.; Spence, P.; Hogg, A. M.

    2018-03-01

    The current generation of climate models exhibits a large spread in the steady-state and projected Southern Ocean upper and lower overturning circulation, with mechanisms for deep ocean variability remaining less well understood. Here, common Southern Ocean metrics in twelve models from the Coordinated Ocean-ice Reference Experiment Phase II (CORE-II) are assessed over a 60-year period. Specifically, stratification, surface buoyancy fluxes, and eddies are linked to the magnitude of the strengthening trend in the upper overturning circulation, and a decreasing trend in the lower overturning circulation across the CORE-II models. The models evolve similarly in the upper 1 km and the deep ocean, with an almost equivalent poleward intensification trend in the Southern Hemisphere westerly winds. However, the models differ substantially in their eddy parameterisation and surface buoyancy fluxes. In general, models with a larger heat-driven water mass transformation where deep waters upwell at the surface (∼55°S) transport warmer waters into intermediate depths, thus weakening the stratification in the upper 2 km. Models with a weak eddy-induced overturning and a warm bias in the intermediate waters are more likely to exhibit larger increases in the upper overturning circulation, and more significant weakening of the lower overturning circulation. We find the opposite holds for a cool model bias in intermediate depths, combined with a more complex 3D eddy parameterisation that acts to reduce isopycnal slope. In summary, the Southern Ocean overturning circulation decadal trends in the coarse resolution CORE-II models are governed by biases in surface buoyancy fluxes and the ocean density field, and the configuration of the eddy parameterisation.

  1. The application of the PARCS neutronics code to the Atucha-I and Atucha-II NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Andrew; Collins, Ben; Xu, Yunlin; Downar, Thomas [Purdue University, West Lafayette, IN (United States); Madariaga, Marcelo [Autoridad Nuclear Regulatoria, Buenos Aires (Argentina)

    2008-07-01

    In order to analyze Central Nuclear Atucha II (CNA-II) with coupled RELAP5/PARCS, extensive benchmarking of the neutronics codes HELIOS and PARCS was completed. This benchmarking was performed using a range of test problems designed in collaboration with NA-SA. HELIOS has previously been used to model CANDU systems, but the results were validated for this case as well. The validation of both HELIOS and PARCS was performed primarily by comparisons to MCNP results for the same problems. Though PARCS was originally designed to model light water systems, its capability to predict the performance of a pressurized heavy water reactor was validated. The other noteworthy issue was the control rods: because the insertion of the rods is oblique, a special routine was added to PARCS to treat this effect. Lattice-level and core-level calculations were compared to those of the corresponding NA-SA codes WIMS and PUMA. In all cases there was good agreement in the results, which provided confidence that the neutronics methods and the core neutronics modelling would not be a significant source of error in coupled RELAP5/PARCS calculations. (authors)

  2. Validation of the Continuum of Care Conceptual Model for Athletic Therapy

    Directory of Open Access Journals (Sweden)

    Mark R. Lafave

    2015-01-01

    Utilization of conceptual models in field-based emergency care currently borrows from existing standards of the medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events, including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury to an athlete's return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the AT profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached a priori 80% consensus on three domains of interest: (1) heading descriptors; (2) the order of the model; (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline.

  3. Cross validation for the classical model of structured expert judgment

    International Nuclear Information System (INIS)

    Colson, Abigail R.; Cooke, Roger M.

    2017-01-01

    We update the 2008 TU Delft structured expert judgment database with data from 33 professionally contracted Classical Model studies conducted between 2006 and March 2015 to evaluate its performance relative to other expert aggregation models. We briefly review alternative mathematical aggregation schemes, including harmonic weighting, before focusing on linear pooling of expert judgments with equal weights and performance-based weights. Performance weighting outperforms equal weighting in all but 1 of the 33 studies in-sample. True out-of-sample validation is rarely possible for Classical Model studies, and cross validation techniques that split calibration questions into a training and test set are used instead. Performance weighting incurs an “out-of-sample penalty” and its statistical accuracy out-of-sample is lower than that of equal weighting. However, as a function of training set size, the statistical accuracy of performance-based combinations reaches 75% of the equal weight value when the training set includes 80% of calibration variables. At this point the training set is sufficiently powerful to resolve differences in individual expert performance. The information of performance-based combinations is double that of equal weighting when the training set is at least 50% of the set of calibration variables. Previous out-of-sample validation work used a Total Out-of-Sample Validity Index based on all splits of the calibration questions into training and test subsets, which is expensive to compute and includes small training sets of dubious value. As an alternative, we propose an Out-of-Sample Validity Index based on averaging the product of statistical accuracy and information over all training sets sized at 80% of the calibration set. Performance weighting outperforms equal weighting on this Out-of-Sample Validity Index in 26 of the 33 post-2006 studies; the probability of 26 or more successes on 33 trials if there were no difference between performance
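
    The proposed Out-of-Sample Validity Index can be written down compactly. In the sketch below, accuracy_fn and information_fn are placeholders for the Classical Model's statistical-accuracy and information scores of the performance-weighted combination trained on the training set and evaluated on the test set; both names are assumptions of this illustration:

```python
from itertools import combinations
import numpy as np

def out_of_sample_validity(cal_ids, accuracy_fn, information_fn, frac=0.8):
    """Average accuracy * information over all training sets holding `frac`
    of the calibration questions (enumerable for the ~10-20 calibration
    questions typical of Classical Model studies)."""
    n_train = round(frac * len(cal_ids))
    scores = []
    for train in combinations(cal_ids, n_train):
        test = [q for q in cal_ids if q not in train]
        scores.append(accuracy_fn(train, test) * information_fn(train, test))
    return float(np.mean(scores))
```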

  4. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  5. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  6. Animal models of binge drinking, current challenges to improve face validity.

    Science.gov (United States)

    Jeanblanc, Jérôme; Rolland, Benjamin; Gierski, Fabien; Martinetti, Margaret P; Naassila, Mickael

    2018-05-05

    Binge drinking (BD), i.e., consuming a large amount of alcohol in a short period of time, is an increasing public health issue. Though no clear definition has been adopted worldwide, the speed of drinking seems to be a keystone of this behavior. Developing relevant animal models of BD is a priority for gaining a better characterization of the neurobiological and psychobiological mechanisms underlying this dangerous and harmful behavior. Until recently, preclinical research on BD was conducted mostly using forced administration of alcohol, but more recent studies have used scheduled access to alcohol to model more voluntary excessive intake and to achieve signs of intoxication that mimic the human behavior. The main challenges for future research are discussed with regard to the need for good face validity, construct validity and predictive validity of animal models of BD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Field validation of the contaminant transport model, FEMA

    International Nuclear Information System (INIS)

    Wong, K.-F.V.

    1986-01-01

    The work describes the validation with field data of a finite element model of material transport through aquifers (FEMA). Field data from the Idaho Chemical Processing Plant, Idaho, USA and from the 58th Street landfill in Miami, Florida, USA are used. In both cases the model was first calibrated and then integrated over a span of eight years to check on the predictive capability of the model. Both predictive runs gave results that matched well with available data. (author)

  8. Elasto-dynamic analysis of a gear pump-Part III: Experimental validation procedure and model extension to helical gears

    Science.gov (United States)

    Mucchi, E.; Dalpiaz, G.

    2015-01-01

    This work concerns external gear pumps for automotive applications, which operate at high speed and low pressure. In previous works of the authors (Part I and II, [1,2]), a non-linear lumped-parameter kineto-elastodynamic model for the prediction of the dynamic behaviour of external gear pumps was presented. It takes into account the most important phenomena involved in the operation of this kind of machine. The two main sources of noise and vibration are considered: pressure pulsation and gear meshing. The model has been used in order to foresee the influence of working conditions and design modifications on vibration generation. The model's experimental validation is a difficult task. Thus, Part III proposes a novel methodology for the validation carried out by the comparison of simulations and experimental results concerning forces and moments: it deals with the external and inertial components acting on the gears, estimated by the model, and the reactions and inertial components on the pump casing and the test plate, obtained by measurements. The validation is carried out comparing the level of the time synchronous average in the time domain and the waterfall maps in the frequency domain, with particular attention to identify system resonances. The validation results are satisfactory globally, but discrepancies are still present. Moreover, the assessed model has been properly modified for the application to a new virtual pump prototype with helical gears in order to foresee gear accelerations and dynamic forces. Part IV is focused on improvements in the modelling and analysis of the phenomena bound to the pressure evolution around the gears in order to achieve results closer to the measured values. As a matter of fact, the simulation results have shown that a variable meshing stiffness has a notable contribution on the dynamic behaviour of the pump but this is not as important as the pressure phenomena. As a consequence, the original model was modified with the
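
    Since the comparison relies on the level of the time synchronous average (TSA), a minimal sketch of how a TSA is commonly computed is given below, assuming the vibration signal has already been resampled to a fixed number of samples per shaft revolution (an illustration, not the authors' procedure):

```python
import numpy as np

def time_synchronous_average(signal, samples_per_rev):
    """Average a vibration signal over whole shaft revolutions, enhancing
    components synchronous with rotation (e.g., gear meshing) and
    attenuating asynchronous noise."""
    n_rev = len(signal) // samples_per_rev
    revs = np.reshape(signal[:n_rev * samples_per_rev], (n_rev, samples_per_rev))
    return revs.mean(axis=0)
```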

  9. TBscore II

    DEFF Research Database (Denmark)

    Rudolf, Frauke; Lemvik, Grethe; Abate, Ebba

    2013-01-01

    Abstract Background: The TBscore, based on simple signs and symptoms, was introduced to predict unsuccessful outcome in tuberculosis patients on treatment. A recent inter-observer variation study showed profound variation in some variables. Further, some variables depend on a physician assessing them, making the score less applicable. The aim of the present study was to simplify the TBscore. Methods: Inter-observer variation assessment and exploratory factor analysis were combined to develop a simplified score, the TBscore II. To validate TBscore II we assessed the association between start

  10. Validation of a regional distribution model in environmental risk assessment of substances

    Energy Technology Data Exchange (ETDEWEB)

    Berding, V.

    2000-06-26

    The regional distribution model SimpleBox, proposed in the TGD (Technical Guidance Document) and implemented in the EUSES software (European Union System for the Evaluation of Substances), was validated. The aim of this investigation was to determine the applicability and weaknesses of the model and to make proposals for improvement. The validation was performed using the scheme set up by SCHWARTZ (2000), the main aspects of which are the division into internal and external validation, i.e. into generic and task-specific properties of the model. These two validation parts comprise scrutiny of the theory, sensitivity analyses, comparison of predicted environmental concentrations with measured ones by means of scenario analyses, uncertainty analyses and comparison with alternative models. Generally, the model employed is a reasonable compromise between complexity and simplification. Simpler models are applicable too, but in many cases their results can deviate considerably from the measured values. For the sewage treatment model, it could be shown that its influence on the predicted concentration is very low and that a much simpler model fulfils its purpose in a similar way. It is proposed to improve the model in several ways, e.g. by including the pH/pK correction for dissociating substances or by alternative estimation functions for partition coefficients. But the main focus of future improvements should be on better release estimates and on substance characteristics such as degradation rates and partition coefficients.

  11. Neuro-evolutionary computing paradigm for Painlevé equation-II in nonlinear optics

    Science.gov (United States)

    Ahmad, Iftikhar; Ahmad, Sufyan; Awais, Muhammad; Ul Islam Ahmad, Siraj; Asif Zahoor Raja, Muhammad

    2018-05-01

    The aim of this study is to investigate the numerical treatment of the Painlevé equation-II arising in physical models of nonlinear optics through artificial intelligence procedures by incorporating a single layer structure of neural networks optimized with genetic algorithms, sequential quadratic programming and active set techniques. We constructed a mathematical model for the nonlinear Painlevé equation-II with the help of networks by defining an error-based cost function in mean square sense. The performance of the proposed technique is validated through statistical analyses by means of the one-way ANOVA test conducted on a dataset generated by a large number of independent runs.
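
    For concreteness, the Painlevé II equation takes the form y'' = 2y³ + xy + α, and the networks above are trained by minimizing a mean-square residual over collocation points. A toy sketch of such a cost function follows (single tanh layer, finite-difference second derivative; boundary terms and the genetic-algorithm/SQP/active-set optimizers are omitted, and ALPHA and all sizes are assumed values):

```python
import numpy as np

ALPHA = 1.0                       # Painleve II parameter (assumed value)

def trial(x, w):
    """Single-hidden-layer network: y(x) = sum_i c_i * tanh(a_i * x + b_i)."""
    a, b, c = np.split(w, 3)
    return np.tanh(np.outer(x, a) + b) @ c

def cost(w, x, h=1e-3):
    """Mean-square residual of y'' = 2y^3 + x*y + ALPHA on collocation points,
    with y'' approximated by central finite differences."""
    y = trial(x, w)
    ypp = (trial(x + h, w) - 2.0 * y + trial(x - h, w)) / h**2
    return np.mean((ypp - 2.0 * y**3 - x * y - ALPHA) ** 2)

# A GA/SQP hybrid would minimize `cost`; here a crude random-search stub:
x = np.linspace(-2.0, 2.0, 50)
best = min((np.random.default_rng(s).normal(size=30) for s in range(200)),
           key=lambda w: cost(w, x))
print("best cost:", cost(best, x))
```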

  12. External validation of EPIWIN biodegradation models.

    Science.gov (United States)

    Posthumus, R; Traas, T P; Peijnenburg, W J G M; Hulzebos, E M

    2005-01-01

    The BIOWIN biodegradation models were evaluated for their suitability for regulatory purposes. BIOWIN includes the linear and non-linear BIODEG and MITI models for estimating the probability of rapid aerobic biodegradation and an expert survey model for primary and ultimate biodegradation estimation. Experimental biodegradation data for 110 newly notified substances were compared with the estimations of the different models. The models were applied separately and in combinations to determine which model(s) showed the best performance. The results of this study were compared with the results of other validation studies and other biodegradation models. The BIOWIN models predict not-readily biodegradable substances with high accuracy, in contrast to readily biodegradable ones. In view of the high environmental concern over persistent chemicals, and in view of the large number of not-readily biodegradable chemicals compared to readily biodegradable ones, a model is preferred that gives a minimum of false positives without a correspondingly high percentage of false negatives. A combination of the BIOWIN models (BIOWIN2 or BIOWIN6) showed the highest predictive value for identifying not-readily biodegradable substances. However, the highest score for overall predictivity with the lowest percentage of false predictions was achieved by applying BIOWIN3 (pass level 2.75) and BIOWIN6.
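
    The trade-off discussed here, few false positives (persistent chemicals wrongly called readily biodegradable) without a correspondingly high false-negative rate, is ordinary confusion-matrix bookkeeping. A generic sketch, taking "positive" as a readily-biodegradable call and not tied to the BIOWIN data:

```python
import numpy as np

def screening_metrics(y_true, y_pred):
    """Confusion-matrix summary; 1 = readily biodegradable, 0 = not.
    A false positive (persistent chemical called 'ready') is the costly error."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = int(np.sum((y_pred == 1) & (y_true == 1)))
    fp = int(np.sum((y_pred == 1) & (y_true == 0)))
    tn = int(np.sum((y_pred == 0) & (y_true == 0)))
    fn = int(np.sum((y_pred == 0) & (y_true == 1)))
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "accuracy": (tp + tn) / len(y_true),
            "false_positive_rate": fp / (fp + tn)}
```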

  13. Validation of the STAFF-5 computer model

    International Nuclear Information System (INIS)

    Fletcher, J.F.; Fields, S.R.

    1981-04-01

    STAFF-5 is a dynamic heat-transfer-fluid-flow stress model designed for computerized prediction of the temperature-stress performance of spent LWR fuel assemblies under storage/disposal conditions. Validation of the temperature-calculating abilities of this model was performed by comparing temperature calculations under specified conditions to experimental data from the Engine Maintenance and Disassembly (EMAD) Fuel Temperature Test Facility and to calculations performed by Battelle Pacific Northwest Laboratory (PNL) using the HYDRA-1 model. The comparisons confirmed the ability of STAFF-5 to calculate representative fuel temperatures over a considerable range of conditions, as a first step in the evaluation and prediction of fuel temperature-stress performance

  14. The Validation of a Beta-Binomial Model for Overdispersed Binomial Data.

    Science.gov (United States)

    Kim, Jongphil; Lee, Ji-Hyun

    2017-01-01

    The beta-binomial model has been widely used as an analytically tractable alternative that captures the overdispersion of an intra-correlated binomial random variable X. However, model validation for X has rarely been investigated. As a beta-binomial mass function takes on a few different shapes, the model validation is examined for each of the classified shapes in this paper. Further, the mean square error (MSE) is illustrated for each shape for the maximum likelihood estimator (MLE) based on a beta-binomial model approach and for the method of moments estimator (MME), in order to gauge when and how much the MLE is biased.
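
    For a beta-binomial variable X with known trial count n, the two estimators compared above can be sketched directly; the moment formulas below are the standard ones, and the MLE uses SciPy's beta-binomial distribution (the data here are simulated, with assumed parameters):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import betabinom

def mme(x, n):
    """Method-of-moments estimates of (alpha, beta) for X ~ BetaBinomial(n, a, b)."""
    m1, m2 = x.mean(), (x**2).mean()
    denom = n * (m2 / m1 - m1 - 1) + m1
    return (n * m1 - m2) / denom, (n - m1) * (n - m2 / m1) / denom

def mle(x, n):
    """Maximum-likelihood estimates, started from the moment estimates."""
    nll = lambda p: -betabinom.logpmf(x, n, p[0], p[1]).sum()
    res = minimize(nll, x0=np.maximum(mme(x, n), 0.1),
                   bounds=[(1e-6, None)] * 2, method="L-BFGS-B")
    return tuple(res.x)

x = betabinom.rvs(20, 2.0, 5.0, size=500, random_state=1)   # assumed truth: a=2, b=5
print("MME:", mme(x, 20), "MLE:", mle(x, 20))
```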

  15. Homology modeling and docking of AahII-Nanobody complexes reveal the epitope binding site on AahII scorpion toxin.

    Science.gov (United States)

    Ksouri, Ayoub; Ghedira, Kais; Ben Abderrazek, Rahma; Shankar, B A Gowri; Benkahla, Alia; Bishop, Ozlem Tastan; Bouhaouala-Zahar, Balkiss

    2018-02-19

    Scorpion envenoming and its treatment is a public health problem in many parts of the world, due to highly toxic venom polypeptides diffusing rapidly within the body of severely envenomed victims. Recently, 38 AahII-specific Nanobody sequences (Nbs) were retrieved, from which the ability of the NbAahII10 nanobody candidate to neutralize the most poisonous venom compound, namely AahII, acting on sodium channels, was established. Herein, a structural computational approach is conducted to elucidate the Nb-AahII interactions that support the biological characteristics, using Nb multiple sequence alignment (MSA) followed by modeling and molecular docking investigations (RosettaAntibody, ZDOCK software tools). Sequence and structural analysis showed two dissimilar residues of the NbAahII10 CDR1 (Tyr27 and Tyr29) and an inserted polar residue Ser30 that appear to play an important role. Indeed, the CDR3 region of NbAahII10 is characterized by a specific Met104 and two negatively charged residues, Asp115 and Asp117. Complex dockings reveal that NbAahII17 and NbAahII38 share one common binding site on the surface of the AahII toxin, divergent from that of NbAahII10. At least a couple of NbAahII10 - AahII residue interactions (Gln38 - Asn44 and Arg62, His64, respectively) are mainly involved in the toxic AahII binding site. Altogether, this study gives valuable insights into the design and development of the next generation of antivenoms. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models

    DEFF Research Database (Denmark)

    Vehtari, Aki; Mononen, Tommi; Tolvanen, Ville

    2016-01-01

    The future predictive performance of a Bayesian model can be estimated using Bayesian cross-validation. In this article, we consider Gaussian latent variable models where the integration over the latent values is approximated using the Laplace method or expectation propagation (EP). We study the properties of several Bayesian leave-one-out (LOO) cross-validation approximations that in most cases can be computed with a small additional cost after forming the posterior approximation given the full data. Our main objective is to assess the accuracy of the approximative LOO cross-validation estimators...

  17. Recent validation studies for two NRPB environmental transfer models

    International Nuclear Information System (INIS)

    Brown, J.; Simmonds, J.R.

    1991-01-01

    The National Radiological Protection Board (NRPB) developed a dynamic model for the transfer of radionuclides through terrestrial food chains some years ago. This model, now called FARMLAND, predicts both instantaneous and time integrals of concentration of radionuclides in a variety of foods. The model can be used to assess the consequences of both accidental and routine releases of radioactivity to the environment; and results can be obtained as a function of time. A number of validation studies have been carried out on FARMLAND. In these the model predictions have been compared with a variety of sets of environmental measurement data. Some of these studies will be outlined in the paper. A model to predict external radiation exposure from radioactivity deposited on different surfaces in the environment has also been developed at NRPB. This model, called EXPURT (EXPosure from Urban Radionuclide Transfer), can be used to predict radiation doses as a function of time following deposition in a variety of environments, ranging from rural to inner-city areas. This paper outlines validation studies and future extensions to be carried out on EXPURT. (12 refs., 4 figs.)

  18. International Literature Review on WHODAS II (World Health Organization Disability Assessment Schedule II

    Directory of Open Access Journals (Sweden)

    Federici, Stefano

    2009-06-01

    This review is a critical analysis of the study and utilization of the World Health Organization Disability Assessment Schedule II (WHODAS II), as a basis for establishing specific criteria for evaluating the relevant international scientific literature. The WHODAS II is an instrument developed by the World Health Organisation in order to assess behavioural limitations and restrictions related to an individual's participation, independent of a medical diagnosis. This instrument was developed by the WHO's Assessment, Classification and Epidemiology Group within the framework of the WHO/NIH Joint Project on Assessment and Classification of Disablements. Our aim is to ascertain the level of international dissemination of the WHODAS II and, at the same time, to analyse the studies regarding the psychometric validation of WHODAS II translations and adaptations in other languages and geographical contexts. In particular, our goal is to highlight which psychometric features have been investigated, focusing on the factorial structure, the reliability, and the validity of this instrument. The international literature was searched through the main databases of indexed scientific production: the Cambridge Scientific Abstracts - CSA, PubMed, and Google Scholar, from 1990 through to December 2008. The following search terms were used: "whodas" in the query field, plus "title" and "abstract". The WHODAS II has been used in 54 studies, of which 51 are articles published in international journals, 2 are conference abstracts, and one is a dissertation abstract. Nevertheless, only 7 articles are published in journals and conference proceedings regarding disability and rehabilitation. The others have been published in medical and psychiatric journals, with the aim of identifying comorbidity correlations in clinical diagnoses concerning patients with mental illness. Just 8 out of 51 articles have studied the psychometric properties of the WHODAS II. The

  19. NRPB models for calculating the transfer of radionuclides through the environment. Verification and validation

    International Nuclear Information System (INIS)

    Attwood, C.; Barraclough, I.; Brown, J.

    1998-06-01

    There is a wide range of models available at NRPB to predict the transfer of radionuclides through the environment. Such models form an essential part of assessments of the radiological impact of releases of radionuclides into the environment. These models cover: the atmosphere; the aquatic environment; the geosphere; the terrestrial environment including foodchains. It is important that the models used for radiological impact assessments are robust, reliable and suitable for the assessment being undertaken. During model development it is, therefore, important that the model is both verified and validated. Verification of a model involves ensuring that it has been implemented correctly, while validation consists of demonstrating that the model is an adequate representation of the real environment. The extent to which a model can be verified depends on its complexity and whether similar models exist. For relatively simple models verification is straightforward, but for more complex models verification has to form part of the development, coding and testing of the model within quality assurance procedures. Validation of models should ideally consist of comparisons between the results of the models and experimental or environmental measurement data that were not used to develop the model. This is more straightforward for some models than for others depending on the quantity and type of data available. Validation becomes increasingly difficult for models which are intended to predict environmental transfer at long times or at great distances. It is, therefore, necessary to adopt qualitative validation techniques to ensure that the model is an adequate representation of the real environment. This report summarises the models used at NRPB to predict the transfer of radionuclides through the environment as part of a radiological impact assessment. It outlines the work carried out to verify and validate the models. The majority of these models are not currently available

  20. Validation by simulation of a clinical trial model using the standardized mean and variance criteria.

    Science.gov (United States)

    Abbas, Ismail; Rovira, Joan; Casanovas, Josep

    2006-12-01

    To develop and validate a model of a clinical trial that evaluates the changes in cholesterol level as a surrogate marker for lipodystrophy in HIV subjects under alternative antiretroviral regimes, i.e., treatment with protease inhibitors vs. a combination of nevirapine and other antiretroviral drugs. Five simulation models were developed based on different assumptions about treatment variability and the pattern of cholesterol reduction over time. The last recorded cholesterol level, the difference from the baseline, the average difference from the baseline, and the level evolution are the considered endpoints. Specific validation criteria, based on a standardized distance in means and variances within plus or minus 10%, were used to compare the real and the simulated data. The validity criterion was met by all models for the considered endpoints. However, only two models met the validity criterion when all endpoints were considered. The model based on the assumption that the within-subject variability of cholesterol levels changes over time is the one that minimizes the validity criterion, with a standardized distance of plus or minus 1% or less. Simulation is a useful technique for the calibration, estimation, and evaluation of models, which allows us to relax the often overly restrictive assumptions regarding parameters required by analytical approaches. The validity criterion can also be used to select the preferred model for design optimization, until additional data are obtained allowing an external validation of the model.
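
    One plausible reading of the plus-or-minus 10% criterion, sketched for a single endpoint (the exact standardization used in the study may differ):

```python
import numpy as np

def standardized_distances(real, sim):
    """Standardized distances between real and simulated endpoint samples."""
    d_mean = abs(sim.mean() - real.mean()) / real.std(ddof=1)
    d_var = abs(sim.var(ddof=1) - real.var(ddof=1)) / real.var(ddof=1)
    return d_mean, d_var

def is_valid(real, sim, tol=0.10):
    """Model accepted for this endpoint if both distances fall within +/- tol."""
    return all(d <= tol for d in standardized_distances(real, sim))
```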

  1. Validation of advanced NSSS simulator model for loss-of-coolant accidents

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.P.; Chang, S.K.; Huang, H.C. [Nuclear Training Branch, Northeast Utilities, Waterford, CT (United States)

    1995-09-01

    The replacement of the NSSS (Nuclear Steam Supply System) model on the Millstone 2 full-scope simulator has significantly increased its fidelity in simulating adverse conditions in the RCS. The new simulator NSSS model is a real-time derivative of the Nuclear Plant Analyzer by ABB. The thermal-hydraulic model is a five-equation, non-homogeneous model for water, steam, and non-condensible gases. The neutronic model is a three-dimensional nodal diffusion model. In order to certify the new NSSS model for operator training, an extensive validation effort has been performed by benchmarking the model performance against RELAP5/MOD2. This paper presents the validation results for the cases of small- and large-break loss-of-coolant accidents (LOCA). Detailed comparisons in the phenomena of reflux condensation, phase separation, and two-phase natural circulation are discussed.

  2. Comparison of measured and calculated composition of irradiated EBR-II blanket assemblies

    International Nuclear Information System (INIS)

    Grimm, K. N.

    1998-01-01

    In anticipation of processing irradiated EBR-II depleted uranium blanket subassemblies in the Fuel Conditioning Facility (FCF) at ANL-West, it has been possible to obtain a limited set of destructive chemical analyses of samples from a single EBR-II blanket subassembly. Comparison of calculated values with these measurements is being used to validate a depletion methodology based on a limited number of generic models of EBR-II to simulate the irradiation history of these subassemblies. Initial comparisons indicate these methods are adequate to meet the operations and material control and accountancy (MC&A) requirements for the FCF, but also indicate several shortcomings which may be corrected or improved

  3. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software, like Geant4 and GeantV, that aims to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we have to face the problem of comparing the results obtained using simulations with what the experiments actually observed. One way to solve the problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal

  4. Isotopes as validation tools for global climate models

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    2001-01-01

    Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order that people can have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin which has a long history of isotope collection and analysis and also of climate modelling: both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone

  5. Validation of a Wave-Body Interaction Model by Experimental Tests

    DEFF Research Database (Denmark)

    Ferri, Francesco; Kramer, Morten; Pecher, Arthur

    2013-01-01

    Within the wave energy field, numerical simulation has recently acquired a worldwide consent as being a useful tool, besides physical model testing. The main goal of this work is the validation of a numerical model by experimental results. The numerical model is based on a linear wave-body intera...

  6. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multiphysics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as for transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  7. Liver stiffness value-based risk estimation of late recurrence after curative resection of hepatocellular carcinoma: development and validation of a predictive model.

    Directory of Open Access Journals (Sweden)

    Kyu Sik Jung

    Full Text Available Preoperative liver stiffness (LS) measurement using transient elastography (TE) is useful for predicting late recurrence after curative resection of hepatocellular carcinoma (HCC). We developed and validated a novel LS value-based predictive model for late recurrence of HCC. Patients who were due to undergo curative resection of HCC between August 2006 and January 2010 were prospectively enrolled and TE was performed prior to operations by study protocol. The predictive model of late recurrence was constructed based on a multiple logistic regression model. Discrimination and calibration were used to validate the model. Among a total of 139 patients who were finally analyzed, late recurrence occurred in 44 patients, with a median follow-up of 24.5 months (range, 12.4-68.1). We developed a predictive model for late recurrence of HCC using LS value, activity grade II-III, presence of multiple tumors, and indocyanine green retention rate at 15 min (ICG R15), which showed fairly good discrimination capability with an area under the receiver operating characteristic curve (AUROC) of 0.724 (95% confidence intervals [CIs], 0.632-0.816). In the validation, using a bootstrap method to assess discrimination, the AUROC remained largely unchanged between iterations, with an average AUROC of 0.722 (95% CIs, 0.718-0.724). When we plotted a calibration chart for predicted and observed risk of late recurrence, the predicted risk of late recurrence correlated well with observed risk, with a correlation coefficient of 0.873 (P<0.001). A simple LS value-based predictive model could estimate the risk of late recurrence in patients who underwent curative resection of HCC.
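
    For illustration, a minimal Python sketch of the bootstrap assessment of discrimination (AUROC) described above; the data and model below are synthetic stand-ins, not the study's patient data:

        # Hypothetical example: bootstrap AUROC for a fitted logistic model.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(139, 4))      # stand-ins for LS value, activity grade, etc.
        y = rng.integers(0, 2, size=139)   # late recurrence (0/1), synthetic

        model = LogisticRegression().fit(X, y)
        aucs = []
        for _ in range(1000):              # bootstrap iterations
            idx = rng.integers(0, len(y), len(y))
            if len(np.unique(y[idx])) < 2: # resample must contain both classes
                continue
            p = model.predict_proba(X[idx])[:, 1]
            aucs.append(roc_auc_score(y[idx], p))
        print(np.mean(aucs), np.percentile(aucs, [2.5, 97.5]))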

  8. SWAT application in intensive irrigation systems: Model modification, calibration and validation

    Science.gov (United States)

    Dechmi, Farida; Burguete, Javier; Skhiri, Ahmed

    2012-11-01

    The Soil and Water Assessment Tool (SWAT) is a well established, distributed, eco-hydrologic model. However, using the case study of an intensively irrigated agricultural watershed, it was shown that none of the model versions is able to appropriately reproduce the total streamflow in such a system when the irrigation source is outside the watershed. The objective of this study was to modify the SWAT2005 version to correctly simulate the main hydrological processes. Crop yield, total streamflow, total suspended sediment (TSS) losses and phosphorus load calibration and validation were performed using field survey information and water quantity and quality data recorded during 2008 and 2009 in the Del Reguero irrigated watershed in Spain. The goodness of the calibration and validation results was assessed using five statistical measures, including the Nash-Sutcliffe efficiency (NSE). Results indicated that the average annual crop yield and actual evapotranspiration estimations were quite satisfactory. On a monthly basis, the values of NSE were 0.90 (calibration) and 0.80 (validation), indicating that the modified model could reproduce the observed streamflow accurately. The TSS losses were also satisfactorily estimated (NSE = 0.72 and 0.52 for the calibration and validation steps). The monthly temporal patterns and all the statistical parameters indicated that the modified SWAT-IRRIG model adequately predicted the total phosphorus (TP) loading. Therefore, the model could be used to assess the impacts of different best management practices on nonpoint phosphorus losses in irrigated systems.
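
    The Nash-Sutcliffe efficiency used to judge these fits is straightforward to compute; a minimal sketch with invented monthly values:

        # NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit.
        import numpy as np

        def nse(observed, simulated):
            observed, simulated = np.asarray(observed), np.asarray(simulated)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
                (observed - observed.mean()) ** 2)

        obs = np.array([3.1, 2.8, 4.0, 5.2, 4.4])   # hypothetical observed streamflow
        sim = np.array([3.0, 3.0, 3.8, 5.0, 4.6])   # hypothetical simulated streamflow
        print(nse(obs, sim))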

  9. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    Science.gov (United States)

    2014-01-01

    Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in data that were not used to develop the model (referred to as external validation). We critically appraised the methodological conduct and reporting of external validation studies of multivariable prediction models. Methods We conducted a systematic review of articles describing some form of external validation of one or more multivariable prediction models indexed in PubMed core clinical journals published in 2010. Study data were extracted in duplicate on design, sample size, handling of missing data, reference to the original study developing the prediction models and predictive performance measures. Results 11,826 articles were identified and 78 were included for full review, which described the evaluation of 120 prediction models in participant data that were not used to develop the model. Thirty-three articles described both the development of a prediction model and an evaluation of its performance on a separate dataset, and 45 articles described only the evaluation of an existing published prediction model on another dataset. Fifty-seven percent of the prediction models were presented and evaluated as simplified scoring systems. Sixteen percent of articles failed to report the number of outcome events in the validation datasets. Fifty-four percent of studies made no explicit mention of missing data. Sixty-seven percent did not report evaluating model calibration whilst most studies evaluated model discrimination. It was often unclear whether the reported performance measures were for the full regression model or for the simplified models. Conclusions The vast majority of studies describing some form of external validation of a multivariable prediction model were poorly reported with key details frequently not presented. The validation studies were characterised by poor design, inappropriate handling

  10. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation

    Science.gov (United States)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana

    2017-06-01

    A simulation model of a parabolic-trough solar collector developed in the Modelica® language is calibrated and validated. The calibration is performed in order to approximate the behavior of the solar collector model to that of a real one, given the uncertainty in some of the system parameters; measured data are used during the calibration process. Afterwards, the validation of this calibrated model is performed. During the validation, the results obtained from the model are compared to those obtained during real operation of a collector at the Plataforma Solar de Almeria (PSA).

  11. Experimental Testing Procedures and Dynamic Model Validation for Vanadium Redox Flow Battery Storage System

    DEFF Research Database (Denmark)

    Baccino, Francesco; Marinelli, Mattia; Nørgård, Per Bromand

    2013-01-01

    The paper aims at characterizing the electrochemical and thermal parameters of a 15 kW/320 kWh vanadium redox flow battery (VRB) installed in the SYSLAB test facility of the DTU Risø Campus and experimentally validating the proposed dynamic model realized in Matlab-Simulink. The adopted testing...... efficiency of the battery system. The test procedure has general validity and could also be used for other storage technologies. The storage model proposed and described is suitable for electrical studies and can represent a general model in terms of validity. Finally, the model simulation outputs...

  12. Pecan nutshell as biosorbent to remove Cu(II), Mn(II) and Pb(II) from aqueous solutions.

    Science.gov (United States)

    Vaghetti, Julio C P; Lima, Eder C; Royer, Betina; da Cunha, Bruna M; Cardoso, Natali F; Brasil, Jorge L; Dias, Silvio L P

    2009-02-15

    In the present study we report for the first time the feasibility of pecan nutshell (PNS, Carya illinoensis) as an alternative biosorbent to remove Cu(II), Mn(II) and Pb(II) metallic ions from aqueous solutions. The ability of PNS to remove the metallic ions was investigated using a batch biosorption procedure. The effects of pH and biosorbent dosage on the adsorption capacities of PNS were studied. Four kinetic models were tested, with the adsorption kinetics best fitted by a fractionary-order kinetic model. Besides that, the kinetic data were also fitted to an intra-particle diffusion model, presenting three linear regions, indicating that the kinetics of adsorption should follow multiple sorption rates. The equilibrium data were fitted to the Langmuir, Freundlich, Sips and Redlich-Peterson isotherm models. Taking into account a statistical error function, the data were best fitted to the Sips isotherm model. The maximum biosorption capacities of PNS were 1.35, 1.78 and 0.946 mmol g⁻¹ for Cu(II), Mn(II) and Pb(II), respectively.
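
    As a sketch of how such an isotherm fit can be performed, the Sips model q = qmax*(Ks*C)^n / (1 + (Ks*C)^n) fitted by nonlinear least squares; the data points below are invented, not the study's measurements:

        import numpy as np
        from scipy.optimize import curve_fit

        def sips(C, qmax, Ks, n):
            return qmax * (Ks * C) ** n / (1.0 + (Ks * C) ** n)

        C = np.array([0.1, 0.5, 1.0, 2.0, 4.0, 8.0])        # mmol/L, hypothetical
        q = np.array([0.30, 0.75, 1.00, 1.20, 1.30, 1.34])  # mmol/g, hypothetical
        popt, _ = curve_fit(sips, C, q, p0=[1.4, 1.0, 1.0], maxfev=10000)
        print(dict(zip(["qmax", "Ks", "n"], popt)))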

  13. German validation of the Conners Adult ADHD Rating Scales (CAARS) II: reliability, validity, diagnostic sensitivity and specificity.

    Science.gov (United States)

    Christiansen, H; Kis, B; Hirsch, O; Matthies, S; Hebebrand, J; Uekermann, J; Abdel-Hamid, M; Kraemer, M; Wiltfang, J; Graf, E; Colla, M; Sobanski, E; Alm, B; Rösler, M; Jacob, C; Jans, T; Huss, M; Schimmelmann, B G; Philipsen, A

    2012-07-01

    The German version of the Conners Adult ADHD Rating Scales (CAARS) has proven to show very high model fit in confirmatory factor analyses with the established factors inattention/memory problems, hyperactivity/restlessness, impulsivity/emotional lability, and problems with self-concept in both large healthy control and ADHD patient samples. This study now presents data on the psychometric properties of the German CAARS self-report (CAARS-S) and observer-report (CAARS-O) questionnaires. CAARS-S/O and questions on sociodemographic variables were filled out by 466 patients with ADHD and 847 healthy control subjects that already participated in two prior studies, and a total of 896 observer data sets were available. Cronbach's alpha was calculated to obtain internal reliability coefficients. Pearson correlations were performed to assess test-retest reliability, and concurrent, criterion, and discriminant validity. Receiver operating characteristic (ROC) analyses were used to establish sensitivity and specificity for all subscales. Coefficient alphas ranged from .74 to .95, and test-retest reliability from .85 to .92 for the CAARS-S, and from .65 to .85 for the CAARS-O. All CAARS subscales except problems with self-concept correlated significantly with the Barratt Impulsiveness Scale (BIS), but not with the Wender Utah Rating Scale (WURS). Criterion validity was established with ADHD subtype and diagnosis based on DSM-IV criteria. Sensitivity and specificity were high for all four subscales. The reported results confirm our previous study and show that the German CAARS-S/O do indeed represent a reliable and cross-culturally valid measure of current ADHD symptoms in adults. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
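
    Cronbach's alpha, the internal-reliability coefficient reported above, can be computed as follows; the respondent-by-item matrix is synthetic, not CAARS data:

        import numpy as np

        def cronbach_alpha(items):  # items: (n_respondents, n_items)
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(1)
        latent = rng.normal(size=(200, 1))                         # shared trait
        responses = latent + rng.normal(scale=0.7, size=(200, 6))  # 6 correlated items
        print(cronbach_alpha(responses))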

  14. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundancy relations (ARRs).

  15. Theoretical models for Type I and Type II supernova

    International Nuclear Information System (INIS)

    Woosley, S.E.; Weaver, T.A.

    1985-01-01

    Recent theoretical progress in understanding the origin and nature of Type I and Type II supernovae is discussed. New Type II presupernova models characterized by a variety of iron core masses at the time of collapse are presented and the sensitivity to the reaction rate ¹²C(α,γ)¹⁶O explained. Stars heavier than about 20 M☉ must explode by a "delayed" mechanism not directly related to the hydrodynamical core bounce, and a subset is likely to leave black hole remnants. The isotopic nucleosynthesis expected from these massive stellar explosions is in striking agreement with the sun. Type I supernovae result when an accreting white dwarf undergoes a thermonuclear explosion. The critical role of the velocity of the deflagration front in determining the light curve, spectrum, and, especially, isotopic nucleosynthesis in these models is explored. 76 refs., 8 figs

  16. Validating a perceptual distraction model using a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren

    2017-01-01

    This paper focuses on validating a perceptual distraction model, which aims to predict user's perceived distraction caused by audio-on-audio interference. Originally, the distraction model was trained with music targets and interferers using a simple loudspeaker setup, consisting of only two...... sound zones within the sound-zone system. Thus, validating the model using a different sound-zone system with both speech-on-music and music-on-speech stimuli sets. The results show that the model performance is equally good in both zones, i.e., with both speech- on-music and music-on-speech stimuli...

  17. Implementation and automated validation of the minimal Z' model in FeynRules

    International Nuclear Information System (INIS)

    Basso, L.; Christensen, N.D.; Duhr, C.; Fuks, B.; Speckner, C.

    2012-01-01

    We describe the implementation of a well-known class of U(1) gauge models, the 'minimal' Z' models, in FeynRules. We also describe a new automated validation tool for FeynRules models which is controlled by a web interface and allows the user to run a complete set of 2 → 2 processes on different matrix element generators, in different gauges, and compare between them all. Where they exist, comparisons with independent implementations are also possible. This tool has been used to validate our implementation of the 'minimal' Z' models. (authors)

  18. Developing a model for hospital inherent safety assessment: Conceptualization and validation.

    Science.gov (United States)

    Yari, Saeed; Akbari, Hesam; Gholami Fesharaki, Mohammad; Khosravizadeh, Omid; Ghasemi, Mohammad; Barsam, Yalda; Akbari, Hamed

    2018-01-01

    Paying attention to the safety of hospitals, as the most crucial institutions for providing medical and health services, wherein a bundle of facilities, equipment, and human resources exists, is of significant importance. The present research aims at developing a model for assessing hospitals' safety based on principles of inherent safety design. Face validity (30 experts), content validity (20 experts), construct validity (268 samples), convergent validity, and divergent validity were employed to validate the prepared questionnaire; and item analysis, Cronbach's alpha test, the ICC test (to measure reliability of the test), and the composite reliability coefficient were used to measure primary reliability. The relationship between variables and factors was confirmed at the 0.05 significance level by conducting confirmatory factor analysis (CFA) and the structural equation modeling (SEM) technique with the use of Smart-PLS. R-square and factor loading values, which were higher than 0.67 and 0.300 respectively, indicated a strong fit. Moderation (0.970), simplification (0.959), substitution (0.943), and minimization (0.5008) had the greatest weights in determining the inherent safety of a hospital, respectively. Among these dimensions, moderation, simplification, and substitution carry more weight in inherent safety, while minimization carries the least, which could be due to its definition as minimizing risk.

  19. A proposed strategy for the validation of ground-water flow and solute transport models

    International Nuclear Information System (INIS)

    Davis, P.A.; Goodrich, M.T.

    1991-01-01

    Ground-water flow and transport models can be thought of as a combination of conceptual and mathematical models and the data that characterize a given system. The judgment of the validity or invalidity of a model depends on both the adequacy of the data and the model structure (i.e., the conceptual and mathematical model). This report proposes a validation strategy for testing both components independently. The strategy is based on the philosophy that a model cannot be proven valid, only invalid or not invalid. In addition, the authors believe that a model should not be judged in the absence of its intended purpose. Hence, a flow and transport model may be invalid for one purpose but not invalid for another. 9 refs

  20. Sewer solids separation by sedimentation--the problem of modeling, validation and transferability.

    Science.gov (United States)

    Kutzner, R; Brombach, H; Geiger, W F

    2007-01-01

    Sedimentation of sewer solids in tanks, ponds and similar devices is the most relevant process for the treatment of stormwater and combined sewer overflows in urban collecting systems. In the past, a lot of research work was done to develop deterministic models for the description of this separation process. However, these modern models are still not commonly accepted in Germany. Water authorities are sceptical with regard to model validation and transferability. This paper examines whether this scepticism is reasonable. A framework-proposal for the validation of mathematical models with zero- or one-dimensional spatial resolution for particle separation processes for stormwater and combined sewer overflow treatment is presented. This proposal was applied to publications of repute on sewer solids separation by sedimentation. The result was that none of the investigated models described in the literature passed the validation entirely. There is an urgent need for future research in sewer solids sedimentation and remobilization!

  1. Dynamic modeling and experimental validation for direct contact membrane distillation (DCMD) process

    KAUST Repository

    Eleiwi, Fadi

    2016-02-01

    This work proposes a mathematical dynamic model for the direct contact membrane distillation (DCMD) process. The model is based on a 2D Advection–Diffusion Equation (ADE), which describes the heat and mass transfer mechanisms that take place inside the DCMD module. The model studies the behavior of the process in the time-varying and the steady state phases, contributing to understanding the process performance, especially when it is driven by an intermittent energy supply, such as solar energy. The model is experimentally validated in the steady state phase, where the permeate flux is measured for different feed inlet temperatures and the maximum absolute error recorded is 2.78 °C. Moreover, experimental validation includes the time variation phase, where the feed inlet temperature ranges from 30 °C to 75 °C with 0.1 °C increments every 2 min. The validation marks the relative error to be less than 5%, which indicates a strong correlation between the model predictions and the experiments.
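
    A much-simplified 1-D analogue of the advection-diffusion equation underlying the model can convey the numerical idea; the velocity, diffusivity, and grid below are arbitrary illustrations, not the paper's parameters:

        import numpy as np

        nx, dx, dt = 100, 0.01, 0.001
        u, D = 0.2, 1e-3                 # advection velocity and diffusivity (assumed)
        T = np.full(nx, 30.0)            # initial temperature field, deg C
        T[0] = 75.0                      # hot feed inlet boundary

        for _ in range(500):             # explicit upwind/central time stepping
            adv = -u * (T[1:-1] - T[:-2]) / dx
            dif = D * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
            T[1:-1] += dt * (adv + dif)
            T[0], T[-1] = 75.0, T[-2]    # fixed inlet, zero-gradient outlet
        print(T[::20])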

  2. Dynamic modeling and experimental validation for direct contact membrane distillation (DCMD) process

    KAUST Repository

    Eleiwi, Fadi; Ghaffour, NorEddine; Alsaadi, Ahmad Salem; Francis, Lijo; Laleg-Kirati, Taous-Meriem

    2016-01-01

    This work proposes a mathematical dynamic model for the direct contact membrane distillation (DCMD) process. The model is based on a 2D Advection–Diffusion Equation (ADE), which describes the heat and mass transfer mechanisms that take place inside the DCMD module. The model studies the behavior of the process in the time-varying and the steady state phases, contributing to understanding the process performance, especially when it is driven by an intermittent energy supply, such as solar energy. The model is experimentally validated in the steady state phase, where the permeate flux is measured for different feed inlet temperatures and the maximum absolute error recorded is 2.78 °C. Moreover, experimental validation includes the time variation phase, where the feed inlet temperature ranges from 30 °C to 75 °C with 0.1 °C increments every 2 min. The validation marks the relative error to be less than 5%, which indicates a strong correlation between the model predictions and the experiments.

  3. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    Full Text Available In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because the multiple experimental protocols employed in different laboratories presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis of experimental neurobiology.

  4. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been
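
    A conceptual sketch of an attenuated Bloom filter, one bit array per hop distance, can make the idea concrete; the sizes, hash construction, and service names below are assumptions, not the report's parameters:

        import hashlib

        M, K, DEPTH = 256, 3, 3                   # bits, hashes, hop layers

        def positions(item):
            return [int(hashlib.sha256(f"{item}:{i}".encode()).hexdigest(), 16) % M
                    for i in range(K)]

        layers = [[0] * M for _ in range(DEPTH)]  # layer d: services d hops away

        def add(item, hops):
            for p in positions(item):
                layers[hops][p] = 1

        def query(item):                          # smallest hop count that may hold item
            for d, layer in enumerate(layers):
                if all(layer[p] for p in positions(item)):
                    return d
            return None

        add("printer", 1)
        print(query("printer"), query("scanner"))  # -> 1 None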

  5. Validation of the WATEQ4 geochemical model for uranium

    International Nuclear Information System (INIS)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite [UO₂(OH)₂·H₂O], UO₂(OH)₂, and rutherfordine (UO₂CO₃) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions

  6. A practical guide for operational validation of discrete simulation models

    Directory of Open Access Journals (Sweden)

    Fabiano Leal

    2011-04-01

    Full Text Available As the number of simulation experiments increases, the necessity for validation and verification of these models demands special attention on the part of simulation practitioners. By analyzing the current scientific literature, it is observed that the descriptions of operational validation presented in many papers do not agree on the importance assigned to this process or on the techniques applied, whether subjective or objective. With the aim of orienting professionals, researchers and students in simulation, this article elaborates a practical guide through the compilation of statistical techniques for the operational validation of discrete simulation models. Finally, the guide's applicability was evaluated by using two study objects, which represent two manufacturing cells, one from the automobile industry and the other from a Brazilian tech company. For each application, the guide identified distinct steps, due to the different aspects that characterize the analyzed distributions.
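
    One of the objective techniques such a guide typically compiles is a two-sample test comparing real and simulated outputs; a minimal sketch with invented throughput measurements:

        from scipy import stats

        real = [41.2, 39.8, 40.5, 42.1, 40.9, 41.7]        # parts/hour, observed cell
        simulated = [40.6, 41.0, 39.9, 41.8, 40.2, 41.1]   # parts/hour, model runs
        t, p = stats.ttest_ind(real, simulated)
        print(f"t = {t:.3f}, p = {p:.3f}")  # p > 0.05: the model is not invalidated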

  7. Development and validation of models for bubble coalescence and breakup. Final report

    International Nuclear Information System (INIS)

    Liao, Y.; Lucas, D.

    2013-02-01

    A new generalized model for bubble coalescence and breakup has been developed. It is based on physical considerations and takes into account various mechanisms that can lead to bubble coalescence and breakup. First, in a detailed literature review, the available models were compiled and analyzed. It turned out that many of them show contradictory behaviour. None of these models allows the prediction of the evolution of bubble size distributions along a pipe flow for a wide range of combinations of flow rates of the gas and liquid phases. The new model has been extensively studied in a simplified Test-Solver. Although this does not cover all details of a developing flow along the pipe, it allows - in contrast to a CFD code - a large number of variational calculations to be conducted to investigate the influence of individual sizes and models. Coalescence and breakup cannot be considered separately from other phenomena and the models that reflect those phenomena. There are close interactions with the turbulence of the liquid phase and the momentum exchange between phases. Since the dissipation rate of turbulent kinetic energy is a direct input parameter for the new model, the turbulence modelling has been studied very carefully. To validate the model, a special experimental series for air-water flows was used, conducted at the TOPFLOW facility in an 8-meter long DN200 pipe. The data are characterized by high quality and were produced within the TOPFLOW-II project. The test series aims to provide a basis for the work presented here. Predicting the evolution of the bubble size distribution along the pipe could be improved significantly in comparison to the previous standard models for bubble coalescence and breakup implemented in CFX. However, some quantitative discrepancies remain. The full model equations as well as an implementation as "User-FORTRAN" in CFX are available and can be used for further work on the simulation of poly-disperse bubbly flows.

  8. The Adsorption of Cd(II) on Manganese Oxide Investigated by Batch and Modeling Techniques

    Directory of Open Access Journals (Sweden)

    Xiaoming Huang

    2017-09-01

    Full Text Available Manganese (Mn) oxide is a ubiquitous metal oxide in sub-environments. The adsorption of Cd(II) on Mn oxide as a function of adsorption time, pH, ionic strength, temperature, and initial Cd(II) concentration was investigated by batch techniques. The adsorption kinetics showed that the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by a pseudo-second-order kinetic model with high correlation coefficients (R² > 0.999). The adsorption of Cd(II) on Mn oxide significantly decreased with increasing ionic strength at pH < 5.0, whereas Cd(II) adsorption was independent of ionic strength at pH > 6.0, which indicated that outer-sphere and inner-sphere surface complexation dominated the adsorption of Cd(II) on Mn oxide at pH < 5.0 and pH > 6.0, respectively. The maximum adsorption capacity of Mn oxide for Cd(II) calculated from the Langmuir model was 104.17 mg/g at pH 6.0 and 298 K. The thermodynamic parameters showed that the adsorption of Cd(II) on Mn oxide was an endothermic and spontaneous process. According to the results of surface complexation modeling, the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by ion exchange sites (X₂Cd) at low pH and inner-sphere surface complexation sites (SOCd⁺ and (SO)₂CdOH⁻ species) at high pH conditions. The finding presented herein plays an important role in understanding the fate and transport of heavy metals at the water–mineral interface.
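
    For illustration, the pseudo-second-order fit mentioned above is often performed through its linearized form t/q_t = 1/(k2*qe^2) + t/qe; the kinetic data here are invented, not the study's:

        import numpy as np

        t = np.array([5, 10, 20, 40, 80, 160], dtype=float)    # contact time, min
        qt = np.array([35, 55, 75, 90, 98, 102], dtype=float)  # uptake, mg/g (made up)

        slope, intercept = np.polyfit(t, t / qt, 1)
        qe = 1.0 / slope             # equilibrium adsorption capacity, mg/g
        k2 = slope ** 2 / intercept  # pseudo-second-order rate constant
        r2 = np.corrcoef(t, t / qt)[0, 1] ** 2
        print(qe, k2, r2)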

  9. Development and Validation of a Predictive Model for Functional Outcome After Stroke Rehabilitation: The Maugeri Model.

    Science.gov (United States)

    Scrutinio, Domenico; Lanzillo, Bernardo; Guida, Pietro; Mastropasqua, Filippo; Monitillo, Vincenzo; Pusineri, Monica; Formica, Roberto; Russo, Giovanna; Guarnaschelli, Caterina; Ferretti, Chiara; Calabrese, Gianluigi

    2017-12-01

    Prediction of outcome after stroke rehabilitation may help clinicians in decision-making and planning rehabilitation care. We developed and validated a predictive tool to estimate the probability of achieving improvement in physical functioning (model 1) and a level of independence requiring no more than supervision (model 2) after stroke rehabilitation. The models were derived from 717 patients admitted for stroke rehabilitation. We used multivariable logistic regression analysis to build each model. Then, each model was prospectively validated in 875 patients. Model 1 included age, time from stroke occurrence to rehabilitation admission, admission motor and cognitive Functional Independence Measure scores, and neglect. Model 2 included age, male gender, time since stroke onset, and admission motor and cognitive Functional Independence Measure score. Both models demonstrated excellent discrimination. In the derivation cohort, the area under the curve was 0.883 (95% confidence intervals, 0.858-0.910) for model 1 and 0.913 (95% confidence intervals, 0.884-0.942) for model 2. The Hosmer-Lemeshow χ² was 4.12 (P=0.249) and 1.20 (P=0.754), respectively. In the validation cohort, the area under the curve was 0.866 (95% confidence intervals, 0.840-0.892) for model 1 and 0.850 (95% confidence intervals, 0.815-0.885) for model 2. The Hosmer-Lemeshow χ² was 8.86 (P=0.115) and 34.50 (P=0.001), respectively. Both improvement in physical functioning (hazard ratios, 0.43; 0.25-0.71; P=0.001) and a level of independence requiring no more than supervision (hazard ratios, 0.32; 0.14-0.68; P=0.004) were independently associated with improved 4-year survival. A calculator is freely available for download at https://goo.gl/fEAp81. This study provides researchers and clinicians with an easy-to-use, accurate, and validated predictive tool for potential application in rehabilitation research and stroke management. © 2017 American Heart Association, Inc.
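
    The Hosmer-Lemeshow calibration statistic quoted above can be sketched as follows for a generic fitted model; the outcomes below are simulated from the predicted probabilities themselves, so calibration should look good:

        import numpy as np
        from scipy.stats import chi2

        def hosmer_lemeshow(y, p, groups=10):
            order = np.argsort(p)
            y, p = np.asarray(y)[order], np.asarray(p)[order]
            chi_sq = 0.0
            for g in np.array_split(np.arange(len(y)), groups):
                obs, exp, n = y[g].sum(), p[g].sum(), len(g)
                chi_sq += (obs - exp) ** 2 / (exp * (1 - exp / n))
            return chi_sq, chi2.sf(chi_sq, groups - 2)

        rng = np.random.default_rng(2)
        p = rng.uniform(0.05, 0.95, 500)   # predicted risks (hypothetical)
        y = rng.binomial(1, p)             # observed outcomes
        print(hosmer_lemeshow(y, p))       # large p-value indicates good calibration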

  10. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  11. Towards a realistic approach to validation of reactive transport models for performance assessment

    International Nuclear Information System (INIS)

    Siegel, M.D.

    1993-01-01

    Performance assessment calculations are based on geochemical models that assume that interactions among radionuclides, rocks and groundwaters under natural conditions can be estimated or bound by data obtained from laboratory-scale studies. The data include radionuclide distribution coefficients, measured in saturated batch systems of powdered rocks, and retardation factors measured in short-term column experiments. Traditional approaches to model validation cannot be applied in a straightforward manner to the simple reactive transport models that use these data. An approach to model validation in support of performance assessment is described in this paper. It is based on a recognition of different levels of model validity and is compatible with the requirements of current regulations for high-level waste disposal. Activities that are being carried out in support of this approach include (1) laboratory and numerical experiments to test the validity of important assumptions inherent in current performance assessment methodologies, (2) integrated transport experiments, and (3) development of a robust coupled reaction/transport code for sensitivity analyses using massively parallel computers

  12. Results from the Savannah River Laboratory model validation workshop

    International Nuclear Information System (INIS)

    Pepper, D.W.

    1981-01-01

    To evaluate existing and newly developed air pollution models used in DOE-funded laboratories, the Savannah River Laboratory sponsored a model validation workshop. The workshop used Kr-85 measurements and meteorology data obtained at SRL during 1975 to 1977. Individual laboratories used models to calculate daily, weekly, monthly or annual test periods. Cumulative integrated air concentrations were reported at each grid point and at each of the eight sampler locations

  13. The Adsorption of Cd(II) on Manganese Oxide Investigated by Batch and Modeling Techniques.

    Science.gov (United States)

    Huang, Xiaoming; Chen, Tianhu; Zou, Xuehua; Zhu, Mulan; Chen, Dong; Pan, Min

    2017-09-28

    Manganese (Mn) oxide is a ubiquitous metal oxide in sub-environments. The adsorption of Cd(II) on Mn oxide as a function of adsorption time, pH, ionic strength, temperature, and initial Cd(II) concentration was investigated by batch techniques. The adsorption kinetics showed that the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by a pseudo-second-order kinetic model with high correlation coefficients (R² > 0.999). The adsorption of Cd(II) on Mn oxide significantly decreased with increasing ionic strength at pH < 5.0, whereas Cd(II) adsorption was independent of ionic strength at pH > 6.0, which indicated that outer-sphere and inner-sphere surface complexation dominated the adsorption of Cd(II) on Mn oxide at pH < 5.0 and pH > 6.0, respectively. The maximum adsorption capacity of Mn oxide for Cd(II) calculated from the Langmuir model was 104.17 mg/g at pH 6.0 and 298 K. The thermodynamic parameters showed that the adsorption of Cd(II) on Mn oxide was an endothermic and spontaneous process. According to the results of surface complexation modeling, the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by ion exchange sites (X₂Cd) at low pH and inner-sphere surface complexation sites (SOCd⁺ and (SO)₂CdOH⁻ species) at high pH conditions. The finding presented herein plays an important role in understanding the fate and transport of heavy metals at the water-mineral interface.

  14. The Adsorption of Cd(II) on Manganese Oxide Investigated by Batch and Modeling Techniques

    Science.gov (United States)

    Huang, Xiaoming; Chen, Tianhu; Zou, Xuehua; Zhu, Mulan; Chen, Dong

    2017-01-01

    Manganese (Mn) oxide is a ubiquitous metal oxide in sub-environments. The adsorption of Cd(II) on Mn oxide as a function of adsorption time, pH, ionic strength, temperature, and initial Cd(II) concentration was investigated by batch techniques. The adsorption kinetics showed that the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by a pseudo-second-order kinetic model with high correlation coefficients (R² > 0.999). The adsorption of Cd(II) on Mn oxide significantly decreased with increasing ionic strength at pH < 5.0, whereas Cd(II) adsorption was independent of ionic strength at pH > 6.0, which indicated that outer-sphere and inner-sphere surface complexation dominated the adsorption of Cd(II) on Mn oxide at pH < 5.0 and pH > 6.0, respectively. The maximum adsorption capacity of Mn oxide for Cd(II) calculated from the Langmuir model was 104.17 mg/g at pH 6.0 and 298 K. The thermodynamic parameters showed that the adsorption of Cd(II) on Mn oxide was an endothermic and spontaneous process. According to the results of surface complexation modeling, the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by ion exchange sites (X₂Cd) at low pH and inner-sphere surface complexation sites (SOCd⁺ and (SO)₂CdOH⁻ species) at high pH conditions. The finding presented herein plays an important role in understanding the fate and transport of heavy metals at the water–mineral interface. PMID:28956849

  15. Implementation and validation of the condensation model for containment hydrogen distribution studies

    International Nuclear Information System (INIS)

    Ravva, Srinivasa Rao; Iyer, Kannan N.; Gupta, S.K.; Gaikwad, Avinash J.

    2014-01-01

    Highlights: • A condensation model based on diffusion was implemented in FLUENT. • Validation of a condensation model for the H₂ distribution studies was performed. • Multi-component diffusion is used in the present work. • Appropriate grid and turbulence model were identified. - Abstract: This paper presents the implementation details of a condensation model in the CFD code FLUENT and its validation, so that it can be used in performing containment hydrogen distribution studies. In such studies, computational fluid dynamics simulations are necessary for obtaining accurate predictions. While steam condensation plays an important role, commercial CFD codes such as FLUENT do not have an in-built condensation model. Therefore, a condensation model was developed and implemented in the FLUENT code through user defined functions (UDFs) for the sink terms in the mass, momentum, energy and species balance equations, together with associated turbulence quantities, viz., kinetic energy and dissipation rate. The implemented model was validated against the ISP-47 test of the TOSQAN facility using the standard wall functions and enhanced wall treatment approaches. The most suitable grid size and turbulence model for low-density gas (He) distribution studies are brought out in this paper

  16. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
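
    The annualized prediction error quoted above reduces to a simple percentage of annual totals; a sketch with invented monthly energy sums, not NREL field data:

        # Monthly energy in MWh; numbers are illustrative only.
        measured = [12.1, 13.4, 16.0, 17.2, 18.9, 19.5, 19.8, 18.7, 16.4, 14.2, 12.0, 11.5]
        modeled  = [11.8, 13.9, 15.6, 17.5, 19.3, 19.1, 20.2, 18.2, 16.9, 13.8, 12.3, 11.2]

        error = (sum(modeled) - sum(measured)) / sum(measured) * 100.0
        print(f"annualized prediction error: {error:+.2f}%")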

  17. Validation of Hydrodynamic Numerical Model of a Pitching Wave Energy Converter

    DEFF Research Database (Denmark)

    López, Maria del Pilar Heras; Thomas, Sarah; Kramer, Morten Mejlhede

    2017-01-01

    Validation of numerical model is essential in the development of new technologies. Commercial software and codes available simulating wave energy converters (WECs) have not been proved to work for all the available and upcoming technologies yet. The present paper presents the first stages...... of the validation process of a hydrodynamic numerical model for a pitching wave energy converter. The development of dry tests, wave flume and wave basin experiments are going to be explained, lessons learned shared and results presented....

  18. Validating soil phosphorus routines in the SWAT model

    Science.gov (United States)

    Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routin...

  19. An integrated approach for the validation of energy and environmental system analysis models : used in the validation of the Flexigas Excel BioGas model

    NARCIS (Netherlands)

    Pierie, Frank; van Someren, Christian; Liu, Wen; Bekkering, Jan; Hengeveld, Evert Jan; Holstein, J.; Benders, René M.J.; Laugs, Gideon A.H.; van Gemert, Wim; Moll, Henri C.

    2016-01-01

    A review has been completed for a verification and validation (V&V) of the (Excel) BioGas simulator or EBS model. The EBS model calculates the environmental impact of biogas production pathways using Material and Energy Flow Analysis, time dependent dynamics, geographic information, and Life Cycle

  20. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through thickness stress distributions by validation to experimental measurements. Three austenitic stainless steel butt-welds cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available then a mixed hardening model should be used

  1. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))

    2009-06-15

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through thickness stress distributions by validation to experimental measurements. Three austenitic stainless steel butt-welds cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available then a mixed hardening model should be used

  2. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    Science.gov (United States)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry in order to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those which could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as a part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, by use of macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of country-specific vulnerability modifiers, by use of past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by use of "as-if" historical scenario simulations of three past earthquake events in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for client market portfolio align with the

  3. Artificial neural network (ANN) approach for modeling Zn(II) adsorption in batch process

    Energy Technology Data Exchange (ETDEWEB)

    Yildiz, Sayiter [Engineering Faculty, Cumhuriyet University, Sivas (Turkey)]

    2017-09-15

    Artificial neural networks (ANN) were applied to predict the adsorption efficiency of peanut shells for the removal of Zn(II) ions from aqueous solutions. Effects of initial pH, Zn(II) concentration, temperature, contact duration and adsorbent dosage were determined in batch experiments. The sorption capacities of the sorbents were predicted with the aid of equilibrium and kinetic models. The Zn(II) ion adsorption onto peanut shell was better defined by the pseudo-second-order kinetic model, for both initial pH and temperature. The highest R² value in the isotherm studies was obtained from the Freundlich isotherm for the inlet concentration and from the Temkin isotherm for the sorbent amount. The high R² values prove that modeling the adsorption process with ANN is a satisfactory approach. The experimental results and the results predicted by the ANN model were found to be highly compatible with each other.
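
    A rough analogue of the ANN approach, assuming a small feed-forward network and synthetic batch data (the paper's exact architecture and measurements are not reproduced here):

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(3)
        # columns: initial pH, Zn(II) conc., temperature, contact time, dosage
        X = rng.uniform([2, 10, 20, 5, 0.1], [8, 100, 50, 120, 2.0], size=(200, 5))
        y = 100 / (1 + np.exp(-(X[:, 0] - 5)))   # synthetic "removal %" response

        ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
        ann.fit(X, y)
        print(ann.score(X, y))                   # R^2 of the fitted network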

  4. Artificial neural network (ANN) approach for modeling Zn(II) adsorption in batch process

    International Nuclear Information System (INIS)

    Yildiz, Sayiter

    2017-01-01

    Artificial neural networks (ANN) were applied to predict the adsorption efficiency of peanut shells for the removal of Zn(II) ions from aqueous solutions. Effects of initial pH, Zn(II) concentration, temperature, contact duration and adsorbent dosage were determined in batch experiments. The sorption capacities of the sorbents were predicted with the aid of equilibrium and kinetic models. The Zn(II) ion adsorption onto peanut shell was better defined by the pseudo-second-order kinetic model, for both initial pH and temperature. The highest R² value in the isotherm studies was obtained from the Freundlich isotherm for the inlet concentration and from the Temkin isotherm for the sorbent amount. The high R² values prove that modeling the adsorption process with ANN is a satisfactory approach. The experimental results and the results predicted by the ANN model were found to be highly compatible with each other.

  5. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    Science.gov (United States)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from previous studies and experiments were used for the validation. The results showed that the correlation coefficients for the simulated and experimental water contents at different soil depths were between 0.83 and 0.92. The correlation coefficients for the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. With these high accuracies, the developed model can be used reliably to predict water contents at different soil depths and temperatures.

  6. Validation of NEPTUNE-CFD two-phase flow models using experimental data

    International Nuclear Information System (INIS)

    Perez-Manes, Jorge; Sanchez Espinoza, Victor Hugo; Bottcher, Michael; Stieglitz, Robert; Sergio Chiva Vicent

    2014-01-01

    This paper deals with the validation of the two-phase flow models of the CFD code NEPTUNE-CFD using experimental data provided by the OECD BWR BFBT and PSBT benchmarks. Since the two-phase flow models of CFD codes are being extensively improved, validation is a key step for the acceptability of such codes. The validation work was performed in the frame of the European NURISP Project and was focused on the steady state and transient void fraction tests. The influence of different NEPTUNE-CFD model parameters on the void fraction prediction is investigated and discussed in detail. Due to the coupling of the heat conduction solver SYRTHES with NEPTUNE-CFD, the description of the coupled fluid dynamics and heat transfer between the fuel rod and the fluid is improved significantly. The averaged void fraction predicted by NEPTUNE-CFD for selected PSBT and BFBT tests is in good agreement with the experimental data. Finally, areas for future improvements of the NEPTUNE-CFD code were identified, too. (authors)

  7. Flight Testing an Iced Business Jet for Flight Simulation Model Validation

    Science.gov (United States)

    Ratvasky, Thomas P.; Barnhart, Billy P.; Lee, Sam; Cooper, Jon

    2007-01-01

    A flight test of a business jet aircraft with various ice accretions was performed to obtain data to validate flight simulation models developed through wind tunnel tests. Three types of ice accretions were tested: pre-activation roughness, runback shapes that form downstream of the thermal wing ice protection system, and a wing ice protection system failure shape. The high fidelity flight simulation models of this business jet aircraft were validated using a software tool called "Overdrive." Through comparisons of flight-extracted aerodynamic forces and moments to simulation-predicted forces and moments, the simulation models were successfully validated. Only minor adjustments in the simulation database were required to obtain adequate match, signifying the process used to develop the simulation models was successful. The simulation models were implemented in the NASA Ice Contamination Effects Flight Training Device (ICEFTD) to enable company pilots to evaluate flight characteristics of the simulation models. By and large, the pilots confirmed good similarities in the flight characteristics when compared to the real airplane. However, pilots noted pitch up tendencies at stall with the flaps extended that were not representative of the airplane and identified some differences in pilot forces. The elevator hinge moment model and implementation of the control forces on the ICEFTD were identified as a driver in the pitch ups and control force issues, and will be an area for future work.

  8. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    Science.gov (United States)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation processes will use new/improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard the ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  9. Derivation and validation of a simple clinical risk-model in heart failure based on 6 minute walk test performance and NT-proBNP status--do we need specificity for sex and beta-blockers?

    Science.gov (United States)

    Frankenstein, L; Goode, K; Ingle, L; Remppis, A; Schellberg, D; Nelles, M; Katus, H A; Clark, A L; Cleland, J G F; Zugck, C

    2011-02-17

    It is unclear whether risk prediction strategies in chronic heart failure (CHF) need to be specific for sex or beta-blockers. We examined this problem and developed and validated the consequent risk models based on 6-minute-walk-test performance and NT-proBNP. The derivation cohort comprised 636 German patients with systolic dysfunction. They were validated against 676 British patients with similar aetiology. ROC-curves for 1-year mortality identified cut-off values separately for each specificity (none, sex, beta-blocker, both). Patients were grouped according to the number of cut-offs met (groups I/II/III - 0/1/2 cut-offs). The widest separation between groups was achieved with sex- and beta-blocker-specific cut-offs. In the derivation population, 1-year mortality was 0%, 8%, and 31% for groups I, II and III, respectively. In the validation population, 1-year mortality rates in the three risk groups were 2%, 7%, and 14%, respectively, after application of the same cut-offs. Risk stratification for CHF should perhaps take sex and beta-blocker usage into account. We derived and independently validated relevant risk models based on 6-minute-walk-tests and NT-proBNP. Specifying sex and use of beta-blockers identified three distinct sub-groups with widely differing prognosis. In clinical practice, it may be appropriate to tailor the intensity of follow-up and/or the treatment strategy according to the risk group. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.
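
    The grouping rule can be written down directly; the cut-off values below are placeholders, not the sex- and beta-blocker-specific values derived in the study:

        def risk_group(walk_m, nt_probnp, walk_cutoff=300.0, probnp_cutoff=2000.0):
            met = 0
            if walk_m < walk_cutoff:        # short 6-minute-walk distance
                met += 1
            if nt_probnp > probnp_cutoff:   # elevated NT-proBNP
                met += 1
            return ["I", "II", "III"][met]  # 0/1/2 cut-offs met

        print(risk_group(420, 800), risk_group(250, 3500))   # -> I III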

  10. Validating a perceptual distraction model in a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren

    2017-01-01

    This paper focuses on validating a perceptual distraction model, which aims to predict a user's perceived distraction caused by audio-on-audio interference, e.g., two competing audio sources within the same listening space. Originally, the distraction model was trained with music-on-music stimuli... that the model performance is equally good in both zones, i.e., with both speech-on-music and music-on-speech stimuli, and comparable to the previous validation round (RMSE approximately 10%). The results further confirm that the distraction model can be used as a valuable tool in evaluating and optimizing...

  11. A Component-Based Modeling and Validation Method for PLC Systems

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-05-01

    Programmable logic controllers (PLCs) are complex embedded systems that are widely used in industry. This paper presents a component-based modeling and validation method for PLC systems using the behavior-interaction-priority (BIP) framework. We designed a general system architecture and a component library for a type of device control system. The control software and the hardware of the environment were all modeled as BIP components. System requirements were formalized as monitors. Simulation was carried out to validate the system model. A realistic industrial example, a gates control system, was employed to illustrate our strategies. We found a couple of design errors during the simulation, which helped us to improve the dependability of the original systems. The experimental results demonstrate the effectiveness of our approach.
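    As a toy illustration of the method's ingredients (components composed with an environment, and a requirement formalized as a monitor), the Python sketch below mimics the gates example outside of BIP; the gate component, the scripted command trace, and the sensor trace are all invented for illustration.

```python
# Illustrative sketch only: a control component, a scripted environment,
# and a requirement monitor, composed in a stepwise simulation loop.
# This mimics the component/monitor idea in plain Python, not in BIP.

class Gate:
    """Gate controller component with a two-state behaviour model."""
    def __init__(self):
        self.state = "closed"

    def step(self, command):
        if command == "open" and self.state == "closed":
            self.state = "open"
        elif command == "close" and self.state == "open":
            self.state = "closed"

def monitor(gate, vehicle_in_lock, t):
    """Formalized requirement: the gate is never open while a vehicle is inside."""
    if gate.state == "open" and vehicle_in_lock:
        print(f"t={t}: requirement violated")

gate = Gate()
commands = ["open", "close", "open"]        # environment component (scripted)
vehicle_in_lock = [False, False, True]      # hypothetical sensor trace
for t, cmd in enumerate(commands):
    gate.step(cmd)
    monitor(gate, vehicle_in_lock[t], t)    # flags the design error at t=2
```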

  12. Modeling Type II-P/II-L Supernovae Interacting with Recent Episodic Mass Ejections from Their Presupernova Stars with MESA and SNEC

    Science.gov (United States)

    Das, Sanskriti; Ray, Alak

    2017-12-01

    We show how dense, compact, discrete shells of circumstellar gas immediately outside of red supergiants affect the optical light curves of Type II-P/II-L supernovae (SNe), using the example of SN 2013ej. Earlier efforts in the literature had used an artificial circumstellar medium (CSM) stitched to the surface of an evolved star that had not gone through a phase of late-stage heavy mass loss, which, in essence, is the original source of the CSM. In contrast, we allow an enhanced mass-loss rate from the modeled star during the oxygen (16O) and silicon (28Si) burning stages and construct the CSM from the resulting mass-loss history in a self-consistent way. Once such evolved pre-SN stars are exploded, we find that the models with early interaction between the shock and the dense CSM reproduce light curves far better than those without that mass loss and, hence, without a nearby dense CSM. The required explosion energy for the progenitors with a dense CSM is reduced by almost a factor of two compared to those without the CSM. Our model, with a more realistic CSM profile and presupernova and explosion parameters, fits the observed data much better throughout the rise, plateau, and radioactive tail phases than previous studies. This points to an intermediate class of supernovae between Type II-P/II-L and Type IIn SNe, with the characteristics of simultaneous UV and optical peaks, slow decline after peak, and a longer plateau.
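    The self-consistent construction of the CSM rests on the steady-wind continuity relation ρ(r) = Ṁ/(4πr²v_w): material now at radius r left the star a lookback time r/v_w before the explosion, so an episodic late-stage mass-loss history maps directly onto a dense shell in the density profile. A minimal sketch with an invented mass-loss history (not the MESA output used for SN 2013ej):

```python
# Sketch of building a circumstellar density profile from a mass-loss
# history via the steady-wind relation rho(r) = Mdot / (4*pi*r^2*v_w).
# All numbers are illustrative placeholders.
import numpy as np

MSUN, YR = 1.989e33, 3.156e7           # solar mass in g, year in s
v_w = 10e5                             # wind speed: 10 km/s in cm/s

def mdot(t_lookback_yr):
    """Hypothetical mass-loss history: 1e-6 Msun/yr quiescent wind,
    enhanced to 1e-3 Msun/yr during the last ~5 yr before explosion."""
    return np.where(t_lookback_yr < 5.0, 1e-3, 1e-6) * MSUN / YR

r = np.logspace(13.5, 16, 200)         # radius grid in cm
t_lookback = r / v_w / YR              # ejection lookback time in yr
rho = mdot(t_lookback) / (4 * np.pi * r**2 * v_w)  # g/cm^3
# rho now shows a dense inner shell (last ~5 yr) atop the quiescent wind.
```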

  13. Fuel temperature influence on the performance of a last generation common-rail diesel ballistic injector. Part II: 1D model development, validation and analysis

    International Nuclear Information System (INIS)

    Payri, R.; Salvador, F.J.; Carreres, M.; De la Morena, J.

    2016-01-01

    Highlights: • A 1D model of a solenoid common-rail ballistic injector is implemented in AMESim. • A detailed dimensional and a hydraulic characterization lead to a fair validation. • Fuel temperature influence on injector dynamics is assessed through 1D simulations. • Temperature impacts through changes in inlet orifice regime and viscous friction. • Cold fuel temperature leads to a slower injection opening due to high viscosity. - Abstract: A one-dimensional model of a solenoid-driven common-rail diesel injector has been developed in order to study the influence of fuel temperature on the injection process. The model has been implemented after a thorough characterization of the injector, from both the dimensional and the hydraulic points of view. In this sense, the experimental tools for determining the geometry of the injector lines and orifices are described in the paper, together with the hydraulic setup introduced to characterize the flow behaviour through the calibrated orifices. An extensive validation of the model has been performed by comparing the modelled mass flow rate against the experimental results introduced in the first part of the paper, which were obtained for different engine-like operating conditions involving a wide range of fuel temperatures, injection pressures and energizing times. In that first part of the study, an important influence of the fuel temperature was reported, especially in terms of the dynamic behaviour of the injector, due to its ballistic nature. The results from the model have made it possible to explain and further extend the findings of the experimental study by analyzing key features of the injector dynamics, such as the pressure drop established in the control volume due to the performance of the control orifices, or the forces due to viscous friction, and to assess their influence on the needle lift laws.

  14. Line profile studies of hydrodynamical models of cometary compact H II regions

    International Nuclear Information System (INIS)

    Zhu, Feng-Yao; Zhu, Qing-Feng

    2015-01-01

    We simulate the evolution of cometary H II regions based on several champagne flow models and bow shock models, and calculate the profiles of the [Ne II] fine-structure line at 12.81 μm, the H30α recombination line and the [Ne III] fine-structure line at 15.55 μm for these models at inclinations of 0°, 30° and 60°. We find that the profiles in the bow shock models are generally different from those in the champagne flow models, but the profiles in bow shock models with low stellar velocity (≤ 5 km s⁻¹) are similar to those in the champagne flow models. In the champagne flow models, both the velocity of peak flux and the flux-weighted central velocities of all three lines point outward from the molecular clouds. In the bow shock models, the directions of these velocities depend on the speed of the stars. The central velocities of these lines are consistent with the stellar motion in the high stellar speed cases, but they are in the opposite direction to the stellar motion in the low-speed cases. We notice that the line profiles from a slit along the symmetry axis of the projected 2D image of these models are useful for distinguishing bow shock models from champagne flow models. It is also confirmed by the calculation that the flux-weighted central velocity and the line luminosity of the [Ne III] line can be estimated from the [Ne II] line and the H30α line. (paper)

  15. Standardization of radioimmunoassay for dosage of angiotensin II (ang-II) and its methodological evaluation

    International Nuclear Information System (INIS)

    Mantovani, Milene; Mecawi, Andre S.; Elias, Lucila L.K.; Antunes-Rodrigues, Jose

    2011-01-01

    This paper standardizes the radioimmunoassay (RIA) for the measurement of ANG-II in rats under the experimental conditions of hypertonic saline (2%) administration, treatment with losartan (an ANG-II antagonist), water deprivation, and acute hemorrhage (25%). Plasma ANG-II was then extracted for the RIA, whose sensitivity was 1.95 pg/mL, with a detection range of 1.95 to 1000 pg/mL. Treatment with saline reduced the concentration of ANG-II, while the administration of losartan, water deprivation and hemorrhage increased the values relative to the control group. These results indicate variations in the plasma concentration of ANG-II according to the experimental protocols, validating the method for the evaluation of renin-angiotensin activity.

  16. Modeling of surge in free-spool centrifugal compressors : experimental validation

    NARCIS (Netherlands)

    Gravdahl, J.T.; Willems, F.P.T.; Jager, de A.G.; Egeland, O.

    2004-01-01

    The derivation of a compressor characteristic, and the experimental validation of a dynamic model for a variable speed centrifugal compressor using this characteristic, are presented. The dynamic compressor model of Fink et al. is used, and a variable speed compressor characteristic is derived by

  17. Sample size calculation to externally validate scoring systems based on logistic regression models.

    Directory of Open Access Journals (Sweden)

    Antonio Palazón-Bru

    A sample size containing at least 100 events and 100 non-events has been suggested for validating a predictive model, regardless of the model being validated, even though certain factors (discrimination, parameterization and incidence) can influence the calibration of the predictive model. Scoring systems based on binary logistic regression models are a specific type of predictive model. The aim of this study was to develop an algorithm to determine the sample size for validating a scoring system based on a binary logistic regression model, and to apply it to a case study. The algorithm was based on bootstrap samples in which the area under the ROC curve, the observed event probabilities through smooth curves, and a measure of the lack of calibration (the estimated calibration index) were calculated. To illustrate its use for interested researchers, the algorithm was applied to a scoring system, based on a binary logistic regression model, for determining mortality in intensive care units. In the case study provided, the algorithm obtained a sample size with 69 events, which is lower than the value suggested in the literature. An algorithm is provided for finding the appropriate sample size to validate scoring systems based on binary logistic regression models. This could be applied to determine the sample size in other similar cases.
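    A rough sketch of the bootstrap idea follows; it is not the authors' published algorithm, and the tolerance thresholds, the calibration-in-the-large measure used in place of the estimated calibration index, and the synthetic data are all assumptions made for illustration.

```python
# Sketch: for a candidate validation sample size n, repeatedly resample,
# compute the AUC and a crude calibration error, and record how often
# both fall within pre-set tolerances (assumes both classes are present
# in each resample, which is virtually certain for the sizes used here).
import numpy as np

rng = np.random.default_rng(0)

def auc(scores, y):
    """Mann-Whitney estimate of the area under the ROC curve."""
    pos, neg = scores[y == 1], scores[y == 0]
    return (pos[:, None] > neg[None, :]).mean()

def adequacy(probs, y, n, n_boot=500, auc_tol=0.05, cal_tol=0.05):
    ok, full_auc = 0, auc(probs, y)
    for _ in range(n_boot):
        idx = rng.choice(len(y), size=n, replace=True)
        p, yy = probs[idx], y[idx]
        cal_err = abs(p.mean() - yy.mean())        # calibration-in-the-large
        if abs(auc(p, yy) - full_auc) < auc_tol and cal_err < cal_tol:
            ok += 1
    return ok / n_boot

# Synthetic example: predicted probabilities from a hypothetical model
probs = rng.uniform(0, 1, 2000)
y = (rng.uniform(0, 1, 2000) < probs).astype(int)   # well-calibrated outcomes
print(adequacy(probs, y, n=400))                    # fraction of adequate resamples
```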

  18. Borderline personality disorder subscale (Chinese version) of the structured clinical interview for DSM-IV axis II personality disorders: a validation study in Cantonese-speaking Hong Kong Chinese.

    Science.gov (United States)

    Wong, H M; Chow, L Y

    2011-06-01

    Borderline personality disorder is an important but under-recognised clinical entity, for which there are only a few diagnostic instruments available in the Chinese language. None has been tested for its psychometric properties in the Cantonese-speaking population of Hong Kong. The present study aimed to assess the validity of the Chinese version of the Borderline Personality Disorder subscale of the Structured Clinical Interview for the Diagnostic and Statistical Manual of Mental Disorders Axis II Personality Disorders (SCID-II) in Cantonese-speaking Hong Kong Chinese. A convenience sampling method was used. The subjects were first seen by a multidisciplinary clinical team, who arrived at a best-estimate diagnosis, and then assessed by a SCID-II rater using the Chinese version of the Borderline Personality Disorder subscale. The study was carried out at the psychiatric clinic of the Prince of Wales Hospital in Hong Kong. A total of 87 patients of Chinese ethnicity aged 18 to 64 years who attended the clinic in April 2007 were recruited. These data were used to examine the internal consistency, the agreement between the best-estimate clinical diagnosis and the SCID diagnosis, and the sensitivity and specificity of the Chinese version of the subscale. The Borderline Personality Disorder subscale (Chinese version) of the SCID-II had an internal consistency (Cronbach's alpha) of 0.82, a best-estimate clinical diagnosis-SCID diagnosis agreement (kappa) of 0.82, a sensitivity of 0.92, and a specificity of 0.94. The Borderline Personality Disorder subscale (Chinese version) of the SCID-II rater therefore had reasonable validity when applied to Cantonese-speaking Chinese subjects in Hong Kong.
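    For reference, the reported quantities (Cronbach's alpha for internal consistency, kappa for diagnostic agreement, and sensitivity/specificity) can be computed as sketched below; the item scores and ratings are randomly generated stand-ins, not the study data, and the nine-item count is an assumption.

```python
# Toy computation of the three psychometric statistics quoted above.
import numpy as np

def cronbach_alpha(items):
    """items: (n_subjects, n_items) array of item scores."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                          / items.sum(axis=1).var(ddof=1))

def cohen_kappa(a, b):
    po = (a == b).mean()                              # observed agreement
    pe = sum((a == c).mean() * (b == c).mean()        # chance agreement
             for c in np.unique(np.r_[a, b]))
    return (po - pe) / (1 - pe)

def sens_spec(test, truth):
    tp = ((test == 1) & (truth == 1)).sum()
    tn = ((test == 0) & (truth == 0)).sum()
    return tp / (truth == 1).sum(), tn / (truth == 0).sum()

rng = np.random.default_rng(1)
items = rng.integers(0, 4, size=(87, 9))              # 9 hypothetical items
truth = rng.integers(0, 2, 87)                        # best-estimate diagnosis
test = np.where(rng.random(87) < 0.9, truth, 1 - truth)  # rater's diagnosis
print(cronbach_alpha(items), cohen_kappa(test, truth), sens_spec(test, truth))
```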

  19. [Reliability and Validity of the Korean Version of the Perinatal Post-Traumatic Stress Disorder Questionnaire].

    Science.gov (United States)

    Park, Yu Kyung; Ju, Hyeon Ok; Na, Hunjoo

    2016-02-01

    The Perinatal Post-Traumatic Stress Disorder Questionnaire (PPQ) was designed to measure post-traumatic symptoms related to childbirth and symptoms during the postnatal period. The purpose of this study was to develop a translated Korean version of the PPQ and to evaluate the reliability and validity of the Korean PPQ. Participants were 196 mothers at one to 18 months after childbirth, and data were collected through e-mails. The PPQ was translated into Korean using the translation guideline of the World Health Organization. For this study, Cronbach's alpha and split-half reliability were used to evaluate the reliability of the PPQ. Exploratory Factor Analysis (EFA), Confirmatory Factor Analysis (CFA), and known-group validity were used to examine construct validity. Correlations of the PPQ with the Impact of Event Scale (IES), Beck Depression Inventory II (BDI-II), and Beck Anxiety Inventory (BAI) were used to test the criterion validity of the PPQ. Cronbach's alpha and the Spearman-Brown split-half correlation coefficient were 0.91 and 0.77, respectively. EFA identified a 3-factor solution including arousal, avoidance, and intrusion factors, and CFA revealed the strongest support for the 3-factor model. The correlations of the PPQ with the IES, BDI-II, and BAI were .99, .60, and .72, respectively, indicating a high level of criterion validity. The Korean version of the PPQ is a useful tool for screening and assessing mothers experiencing emotional distress related to childbirth and during the postnatal period. The PPQ also reflects the diagnostic criteria for post-traumatic stress disorder well.

  20. Earth as an extrasolar planet: Earth model validation using EPOXI earth observations.

    Science.gov (United States)

    Robinson, Tyler D; Meadows, Victoria S; Crisp, David; Deming, Drake; A'hearn, Michael F; Charbonneau, David; Livengood, Timothy A; Seager, Sara; Barry, Richard K; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M; McFadden, Lucy A; Wellnitz, Dennis D

    2011-06-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward model can be
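    The RMS figures quoted above correspond to a simple relative-error metric between modelled and observed lightcurves. A toy version follows, with invented numbers standing in for the EPOXI photometry and the model output:

```python
# Sketch of the fit metric: RMS relative error between a modelled and an
# observed lightcurve, evaluated for one wavelength band.
import numpy as np

def rms_relative_error(model, observed):
    """RMS of (model - observed)/observed, expressed as a percentage."""
    return 100 * np.sqrt(np.mean(((model - observed) / observed) ** 2))

t = np.linspace(0, 24, 48)                            # hours over one rotation
observed = 0.20 + 0.03 * np.sin(2 * np.pi * t / 24)   # toy apparent albedo curve
model = observed * (1 + 0.02 * np.random.default_rng(2).standard_normal(48))
print(f"{rms_relative_error(model, observed):.1f}%")  # ~2%, cf. the <3% quoted
```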

  1. Validation of a Hot Water Distribution Model Using Laboratory and Field Data

    Energy Technology Data Exchange (ETDEWEB)

    Backman, C.; Hoeschele, M.

    2013-07-01

    Characterizing the performance of hot water distribution systems is a critical step in developing best practice guidelines for the design and installation of high performance hot water systems. Developing and validating simulation models is critical to this effort, as well as collecting accurate input data to drive the models. In this project, the ARBI team validated the newly developed TRNSYS Type 604 pipe model against both detailed laboratory and field distribution system performance data. Validation efforts indicate that the model performs very well in handling different pipe materials, insulation cases, and varying hot water load conditions. Limitations of the model include the complexity of setting up the input file and long simulation run times. In addition to completing validation activities, this project looked at recent field hot water studies to better understand use patterns and potential behavioral changes as homeowners convert from conventional storage water heaters to gas tankless units. Based on these datasets, we conclude that the current Energy Factor test procedure overestimates typical use and underestimates the number of hot water draws. This has implications for both equipment and distribution system performance. Gas tankless water heaters were found to impact how people use hot water, but the data does not necessarily suggest an increase in usage. Further study in hot water usage and patterns is needed to better define these characteristics in different climates and home vintages.

  2. Experimental validation of a thermodynamic boiler model under steady state and dynamic conditions

    International Nuclear Information System (INIS)

    Carlon, Elisa; Verma, Vijay Kumar; Schwarz, Markus; Golicza, Laszlo; Prada, Alessandro; Baratieri, Marco; Haslinger, Walter; Schmidl, Christoph

    2015-01-01

    Highlights: • Laboratory tests on two commercially available pellet boilers. • Steady state and dynamic load cycle tests. • Pellet boiler model calibration based on data registered in stationary operation. • Boiler model validation with reference to both stationary and dynamic operation. • Validated model suitable for coupled simulation of building and heating system. - Abstract: Nowadays, dynamic building simulation is an essential tool for the design of heating systems for residential buildings. The simulation of buildings heated by biomass systems first of all requires detailed boiler models, capable of simulating the boiler both as a stand-alone appliance and as a system component. This paper presents the calibration and validation of a boiler model by means of laboratory tests. The chosen model, i.e. TRNSYS “Type 869”, has been validated for two commercially available pellet boilers of 6 and 12 kW nominal capacity. Two test methods have been applied: the first is a steady state test at nominal load and the second is a load cycle test including stationary operation at different loads as well as transient operation. The load cycle test is representative of the boiler operation in the field and characterises the boiler’s stationary and dynamic behaviour. The model was calibrated using laboratory data registered during stationary operation at different loads, and was afterwards validated by simulating both the stationary and the dynamic tests. The selected parameters for the validation were the heat transfer rates to water and the water temperature profiles inside the boiler and at the boiler outlet. Modelling results showed better agreement with experimental data during stationary operation than during dynamic operation. Heat transfer rates to water were predicted with a maximum deviation of 10% during stationary operation, and a maximum deviation of 30% during the dynamic load cycle. However, for both operational regimes the

  3. Validation of Slosh Modeling Approach Using STAR-CCM+

    Science.gov (United States)

    Benson, David J.; Ng, Wanyi

    2018-01-01

    Without an adequate understanding of propellant slosh, the spacecraft attitude control system may be inadequate to control the spacecraft, or there may be an unexpected loss of science observation time due to longer slosh settling times. Computational fluid dynamics (CFD) is used to model propellant slosh. STAR-CCM+ is a commercially available CFD code. This paper seeks to validate the CFD modeling approach via a comparison between STAR-CCM+ liquid slosh modeling results and experimentally, empirically, and analytically derived results. The geometries examined are a bare right-cylinder tank and a right cylinder with a single ring baffle.

  4. Atmospheric corrosion: statistical validation of models

    International Nuclear Information System (INIS)

    Diaz, V.; Martinez-Luaces, V.; Guineo-Cobs, G.

    2003-01-01

    In this paper we discuss two different methods for the validation of regression models, applied to corrosion data. One is based on the correlation coefficient and the other on the statistical test of lack of fit. Both methods are used here to analyse the fit of the bilogarithmic model used to predict corrosion of very low carbon steel substrates in rural and urban-industrial atmospheres in Uruguay. Results for parameters A and n of the bilogarithmic model are reported here. For this purpose, all repeated values were used instead of the usual average values. Modelling was carried out using experimental data corresponding to steel substrates under the same initial meteorological conditions (in fact, they were placed in the rack at the same time). Results for the correlation coefficient are compared with the lack-of-fit test at two different significance levels (α=0.01 and α=0.05). Unexpected differences between them are explained and, finally, it is possible to conclude, at least for the studied atmospheres, that the bilogarithmic model does not properly fit the experimental data. (Author) 18 refs
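    The bilogarithmic model is C = A·tⁿ, fitted as log C = log A + n·log t; with replicated exposures, the residual sum of squares splits into pure error and lack of fit, which yields the F test mentioned above. A sketch on synthetic data (the corrosion values and replicate structure are invented):

```python
# Sketch of both checks: a log-log least-squares fit of C = A*t^n, then a
# lack-of-fit F test using the replicate measurements at each exposure time.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
t = np.repeat([1.0, 2.0, 3.0, 4.0], 3)              # exposure years, 3 replicates
C = 12.0 * t**0.6 * rng.lognormal(0, 0.05, t.size)  # toy corrosion losses

x, y = np.log(t), np.log(C)
slope, intercept, r, *_ = stats.linregress(x, y)    # n and log A; r = corr. coeff.

# Partition the residual sum of squares into lack-of-fit and pure error
fitted = intercept + slope * x
sse = ((y - fitted) ** 2).sum()
sspe = sum(((y[x == xv] - y[x == xv].mean()) ** 2).sum() for xv in np.unique(x))
m, n_obs = np.unique(x).size, x.size
F = ((sse - sspe) / (m - 2)) / (sspe / (n_obs - m))
p = stats.f.sf(F, m - 2, n_obs - m)
print(f"n={slope:.2f}, r={r:.3f}, lack-of-fit p={p:.3f}")
```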

  5. Computing Models of CDF and D0 in Run II

    International Nuclear Information System (INIS)

    Lammel, S.

    1997-05-01

    The next collider run of the Fermilab Tevatron, Run II, is scheduled for autumn of 1999. Both experiments, the Collider Detector at Fermilab (CDF) and the D0 experiment, are being modified to cope with the higher luminosity and shorter bunch spacing of the Tevatron. New detector components, higher event complexity, and an increased data volume require changes from the data acquisition systems up to the analysis systems. In this paper we present a summary of the computing models of the two experiments for Run II

  6. Computing Models of CDF and D0 in Run II

    International Nuclear Information System (INIS)

    Lammel, S.

    1997-01-01

    The next collider run of the Fermilab Tevatron, Run II, is scheduled for autumn of 1999. Both experiments, the Collider Detector at Fermilab (CDF) and the D0 experiment are being modified to cope with the higher luminosity and shorter bunch spacing of the Tevatron. New detector components, higher event complexity, and an increased data volume require changes from the data acquisition systems up to the analysis systems. In this paper we present a summary of the computing models of the two experiments for Run II

  7. Reactor core modeling practice: Operational requirements, model characteristics, and model validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1997-01-01

    The physical models implemented in power plant simulators have greatly increased in performance and complexity in recent years. This process has been enabled by the ever-increasing computing power available at affordable prices. This paper describes this process from several angles: first, the operational requirements which are most critical from the point of view of model performance, both for normal and off-normal operating conditions; a second section discusses core model characteristics in the light of the solutions implemented by Thomson Training and Simulation (TT&S) in several full-scope simulators recently built and delivered for Dutch, German, and French nuclear power plants; finally, we consider the model validation procedures, which are of course an integral part of model development, and which are becoming more and more severe as performance expectations increase. As a conclusion, it may be asserted that in the core modeling field, as in other areas, the general improvement in the quality of simulation codes has resulted in a fairly rapid convergence towards mainstream engineering-grade calculations. This is a remarkable performance in view of the stringent real-time requirements which the simulation codes must satisfy, as well as the extremely wide range of operating conditions that they are called upon to cover with good accuracy. (author)

  8. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data.

    Science.gov (United States)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L; Al, Maiwenn J

    2017-09-01

    The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of graphs and point estimates. Our aim was to discuss the applicability of existing validation techniques and to present a new method for quantifying the degree of validity statistically, which is useful for decision makers. A new Bayesian method is proposed to determine how well HE model outcomes compare with empirical data. Validity is based on a pre-established accuracy interval in which the model outcomes should fall. The method uses the outcomes of a probabilistic sensitivity analysis and results in a posterior distribution around the probability that HE model outcomes can be regarded as valid. We use a published diabetes model (Modelling Integrated Care for Diabetes based on Observational data) to validate the outcome "number of patients who are on dialysis or with end-stage renal disease." Results indicate that a high probability of a valid outcome is associated with relatively wide accuracy intervals. In particular, a 25% deviation from the observed outcome implied approximately 60% expected validity. Current practice in HE model validation can be improved by using an alternative method based on assessing whether the model outcomes fit the empirical data at a predefined level of accuracy. This method has the advantage of assessing both model bias and parameter uncertainty, and it results in a quantitative measure of the degree of validity that penalizes models that predict the mean of an outcome correctly but with overly wide credible intervals.
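    A minimal sketch of the underlying idea: count how many probabilistic-sensitivity-analysis outcomes fall inside the pre-established accuracy interval and place a posterior on the probability of a valid outcome. The Beta-Binomial form, the uniform prior, and all numbers below are illustrative assumptions, not the authors' exact formulation.

```python
# Sketch: posterior probability that the HE model outcome is "valid",
# i.e. falls within a pre-set accuracy interval around the observed value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
observed = 100.0                          # empirical outcome (e.g. dialysis count)
accuracy = 0.25                           # +/-25% deviation deemed acceptable
psa_outcomes = rng.normal(105, 20, 1000)  # toy PSA outcomes of the HE model

inside = ((psa_outcomes > observed * (1 - accuracy))
          & (psa_outcomes < observed * (1 + accuracy))).sum()
posterior = stats.beta(1 + inside, 1 + len(psa_outcomes) - inside)  # Beta(1,1) prior
print(f"expected validity: {posterior.mean():.2f}")
```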

  9. Model validation studies of water flow and radionuclide transport in vegetated soils using lysimeter data

    Energy Technology Data Exchange (ETDEWEB)

    Butler, A.; Jining Chen [Imperial College of Science, Technology and Medicine, London (United Kingdom)] [and others]

    1996-09-01

    Model Uncertainty and Validation was one of the four themes of BIOMOVS II which had been identified by the programme's steering committee. It arose out of a concern that biosphere assessment models are generally simplified representations of highly complex environmental systems which, therefore, include a degree of uncertainty in their outputs. This uncertainty may be due to inadequate representations of the physical, chemical and biological processes; issues associated with scaling up highly non-linear systems; and problems of model identification, in particular user interpretation. Therefore, during the course of the 5-year (1991-1996) BIOMOVS II programme, a number of working sub-groups were established to address these issues. This document is the final report of the Prediction of Upward Migration of Radionuclides in Lysimeters sub-group, which was established towards the end of the programme, late in 1994. It describes the 'blind' application of various hydrological and radiochemical transport models to experimental data derived from vegetated lysimeters. In order to investigate soil-to-plant transfer processes affecting the radionuclide migration from contaminated near-surface water tables into arable crops, a lysimeter experiment has been undertaken at Imperial College, funded by UK Nirex Ltd. Detailed observations of climate, soil hydrology, plant growth and radiochemical migration were collected on the uptake of various radionuclides by a winter wheat crop. A selected set of data was made available to members of BIOMOVS II in order to allow them to test relevant components of current versions of assessment codes. This was a challenging task owing to the rather unusual experimental design, in particular, the introduction of radionuclides at the base of the lysimeter, 5 cm below a fixed water table, and their subsequent upward migration through the soil. The comprehensive hydrological data set available provided various modelers, particularly those

  10. Model validation studies of water flow and radionuclide transport in vegetated soils using lysimeter data

    International Nuclear Information System (INIS)

    Butler, A.; Jining Chen

    1996-09-01

    Model Uncertainty and Validation was one of the four themes of BIOMOVS II which had been identified by the programme's steering committee. It arose out of a concern that biosphere assessment models are generally simplified representations of highly complex environmental systems which, therefore, include a degree of uncertainty in their outputs. This uncertainty may be due to inadequate representations of the physical, chemical and biological processes; issues associated with scaling up highly non-linear systems; and problems of model identification, in particular user interpretation. Therefore, during the course of the 5-year (1991-1996) BIOMOVS II programme, a number of working sub-groups were established to address these issues. This document is the final report of the Prediction of Upward Migration of Radionuclides in Lysimeters sub-group, which was established towards the end of the programme, late in 1994. It describes the 'blind' application of various hydrological and radiochemical transport models to experimental data derived from vegetated lysimeters. In order to investigate soil-to-plant transfer processes affecting the radionuclide migration from contaminated near-surface water tables into arable crops, a lysimeter experiment has been undertaken at Imperial College, funded by UK Nirex Ltd. Detailed observations of climate, soil hydrology, plant growth and radiochemical migration were collected on the uptake of various radionuclides by a winter wheat crop. A selected set of data was made available to members of BIOMOVS II in order to allow them to test relevant components of current versions of assessment codes. This was a challenging task owing to the rather unusual experimental design, in particular, the introduction of radionuclides at the base of the lysimeter, 5 cm below a fixed water table, and their subsequent upward migration through the soil. The comprehensive hydrological data set available provided various modelers, particularly those involved in tritium

  11. Selection, calibration, and validation of models of tumor growth.

    Science.gov (United States)

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach is presented that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine whether the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory

  12. Validation of a FAST Model of the SWAY Prototype Floating Wind Turbine

    Energy Technology Data Exchange (ETDEWEB)

    Koh, J. H. [Nanyang Technological Univ. (Singapore); Ng, E. Y. K. [Nanyang Technological Univ. (Singapore); Robertson, Amy [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jonkman, Jason [National Renewable Energy Lab. (NREL), Golden, CO (United States); Driscoll, Frederick [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-06-01

    As part of a collaboration of the National Renewable Energy Laboratory (NREL) and SWAY AS, NREL installed scientific wind, wave, and motion measurement equipment on the spar-type 1/6.5th-scale prototype SWAY floating offshore wind system. The equipment enhanced SWAY's data collection and allowed SWAY to verify the concept and NREL to validate a FAST model of the SWAY design in an open-water condition. Nanyang Technological University (NTU), in collaboration with NREL, assisted with the validation. This final report gives an overview of the SWAY prototype and NREL and NTU's efforts to validate a model of the system. The report provides a summary of the different software tools used in the study, the modeling strategies, and the development of a FAST model of the SWAY prototype wind turbine, including justification of the modeling assumptions. Because of uncertainty in system parameters and modeling assumptions due to the complexity of the design, several system properties were tuned to better represent the system and improve the accuracy of the simulations. Calibration was performed using data from a static equilibrium test and free-decay tests.

  13. Refining and validating a conceptual model of Clinical Nurse Leader integrated care delivery.

    Science.gov (United States)

    Bender, Miriam; Williams, Marjory; Su, Wei; Hites, Lisle

    2017-02-01

    To empirically validate a conceptual model of Clinical Nurse Leader integrated care delivery. There is limited evidence of frontline care delivery models that consistently achieve quality patient outcomes. Clinical Nurse Leader integrated care delivery is a promising nursing model with a growing record of success. However, theoretical clarity is necessary to generate causal evidence of effectiveness. Sequential mixed methods. A preliminary Clinical Nurse Leader practice model was refined and survey items developed to correspond with model domains, using focus groups and a Delphi process with a multi-professional expert panel. The survey was administered in 2015 to clinicians and administrators involved in Clinical Nurse Leader initiatives. Confirmatory factor analysis and structural equation modelling were used to validate the measurement and model structure. The final sample was n = 518. The model incorporates 13 components organized into five conceptual domains: 'Readiness for Clinical Nurse Leader integrated care delivery'; 'Structuring Clinical Nurse Leader integrated care delivery'; 'Clinical Nurse Leader Practice: Continuous Clinical Leadership'; 'Outcomes of Clinical Nurse Leader integrated care delivery'; and 'Value'. The sample data had good fit with the specified model and two-level measurement structure. All hypothesized pathways were significant, with strong coefficients suggesting good fit between theorized and observed path relationships. The validated model articulates an explanatory pathway of Clinical Nurse Leader integrated care delivery, including Clinical Nurse Leader practices that result in improved care dynamics and patient outcomes. The validated model provides a basis for testing in practice to generate evidence that can be deployed across the healthcare spectrum.

  14. Predictive Simulation of Material Failure Using Peridynamics -- Advanced Constitutive Modeling, Verification and Validation

    Science.gov (United States)

    2016-03-31

    AFRL-AFOSR-VA-TR-2016-0309: Predictive simulation of material failure using peridynamics - advanced constitutive modeling, verification, and validation. Approved for public release.

  15. Adsorption of Pb(II), Cu(II), Cd(II), Zn(II), Ni(II), Fe(II), and As(V) on bacterially produced metal sulfides.

    Science.gov (United States)

    Jong, Tony; Parry, David L

    2004-07-01

    The adsorption of Pb(II), Cu(II), Cd(II), Zn(II), Ni(II), Fe(II) and As(V) onto bacterially produced metal sulfide (BPMS) material was investigated using a batch equilibrium method. It was found that the sulfide material had adsorptive properties comparable with those of other adsorbents, with respect both to the specific uptake of a range of metals and to the levels to which dissolved metal concentrations in solution can be reduced. The percentage of adsorption increased with increasing pH and adsorbent dose, but decreased with increasing initial dissolved metal concentration. The pH of the solution was the most important parameter controlling the adsorption of Cd(II), Cu(II), Fe(II), Ni(II), Pb(II), Zn(II), and As(V) by BPMS. The adsorption data were successfully modeled using the Langmuir adsorption isotherm. Desorption experiments showed that the reversibility of adsorption was low, suggesting high-affinity adsorption governed by chemisorption. The mechanism of adsorption for the divalent metals was thought to be the formation of strong, inner-sphere complexes involving surface hydroxyl groups. However, the mechanism for the adsorption of As(V) by BPMS appears to be distinct from that of surface hydroxyl exchange. These results have important implications for the management of metal sulfide sludge produced by bacterial sulfate reduction.
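    A sketch of a Langmuir fit of the kind reported above, using scipy's curve_fit on synthetic batch-equilibrium data (the capacities and constants are invented, not the BPMS values):

```python
# Sketch: fit the Langmuir isotherm q = q_max*K*C / (1 + K*C) to
# equilibrium concentration/uptake pairs.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    return q_max * K * C / (1 + K * C)

C_eq = np.array([0.5, 1, 2, 5, 10, 20, 50])      # equilibrium conc., mg/L
q_obs = langmuir(C_eq, 35.0, 0.30) * (
    1 + 0.05 * np.random.default_rng(5).standard_normal(C_eq.size))  # noisy uptake

(q_max, K), _ = curve_fit(langmuir, C_eq, q_obs, p0=(30, 0.1))
print(f"q_max={q_max:.1f} mg/g, K={K:.2f} L/mg")
```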

  16. Studying the highly bent spectra of FR II-type radio galaxies with the KDA EXT model

    Science.gov (United States)

    Kuligowska, Elżbieta

    2018-04-01

    Context. The Kaiser, Dennett-Thorpe & Alexander (KDA, 1997, MNRAS, 292, 723) EXT model, an extension of the KDA model of Fanaroff-Riley type II (FR II) source evolution, is applied and confronted with the observational data for selected FR II-type radio sources with significantly aged radio spectra. Aim. A sample of FR II-type radio galaxies with radio spectra strongly bent at the highest frequencies is used to test the usefulness of the KDA EXT model. Methods: The dynamical evolution of FR II-type sources predicted by the KDA EXT model is briefly presented and discussed. The results are then compared to the ones obtained with the classical KDA approach, which assumes continuous injection and self-similar growth of the source. Results: The results and corresponding diagrams obtained for the eight sample sources indicate that the KDA EXT model predicts the observed radio spectra significantly better than the best spectral fit provided by the original KDA model.

  17. Validation of the HDM models for crack initiation and development, rutting and roughness of the pavement

    Directory of Open Access Journals (Sweden)

    Ognjenović Slobodan

    2017-01-01

    Worldwide practice recommends validating the HDM models against other software that can be used to compare the forecasting results. The program package MATLAB is used in this case, as it enables modelling of all the HDM models. A statistical validation of the HDM forecasts of pavement condition against field measurement results was also performed. This paper presents the results of the validation of the calibration coefficients of the deterioration models in HDM-4 on the Macedonian highways.

  18. Natural analogues and radionuclide transport model validation

    International Nuclear Information System (INIS)

    Lever, D.A.

    1987-08-01

    In this paper, some possible roles for natural analogues are discussed from the point of view of those involved with the development of mathematical models for radionuclide transport and with the use of these models in repository safety assessments. The characteristic features of a safety assessment are outlined in order to address the questions of where natural analogues can be used to improve our understanding of the processes involved and where they can assist in validating the models that are used. Natural analogues have the potential to provide useful information about some critical processes, especially long-term chemical processes and migration rates. There is likely to be considerable uncertainty and ambiguity associated with the interpretation of natural analogues, and thus it is their general features which should be emphasized, and models with appropriate levels of sophistication should be used. Experience gained in modelling the Koongarra uranium deposit in northern Australia is drawn upon. (author)

  19. Equilibrium and kinetic studies of Pb(II), Cd(II) and Zn(II) sorption by Lagenaria vulgaris shell

    Directory of Open Access Journals (Sweden)

    Mitić-Stojanović Dragana-Linda

    2012-01-01

    The sorption of lead, cadmium and zinc ions from aqueous solution by Lagenaria vulgaris shell biosorbent (LVB) in a batch system was investigated. The effect of relevant parameters such as contact time, biosorbent dosage and initial metal ion concentration was evaluated. The Pb(II), Cd(II) and Zn(II) sorption equilibrium (when 98% of the initial metal ions were sorbed) was attained within 15, 20 and 25 min, respectively. The pseudo-first order, pseudo-second order, Chrastil's and intra-particle diffusion models were used to describe the kinetic data. The experimental data fitted the pseudo-second order kinetic model and the intra-particle diffusion model. Removal efficiency of lead(II), cadmium(II) and zinc(II) ions rapidly increased with increasing biosorbent dose from 0.5 to 8.0 g dm⁻³. The optimal biosorbent dose was set to 4.0 g dm⁻³. An increase in the initial metal concentration increases the sorption capacity. The sorption data of the investigated metal ions were fitted to the Langmuir, Freundlich and Temkin isotherm models. The Langmuir model best fitted the equilibrium data (r² > 0.99). Maximal sorption capacities of LVB for Pb(II), Cd(II) and Zn(II) at 25.0±0.5°C were 0.130, 0.103 and 0.098 mM g⁻¹, respectively. The desorption experiments showed that the LVB could be reused for six cycles with a minimum loss of the initial sorption capacity.
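    The pseudo-second order model mentioned above is routinely fitted through its linear form t/q_t = 1/(k2*q_e^2) + t/q_e. A sketch on synthetic, noise-free uptake data (the rate constant and capacity are invented):

```python
# Sketch: recover q_e and k2 from a linearized pseudo-second-order fit.
import numpy as np

t = np.array([2, 5, 10, 15, 20, 30, 45, 60], float)   # contact time, minutes
q_e_true, k2_true = 0.13, 0.9                          # mM/g, g/(mM*min)
q_t = q_e_true**2 * k2_true * t / (1 + q_e_true * k2_true * t)  # PSO uptake curve

slope, intercept = np.polyfit(t, t / q_t, 1)           # t/q_t vs t is linear
q_e, k2 = 1 / slope, slope**2 / intercept              # invert the linear form
print(f"q_e={q_e:.3f} mM/g, k2={k2:.2f} g/(mM*min)")
```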

  20. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components, either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of this wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10¹ to 10² image descriptors instead of the 10⁵ or 10⁶ pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
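    A toy illustration of the reduction described above, using a low-order 2-D Fourier decomposition in place of Zernike moments (the strain fields and descriptor count are invented):

```python
# Sketch: reduce two full-field "strain maps" of 10^4 pixels each to a few
# tens of low-frequency Fourier descriptors and compare them directly.
import numpy as np

def descriptors(field, k=6):
    """Return the k x k block of lowest-frequency Fourier magnitudes."""
    return np.abs(np.fft.fft2(field))[:k, :k].ravel()

x, y = np.meshgrid(np.linspace(-1, 1, 100), np.linspace(-1, 1, 100))
strain_exp = np.exp(-3.0 * (x**2 + y**2))      # toy measured strain field
strain_fem = np.exp(-3.2 * (x**2 + y**2))      # toy model-predicted field

d_exp, d_fem = descriptors(strain_exp), descriptors(strain_fem)
print("36 descriptors; max relative difference:",
      np.max(np.abs(d_exp - d_fem) / (np.abs(d_exp) + 1e-12)))
```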

  1. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases : a review and suggested reporting framework.

    Science.gov (United States)

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

    Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including the general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following a presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross-model validity. As part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55% of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6% of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed

  2. Lessons learned from recent geomagnetic disturbance model validation activities

    Science.gov (United States)

    Pulkkinen, A. A.; Welling, D. T.

    2017-12-01

    Due to concerns pertaining to the impact of geomagnetically induced currents on ground-based infrastructure, there has been significantly elevated interest in applying models for local geomagnetic disturbance or "delta-B" predictions. Correspondingly, there has been an elevated need for testing the quality of the delta-B predictions generated by modern empirical and physics-based models. To address this need, community-wide activities were launched under the GEM Challenge framework, and one culmination of these activities was the validation and selection of models that were transitioned into operations at NOAA SWPC. The community-wide delta-B action is continued under the CCMC-facilitated International Forum for Space Weather Capabilities Assessment and its "Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs" working group. The new delta-B working group builds on past experience and expands the collaborations to cover the entire international space weather community. In this paper, we discuss the key lessons learned from the past delta-B validation exercises and lay out the path forward for building on that experience under the new delta-B working group.

  3. High Turbidity Solis Clear Sky Model: Development and Validation

    Directory of Open Access Journals (Sweden)

    Pierre Ineichen

    2018-03-01

    The Solis clear sky model is a spectral scheme based on radiative transfer calculations and the Lambert–Beer relation. Its broadband version is a simplified, fast analytical formulation; it is limited to broadband aerosol optical depths lower than 0.45, which is a weakness when it is applied in countries with very high turbidity such as China or India. In order to extend the use of the original simplified version of the model to high turbidity values, we developed a new version of the broadband Solis model based on radiative transfer calculations, valid for turbidity values up to 7, for the three components (global, beam, and diffuse) and for the four aerosol types defined by Shettle and Fenn. A validation on low-turbidity data acquired in Geneva shows slightly better results than the previous version. On data acquired at sites with higher turbidity, the bias stays within ±4% for the beam and global irradiances, and the standard deviation is around 5% for clean and stable condition data and around 12% for questionable data and variable sky conditions.
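    At the core of such schemes is the Lambert–Beer attenuation of beam irradiance with optical depth and air mass; the full Solis model generalizes this with fitted exponents and separate expressions per component. The sketch below uses only the bare relation with placeholder values, not the published parameterization:

```python
# Sketch of the Lambert-Beer form underlying Solis-type clear-sky models:
# beam irradiance decays exponentially with optical depth times air mass.
import numpy as np

def beam_irradiance(tau, elevation_deg, I0=1361.0):
    m = 1.0 / np.sin(np.radians(elevation_deg))   # simple air-mass proxy
    return I0 * np.exp(-tau * m)

for tau in (0.2, 1.0, 3.0, 7.0):                  # up to very high turbidity
    print(tau, round(beam_irradiance(tau, 40.0), 1), "W/m2")
```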

  4. Validation of an O-18 leaf water enrichment model

    Energy Technology Data Exchange (ETDEWEB)

    Jaeggi, M.; Saurer, M.; Siegwolf, R.

    2002-03-01

    The seasonal trend in δ¹⁸O of leaf organic matter in spruce needles of mature trees could be modelled for two years. The seasonality was mainly explained by the δ¹⁸O of top-soil water, whereas between-year differences were due to variation in air humidity. Application of a third year's data set improved the correlation between modelled and measured δ¹⁸O values and thus validated our extended Dongmann model. (author)

  5. Development and validation of logistic prognostic models by predefined SAS-macros

    Directory of Open Access Journals (Sweden)

    Ziegler, Christoph

    2006-02-01

    In medical decision making about therapies or diagnostic procedures, prognoses of the course or severity of a disease play a relevant role in the treatment of patients. Besides the subjective judgement of the clinician, mathematical models can help in providing such prognoses. Such models are mostly multivariate regression models. In the case of a dichotomous outcome, the logistic model is applied as the standard model. In this paper we describe SAS macros for the development of such a model, for the examination of its prognostic performance, and for model validation. The rationale for this approach to prognostic modelling and the description of the macros can only be given briefly in this paper; much more detail is given in the accompanying reference. These 14 SAS macros are a tool for setting up the whole process of deriving a prognostic model. In particular, the possibility of validating the model with a standardized software tool provides an opportunity that is not generally taken in published prognostic models. This can therefore help in developing new models with good prognostic performance for use in medical applications.

  6. Modeling and validation of existing VAV system components

    Energy Technology Data Exchange (ETDEWEB)

    Nassif, N.; Kajl, S.; Sabourin, R. [Ecole de Technologie Superieure, Montreal, PQ (Canada)

    2004-07-01

    The optimization of supervisory control strategies and local-loop controllers can improve the performance of HVAC (heating, ventilating, air-conditioning) systems. In this study, component models of the fan, the damper and the cooling coil were developed and validated against monitored data from an existing variable air volume (VAV) system installed at Montreal's Ecole de Technologie Superieure. The measured variables that influence energy use in the individual HVAC models included: (1) outdoor and return air temperature and relative humidity, (2) supply air and water temperatures, (3) zone airflow rates, (4) supply duct, fan outlet and mixing plenum static pressures, (5) fan speed, and (6) minimum and principal damper and cooling and heating coil valve positions. The additional variables that were considered but not measured were: (1) fan and outdoor airflow rate, (2) inlet and outlet cooling coil relative humidity, and (3) liquid flow rate through the heating or cooling coils. The paper demonstrates the challenges of the validation process when monitored data from existing VAV systems are used. 7 refs., 11 figs.

  7. Discriminating neutrino mass models using Type-II see-saw formula

    Indian Academy of Sciences (India)

    though a fuller analysis needs the full matrix form when all terms are present. This is followed by the normal hierarchical model (Type [III]) and the inverted hierarchical model with opposite CP phase (Type [IIB]). γ ≃ 10⁻² for both of them. Our main results on neutrino masses and mixings in the Type-II see-saw formula are presented ...

  8. Simultaneous Determination of Cobalt(II) and Nickel(II) by First Order Derivative Spectrophotometry in Micellar Media

    Directory of Open Access Journals (Sweden)

    Rajni Rohilla

    2012-01-01

    A first-derivative spectrophotometric method for the simultaneous determination of Co(II) and Ni(II) with Alizarin Red S in the presence of Triton X-100 is described. Measurements were made at the zero-crossing wavelengths: 549.0 nm for Co(II) and 546.0 nm for Ni(II). Linearity is obtained in the ranges 0.291-4.676 μg/ml of Ni(II) and 0.293-4.124 μg/ml of Co(II), each in the presence of the other, using the first-derivative spectrophotometric method. The possible interfering effects of various ions were studied. The validity of the method was examined using synthetic mixtures of Co(II) and Ni(II). The developed derivative procedure, using the zero-crossing technique, has been successfully applied for the simultaneous analysis of Co(II) and Ni(II) in spiked water samples.
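    The zero-crossing principle can be sketched numerically: at a wavelength where one analyte's first derivative is zero (its band centre, for a symmetric band), the derivative of the mixture spectrum depends on the other analyte alone. The band positions, widths and concentrations below are invented for illustration:

```python
# Sketch of zero-crossing first-derivative spectrophotometry on toy spectra.
import numpy as np

wl = np.linspace(500, 620, 601)                    # wavelength grid, nm
band = lambda c, w: np.exp(-((wl - c) / w) ** 2)   # toy Gaussian absorption band
co, ni = band(560, 18), band(578, 18)              # unit-concentration spectra

mix = 0.7 * co + 1.3 * ni                          # "unknown" mixture spectrum
d_mix = np.gradient(mix, wl)                       # first-derivative spectrum

# Ni's derivative crosses zero at its band centre (578 nm), so the mixture's
# derivative there is proportional to the Co concentration alone.
i = np.argmin(np.abs(wl - 578))
cal = np.gradient(co, wl)[i]                       # slope per unit Co concentration
print("recovered Co conc.:", d_mix[i] / cal)       # ~0.7
```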

  9. Effects of Task Performance and Task Complexity on the Validity of Computational Models of Attention

    NARCIS (Netherlands)

    Koning, L. de; Maanen, P.P. van; Dongen, K. van

    2008-01-01

    Computational models of attention can be used as a component of decision support systems. For accurate support, a computational model of attention has to be valid and robust. The effects of task performance and task complexity on the validity of three different computational models of attention were

  10. Competitive adsorption of copper(II), cadmium(II), lead(II) and zinc(II) onto basic oxygen furnace slag

    International Nuclear Information System (INIS)

    Xue Yongjie; Hou Haobo; Zhu Shujing

    2009-01-01

    Polluted and contaminated water can often contain more than one heavy metal species. It is possible that the behavior of a particular metal species in a solution system will be affected by the presence of other metals. In this study, we have investigated the adsorption of Cd(II), Cu(II), Pb(II), and Zn(II) onto basic oxygen furnace slag (BOF slag) in single- and multi-element solution systems as a function of pH and concentration, in a background solution of 0.01 M NaNO₃. In adsorption edge experiments, the pH was varied from 2.0 to 13.0 with total metal concentration 0.84 mM in the single-element system and 0.21 mM each of Cd(II), Cu(II), Pb(II), and Zn(II) in the multi-element system. The value of pH₅₀ (the pH at which 50% adsorption occurs) was found to follow the sequence Zn > Cu > Pb > Cd in single-element systems, but Pb > Cu > Zn > Cd in the multi-element system. Adsorption isotherms at pH 6.0 in the multi-element systems showed that there is competition among various metals for adsorption sites on BOF slag. The adsorption and potentiometric titration data for various slag-metal systems were modeled using an extended constant-capacitance surface complexation model that assumed an ion-exchange process below pH 6.5 and the formation of inner-sphere surface complexes at higher pH. Inner-sphere complexation was more dominant for the Cu(II), Pb(II) and Zn(II) systems.

  11. Competitive adsorption of copper(II), cadmium(II), lead(II) and zinc(II) onto basic oxygen furnace slag

    Energy Technology Data Exchange (ETDEWEB)

    Xue Yongjie [School of Resource and Environment Science, Wuhan University, Hubei, Wuhan (China); Wuhan Kaidi Electric Power Environmental Protection Co. Ltd., Hubei, Wuhan (China)], E-mail: xueyj@mail.whut.edu.cn; Hou Haobo; Zhu Shujing [School of Resource and Environment Science, Wuhan University, Hubei, Wuhan (China)

    2009-02-15

    Polluted and contaminated water can often contain more than one heavy metal species. It is possible that the behavior of a particular metal species in a solution system will be affected by the presence of other metals. In this study, we have investigated the adsorption of Cd(II), Cu(II), Pb(II), and Zn(II) onto basic oxygen furnace slag (BOF slag) in single- and multi-element solution systems as a function of pH and concentration, in a background solution of 0.01 M NaNO₃. In adsorption edge experiments, the pH was varied from 2.0 to 13.0 with total metal concentration 0.84 mM in the single element system and 0.21 mM each of Cd(II), Cu(II), Pb(II), and Zn(II) in the multi-element system. The value of pH₅₀ (the pH at which 50% adsorption occurs) was found to follow the sequence Zn > Cu > Pb > Cd in single-element systems, but Pb > Cu > Zn > Cd in the multi-element system. Adsorption isotherms at pH 6.0 in the multi-element systems showed that there is competition among various metals for adsorption sites on BOF slag. The adsorption and potentiometric titrations data for various slag-metal systems were modeled using an extended constant-capacitance surface complexation model that assumed an ion-exchange process below pH 6.5 and the formation of inner-sphere surface complexes at higher pH. Inner-sphere complexation was more dominant for the Cu(II), Pb(II) and Zn(II) systems.

  12. Competitive adsorption of copper(II), cadmium(II), lead(II) and zinc(II) onto basic oxygen furnace slag.

    Science.gov (United States)

    Xue, Yongjie; Hou, Haobo; Zhu, Shujing

    2009-02-15

    Polluted and contaminated water can often contain more than one heavy metal species. It is possible that the behavior of a particular metal species in a solution system will be affected by the presence of other metals. In this study, we have investigated the adsorption of Cd(II), Cu(II), Pb(II), and Zn(II) onto basic oxygen furnace slag (BOF slag) in single- and multi-element solution systems as a function of pH and concentration, in a background solution of 0.01 M NaNO₃. In adsorption edge experiments, the pH was varied from 2.0 to 13.0 with total metal concentration 0.84 mM in the single element system and 0.21 mM each of Cd(II), Cu(II), Pb(II), and Zn(II) in the multi-element system. The value of pH₅₀ (the pH at which 50% adsorption occurs) was found to follow the sequence Zn > Cu > Pb > Cd in single-element systems, but Pb > Cu > Zn > Cd in the multi-element system. Adsorption isotherms at pH 6.0 in the multi-element systems showed that there is competition among various metals for adsorption sites on BOF slag. The adsorption and potentiometric titrations data for various slag-metal systems were modeled using an extended constant-capacitance surface complexation model that assumed an ion-exchange process below pH 6.5 and the formation of inner-sphere surface complexes at higher pH. Inner-sphere complexation was more dominant for the Cu(II), Pb(II) and Zn(II) systems.
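
    The three records above fit isotherms to equilibrium adsorption data. A minimal sketch of how such single-metal isotherm parameters can be estimated with scipy; the equilibrium concentrations and uptakes below are hypothetical, not taken from the paper.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical equilibrium data: Ce (mmol/L) vs uptake qe (mmol/g)
        Ce = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])
        qe = np.array([0.09, 0.16, 0.25, 0.35, 0.44, 0.50])

        def langmuir(Ce, qmax, KL):
            # Monolayer adsorption on a finite number of identical sites
            return qmax * KL * Ce / (1.0 + KL * Ce)

        def freundlich(Ce, KF, n):
            # Empirical power law for heterogeneous surfaces
            return KF * Ce ** (1.0 / n)

        (qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[0.6, 5.0])
        (KF, n), _ = curve_fit(freundlich, Ce, qe, p0=[0.5, 2.0])
        print(f"Langmuir:   qmax = {qmax:.3f} mmol/g, KL = {KL:.2f} L/mmol")
        print(f"Freundlich: KF = {KF:.3f}, n = {n:.2f}")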

  13. Temporal validation for landsat-based volume estimation model

    Science.gov (United States)

    Renaldo J. Arroyo; Emily B. Schultz; Thomas G. Matney; David L. Evans; Zhaofei Fan

    2015-01-01

    Satellite imagery can potentially reduce the costs and time associated with ground-based forest inventories; however, for satellite imagery to provide reliable forest inventory data, it must produce consistent results from one time period to the next. The objective of this study was to temporally validate a Landsat-based volume estimation model in a four county study...

  14. Calcium-manganese oxides as structural and functional models for active site in oxygen evolving complex in photosystem II: lessons from simple models.

    Science.gov (United States)

    Najafpour, Mohammad Mahdi

    2011-01-01

    The oxygen evolving complex in photosystem II which induces the oxidation of water to dioxygen in plants, algae and certain bacteria contains a cluster of one calcium and four manganese ions. It serves as a model to split water by sunlight. Reports on the mechanism and structure of photosystem II provide a more detailed architecture of the oxygen evolving complex and the surrounding amino acids. One challenge in this field is the development of artificial model compounds to study oxygen evolution reaction outside the complicated environment of the enzyme. Calcium-manganese oxides as structural and functional models for the active site of photosystem II are explained and reviewed in this paper. Because of related structures of these calcium-manganese oxides and the catalytic centers of active site of the oxygen evolving complex of photosystem II, the study may help to understand more about mechanism of oxygen evolution by the oxygen evolving complex of photosystem II. Copyright © 2010 Elsevier B.V. All rights reserved.

  15. Computer augmented modelling studies of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) complexes of L-glutamic acid in 1,2-propanediol–water mixtures

    Directory of Open Access Journals (Sweden)

    MAHESWARA RAO VEGI

    2008-12-01

    Full Text Available Chemical speciation of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) complexes of L-glutamic acid was studied at 303 K in 0-60 vol. % 1,2-propanediol–water mixtures, whereby the ionic strength was maintained at 0.16 mol dm⁻³. The active forms of the ligand are LH3+, LH2 and LH–. The predominant detected species were ML, ML2, MLH, ML2H and ML2H2. The trend of the variation in the stability constants with changing dielectric constant of the medium is explained based on the cation-stabilizing nature of the co-solvents, specific solvent–water interactions, charge dispersion and specific interactions of the co-solvent with the solute. The effect of systematic errors in the concentrations of the substances on the stability constants is in the order alkali >> acid > ligand > metal. The bioavailability and transportation of metals are explained based on distribution diagrams and stability constants.

  16. Validation of PV-RPM Code in the System Advisor Model.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Geoffrey Taylor [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
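
    A hedged sketch of the kind of probabilistic failure/repair simulation such a model performs: component lifetimes are drawn from a failure distribution and repair durations from a repair distribution over the simulation period. The Weibull/exponential choices and all parameter values are illustrative assumptions, not the distributions actually used by PV-RPM.

        import numpy as np

        rng = np.random.default_rng(42)
        years = 25
        n_inverters = 10
        mttf = 8.0    # hypothetical Weibull scale for time to failure, years
        mttr = 0.1    # hypothetical mean time to repair, years

        failures = 0
        downtime = 0.0
        for _ in range(n_inverters):
            t = 0.0
            while t < years:
                t += rng.weibull(1.5) * mttf      # draw time to next failure
                if t >= years:
                    break
                repair = rng.exponential(mttr)    # draw repair duration
                failures += 1
                downtime += min(repair, years - t)
                t += repair
        print(f"{failures} failures, {downtime:.2f} inverter-years of downtime")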

  17. Deployable and Conformal Planar Micro-Devices: Design and Model Validation

    Directory of Open Access Journals (Sweden)

    Jinda Zhuang

    2014-08-01

    Full Text Available We report a design concept for a deployable planar microdevice and the modeling and experimental validation of its mechanical behavior. The device consists of foldable membranes that are suspended between flexible stems and actuated by push-pull wires. Such a deployable device can be introduced into a region of interest in its compact “collapsed” state and then deployed to conformally cover a large two-dimensional surface area for minimally invasive biomedical operations and other engineering applications. We develop and experimentally validate theoretical models based on the energy minimization approach to examine the conformality and figures of merit of the device. The experimental results obtained using model contact surfaces agree well with the prediction and quantitatively highlight the importance of the membrane bending modulus in controlling surface conformality. The present study establishes an early foundation for the mechanical design of this and related deployable planar microdevice concepts.

  18. Modeling and validating HL7 FHIR profiles using semantic web Shape Expressions (ShEx).

    Science.gov (United States)

    Solbrig, Harold R; Prud'hommeaux, Eric; Grieve, Grahame; McKenzie, Lloyd; Mandel, Joshua C; Sharma, Deepak K; Jiang, Guoqian

    2017-03-01

    HL7 Fast Healthcare Interoperability Resources (FHIR) is an emerging open standard for the exchange of electronic healthcare information. FHIR resources are defined in a specialized modeling language. FHIR instances can currently be represented in either XML or JSON. The FHIR and Semantic Web communities are developing a third FHIR instance representation format in Resource Description Framework (RDF). Shape Expressions (ShEx), a formal RDF data constraint language, is a candidate for describing and validating the FHIR RDF representation. Create a FHIR to ShEx model transformation and assess its ability to describe and validate FHIR RDF data. We created the methods and tools that generate the ShEx schemas modeling the FHIR to RDF specification being developed by HL7 ITS/W3C RDF Task Force, and evaluated the applicability of ShEx in the description and validation of FHIR to RDF transformations. The ShEx models contributed significantly to workgroup consensus. Algorithmic transformations from the FHIR model to ShEx schemas and FHIR example data to RDF transformations were incorporated into the FHIR build process. ShEx schemas representing 109 FHIR resources were used to validate 511 FHIR RDF data examples from the Standards for Trial Use (STU 3) Ballot version. We were able to uncover unresolved issues in the FHIR to RDF specification and detect 10 types of errors and root causes in the actual implementation. The FHIR ShEx representations have been included in the official FHIR web pages for the STU 3 Ballot version since September 2016. ShEx can be used to define and validate the syntax of a FHIR resource, which is complementary to the use of RDF Schema (RDFS) and Web Ontology Language (OWL) for semantic validation. ShEx proved useful for describing a standard model of FHIR RDF data. The combination of a formal model and a succinct format enabled comprehensive review and automated validation. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  20. Technical Note: Calibration and validation of geophysical observation models

    NARCIS (Netherlands)

    Salama, M.S.; van der Velde, R.; van der Woerd, H.J.; Kromkamp, J.C.; Philippart, C.J.M.; Joseph, A.T.; O'Neill, P.E.; Lang, R.H.; Gish, T.; Werdell, P.J.; Su, Z.

    2012-01-01

    We present a method to calibrate and validate observational models that interrelate remotely sensed energy fluxes to geophysical variables of land and water surfaces. Coincident sets of remote sensing observation of visible and microwave radiations and geophysical data are assembled and subdivided

  1. System modelling to support accelerated fuel transfer rate at EBR-II

    International Nuclear Information System (INIS)

    Imel, G.R.; Houshyar, A.; Planchon, H.P.; Cutforth, D.C.

    1995-01-01

    The Experimental Breeder Reactor-II (EBR-II) is a 62.5 MW(th) liquid metal reactor operated by Argonne National Laboratory for the United States Department of Energy. The reactor is located near Idaho Falls, Idaho, at the Argonne-West site (ANL-W). Full power operation was achieved in 1964, and the reactor operated continuously from that time until October 1994 in a variety of configurations depending on the programmatic mission. A three-year program was initiated in October 1993 to replace the 330 depleted uranium blanket subassemblies (S/As) with stainless steel reflectors. It was intended to operate the reactor during the three-year blanket unloading program, followed by about half a year of driver fuel unloading. However, in the summer of 1994, Congress dictated that EBR-II be shut down on October 1 and completely defueled without further operation. To assist in planning the resources needed for this defueling campaign, a mathematical model of the fuel handling sequence was developed utilizing the appropriate reliability factors and inherent constraints of each stage of the process. The model allows predictions of transfer rates under different scenarios. Additionally, it has facilitated planning of maintenance activities, as well as optimization of resources regarding manpower and modification effort. The model and its application are described in this paper.

  2. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    Science.gov (United States)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of the program and schedule constraints, a single modal test of the SLS will be performed while bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics that is based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ and UP and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules-of-thumb;" this research seeks to come up with more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so therefore that same uncertainty can be used in propagating the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
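
    A minimal sketch of the limit-state idea on the lumped spring-mass example mentioned in the abstract: dispersions in model properties are propagated by Monte Carlo to a natural frequency, and a limit-state function g compares it with a requirement. The dispersion values and the 15 Hz requirement are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        k = rng.normal(1.0e6, 5.0e4, n)     # spring stiffness, N/m (assumed dispersion)
        m = rng.normal(100.0, 2.0, n)       # mass, kg (assumed dispersion)

        fn = np.sqrt(k / m) / (2 * np.pi)   # natural frequency of the 1-DOF system, Hz

        f_req = 15.0                        # hypothetical requirement: fn must exceed 15 Hz
        g = fn - f_req                      # limit-state function: failure when g < 0
        print("P(requirement violated) =", np.mean(g < 0))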

  3. Heat Transfer Modeling and Validation for Optically Thick Alumina Fibrous Insulation

    Science.gov (United States)

    Daryabeigi, Kamran

    2009-01-01

    Combined radiation/conduction heat transfer through unbonded alumina fibrous insulation was modeled using the diffusion approximation for the radiation component of heat transfer in the optically thick insulation. The validity of the heat transfer model was investigated by comparison to previously reported experimental effective thermal conductivity data over the insulation density range of 24 to 96 kg/cu m, a pressure range of 0.001 to 750 torr (0.1 to 101.3 × 10³ Pa), and a test-sample hot-side temperature range of 530 to 1360 K. The model was further validated by comparison to thermal conductivity measurements using the transient step heating technique on an insulation sample at a density of 144 kg/cu m over a pressure range of 0.001 to 760 torr and a temperature range of 290 to 1090 K.
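
    In the optically thick limit, the diffusion approximation lets radiation be treated as an additional conductivity, commonly via the Rosseland form k_rad = 16·n²·σ·T³/(3·β_R), which simply adds to the conductive component. A sketch under that assumption follows; the extinction coefficient and conduction values are hypothetical, not those of the paper.

        import numpy as np

        SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

        def k_radiative(T, beta_R, n=1.0):
            # Rosseland diffusion approximation for an optically thick medium
            return 16.0 * n**2 * SIGMA * T**3 / (3.0 * beta_R)

        def k_effective(T, k_conduction, beta_R):
            # Radiation adds like a conductivity in the diffusion limit
            return k_conduction + k_radiative(T, beta_R)

        # Assumed values: beta_R ~ 5000 1/m and k_conduction ~ 0.04 W/(m K)
        for T in (300.0, 800.0, 1300.0):
            print(f"T = {T:6.0f} K -> k_eff = {k_effective(T, 0.04, 5000.0):.4f} W/(m K)")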

  4. Reference methodologies for radioactive controlled discharges an activity within the IAEA's Program Environmental Modelling for Radiation Safety II (EMRAS II)

    International Nuclear Information System (INIS)

    Stocki, T.J.; Bergman, L.; Tellería, D.M.; Proehl, G.; Amado, V.; Curti, A.; Bonchuk, I.; Boyer, P.; Mourlon, C.; Chyly, P.; Heling, R.; Sági, L.; Kliaus, V.; Krajewski, P.; Latouche, G.; Lauria, D.C.; Newsome, L.; Smith, J.

    2011-01-01

    In January 2009, the IAEA EMRAS II (Environmental Modelling for Radiation Safety II) program was launched. The goal of the program is to develop, compare and test models for the assessment of radiological impacts to the public and the environment due to radionuclides being released or already existing in the environment; to help countries build and harmonize their capabilities; and to model the movement of radionuclides in the environment. Within EMRAS II, nine working groups are active; this paper will focus on the activities of Working Group 1: Reference Methodologies for Controlling Discharges of Routine Releases. Within this working group environmental transfer and dose assessment models are tested under different scenarios by participating countries and the results compared. This process allows each participating country to identify characteristics of their models that need to be refined. The goal of this working group is to identify reference methodologies for the assessment of exposures to the public due to routine discharges of radionuclides to the terrestrial and aquatic environments. Several different models are being applied to estimate the transfer of radionuclides in the environment for various scenarios. The first phase of the project involves a scenario of nuclear power reactor with a coastal location which routinely (continuously) discharges 60Co, 85Kr, 131I, and 137Cs to the atmosphere and 60Co, 137Cs, and 90Sr to the marine environment. In this scenario many of the parameters and characteristics of the representative group were given to the modelers and cannot be altered. Various models have been used by the different participants in this inter-comparison (PC-CREAM, CROM, IMPACT, CLRP POSEIDON, SYMBIOSE and others). This first scenario is to enable a comparison of the radionuclide transport and dose modelling. These scenarios will facilitate the development of reference methodologies for controlled discharges. (authors)

  5. Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.

    Science.gov (United States)

    Woodward, S J

    2001-09-01

    The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.

  6. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…

  7. Modeling and optimization by particle swarm embedded neural network for adsorption of zinc (II) by palm kernel shell based activated carbon from aqueous environment.

    Science.gov (United States)

    Karri, Rama Rao; Sahu, J N

    2018-01-15

    Zn(II) is one of the common pollutants among the heavy metals found in industrial effluents. Removal of pollutants from industrial effluents can be accomplished by various techniques, of which adsorption has been found to be an efficient method. The application of adsorption is limited by the high cost of adsorbents. In this regard, a low-cost adsorbent produced from palm oil kernel shell based agricultural waste is examined for its efficiency to remove Zn(II) from waste water and aqueous solution. The influence of independent process variables such as initial concentration, pH, residence time, activated carbon (AC) dosage and process temperature on the removal of Zn(II) by palm kernel shell based AC in a batch adsorption process is studied systematically. Based on the design of the experimental matrix, 50 experimental runs are performed with each process variable within the experimental range. The optimal values of the process variables to achieve maximum removal efficiency are studied using response surface methodology (RSM) and artificial neural network (ANN) approaches. A quadratic model, consisting of first-order and second-order regression terms, is developed using analysis of variance and the RSM-CCD framework. Particle swarm optimization, a meta-heuristic technique, is embedded in the ANN architecture to optimize the search space of the neural network. The optimized trained neural network depicts the testing and validation data well, with R² equal to 0.9106 and 0.9279, respectively. The outcomes indicate the superiority of the ANN-PSO based model predictions over the quadratic model predictions provided by RSM. Copyright © 2017 Elsevier Ltd. All rights reserved.
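
    A compact sketch of the PSO-embedded network idea: a global-best particle swarm searches the weight space of a small feed-forward network instead of gradient descent. The data, network size and PSO coefficients below are all illustrative assumptions, not the paper's configuration.

        import numpy as np

        rng = np.random.default_rng(7)

        # Hypothetical data: 5 process variables -> removal efficiency (%)
        X = rng.uniform(0, 1, size=(50, 5))
        y = 60 + 30 * X[:, 0] - 20 * X[:, 1] + rng.normal(0, 2, 50)

        # Parameter vector of a 5-4-1 network: W1 (5x4), b1 (4), W2 (4), b2 (1)
        dim = 5 * 4 + 4 + 4 + 1

        def predict(w, X):
            W1, b1 = w[:20].reshape(5, 4), w[20:24]
            W2, b2 = w[24:28], w[28]
            return np.tanh(X @ W1 + b1) @ W2 + b2

        def mse(w):
            return np.mean((predict(w, X) - y) ** 2)

        # Minimal global-best PSO over the network weights
        n_particles, iters = 30, 300
        pos = rng.normal(0, 1, (n_particles, dim))
        vel = np.zeros_like(pos)
        pbest, pbest_f = pos.copy(), np.array([mse(p) for p in pos])
        gbest = pbest[pbest_f.argmin()].copy()

        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos += vel
            f = np.array([mse(p) for p in pos])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = pos[improved], f[improved]
            gbest = pbest[pbest_f.argmin()].copy()

        print("training MSE:", mse(gbest))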

  8. When is the Anelastic Approximation a Valid Model for Compressible Convection?

    Science.gov (United States)

    Alboussiere, T.; Curbelo, J.; Labrosse, S.; Ricard, Y. R.; Dubuffet, F.

    2017-12-01

    Compressible convection is ubiquitous in large natural systems such as planetary atmospheres and stellar and planetary interiors. Its modelling is notoriously more difficult than the case when the Boussinesq approximation applies. One reason for that difficulty was put forward by Ogura and Phillips (1961): the compressible equations generate sound waves with very short time scales which need to be resolved. This is why they introduced an anelastic model, based on an expansion of the solution around an isentropic hydrostatic profile. How accurate is that anelastic model? What are the conditions for its validity? To answer these questions, we have developed a numerical model for the full set of compressible equations and compared its solutions with those of the corresponding anelastic model. We considered a simple rectangular 2D Rayleigh-Bénard configuration and decided to restrict the analysis to infinite Prandtl numbers. This choice is valid for convection in the mantles of rocky planets, but more importantly it leads to a zero Mach number, removing the question of the interference of acoustic waves with convection. In that simplified context, we used the entropy balances (that of the full set of equations and that of the anelastic model) to investigate the differences between exact and anelastic solutions. We found that the validity of the anelastic model is dictated by two conditions: first, the superadiabatic temperature difference must be small compared with the adiabatic temperature difference (as expected), ε = ΔT_SA / ΔT_a << 1; and secondly, the product of ε with the Nusselt number must be small.

  9. The customization of APACHE II for patients receiving orthotopic liver transplants

    Science.gov (United States)

    Moreno, Rui

    2002-01-01

    General outcome prediction models developed for use with large, multicenter databases of critically ill patients may not correctly estimate mortality if applied to a particular group of patients that was under-represented in the original database. The development of new diagnostic weights has been proposed as a method of adapting the general model – the Acute Physiology and Chronic Health Evaluation (APACHE) II in this case – to a new group of patients. Such customization must be empirically tested, because the original model cannot contain an appropriate set of predictive variables for the particular group. In this issue of Critical Care, Arabi and co-workers present the results of the validation of a modified model of the APACHE II system for patients receiving orthotopic liver transplants. The use of a highly heterogeneous database for which not all important variables were taken into account and of a sample too small to use the Hosmer–Lemeshow goodness-of-fit test appropriately makes their conclusions uncertain. PMID:12133174
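
    Since the abstract turns on whether the sample size supports the Hosmer-Lemeshow goodness-of-fit test, here is a minimal sketch of that statistic: observed versus expected events over deciles of predicted risk, compared against a chi-square with (groups - 2) degrees of freedom. The simulated risks below are stand-ins for APACHE II predictions.

        import numpy as np
        from scipy.stats import chi2

        def hosmer_lemeshow(y, p, groups=10):
            order = np.argsort(p)
            y, p = y[order], p[order]
            bins = np.array_split(np.arange(len(y)), groups)   # deciles of risk
            h = 0.0
            for idx in bins:
                obs = y[idx].sum()          # observed events in the group
                exp = p[idx].sum()          # expected events in the group
                n = len(idx)
                h += (obs - exp) ** 2 / (exp * (1 - exp / n))
            return h, chi2.sf(h, groups - 2)

        rng = np.random.default_rng(3)
        p = rng.uniform(0.05, 0.9, 400)     # hypothetical predicted mortality risks
        y = rng.binomial(1, p)              # simulated outcomes
        print(hosmer_lemeshow(y, p))        # (statistic, p-value)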

  10. Qualitative Validation of the IMM Model for ISS and STS Programs

    Science.gov (United States)

    Kerstman, E.; Walton, M.; Reyes, D.; Boley, L.; Saile, L.; Young, M.; Arellano, J.; Garcia, Y.; Myers, J. G.

    2016-01-01

    To validate and further improve the Integrated Medical Model (IMM), medical event data were obtained from 32 ISS and 122 STS person-missions. Using the crew characteristics from these observed missions, IMM v4.0 was used to forecast medical events and medical resource utilization. The IMM medical condition incidence values were compared to the actual observed medical event incidence values, and the IMM forecasted medical resource utilization was compared to actual observed medical resource utilization. Qualitative comparisons of these parameters were conducted for both the ISS and STS programs. The results of these analyses will provide validation of IMM v4.0 and reveal areas of the model requiring adjustments to improve the overall accuracy of IMM outputs. This validation effort should result in enhanced credibility of the IMM and improved confidence in the use of IMM as a decision support tool for human space flight.

  11. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    Science.gov (United States)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  12. Multiphysics modelling and experimental validation of high concentration photovoltaic modules

    International Nuclear Information System (INIS)

    Theristis, Marios; Fernández, Eduardo F.; Sumner, Mike; O'Donovan, Tadhg S.

    2017-01-01

    Highlights: • A multiphysics modelling approach for concentrating photovoltaics was developed. • An experimental campaign was conducted to validate the models. • The experimental results were in good agreement with the models. • The multiphysics modelling allows the concentrator’s optimisation. - Abstract: High concentration photovoltaics, equipped with high efficiency multijunction solar cells, have great potential in achieving cost-effective and clean electricity generation at utility scale. Such systems are more complex compared to conventional photovoltaics because of the multiphysics effect that is present. Modelling the power output of such systems is therefore crucial for their further market penetration. Following this line, a multiphysics modelling procedure for high concentration photovoltaics is presented in this work. It combines an open source spectral model, a single diode electrical model and a three-dimensional finite element thermal model. In order to validate the models and the multiphysics modelling procedure against actual data, an outdoor experimental campaign was conducted in Albuquerque, New Mexico using a high concentration photovoltaic monomodule that is thoroughly described in terms of its geometry and materials. The experimental results were in good agreement (within 2.7%) with the predicted maximum power point. This multiphysics approach is relatively more complex when compared to empirical models, but besides the overall performance prediction it can also provide better understanding of the physics involved in the conversion of solar irradiance into electricity. It can therefore be used for the design and optimisation of high concentration photovoltaic modules.

  13. Combining satellite radar altimetry, SAR surface soil moisture and GRACE total storage changes for model calibration and validation in a large ungauged catchment

    DEFF Research Database (Denmark)

    Milzow, Christian; Krogh, Pernille Engelbredt; Bauer-Gottwein, Peter

    2010-01-01

    The availability of data is a major challenge for hydrological modelling in large parts of the world. Remote sensing data can be exploited to improve models of ungauged or poorly gauged catchments. In this study we combine three datasets for calibration and validation of a rainfall-runoff model of the ungauged Okavango catchment in Southern Africa: (i) Surface soil moisture (SSM) estimates derived from SAR measurements onboard the Envisat satellite; (ii) Radar altimetry measurements by Envisat providing river stages in the tributaries of the Okavango catchment, down to a minimum width of about one hundred meters; and (iii) Temporal changes of the Earth’s gravity field recorded by the Gravity Recovery and Climate Experiment (GRACE) caused by total water storage changes in the catchment. The SSM data are compared to simulated moisture conditions in the top soil layer. They cannot be used for model

  14. A Decision Support System (GesCoN) for Managing Fertigation in Vegetable Crops. Part II – Model calibration and validation under different environmental growing conditions on field grown tomato

    Directory of Open Access Journals (Sweden)

    Giulia eConversa

    2015-07-01

    Full Text Available The GesCoN model was evaluated for its capability to simulate growth, nitrogen uptake and productivity of open field tomato grown under different environmental and cultural conditions. Five datasets collected from experimental trials carried out in Foggia (IT) were used for calibration, and 13 datasets collected from trials conducted in Foggia, Perugia (IT) and Florida (USA) were used for validation. Goodness of fit was assessed by comparing the observed and simulated shoot dry weight (SDW) and N crop uptake during the crop seasons, and the total dry weight (TDW), N uptake and total fresh yield (TFY). In the SDW model calibration, the relative RMSE values fell within the good 10 to 15% range and the percent BIAS (PBIAS) ranged between -11.5% and 7.4%. The Nash-Sutcliffe efficiency (NSE) was very close to the optimal value of 1. In the N uptake calibration, RRMSE and PBIAS were very low (7% and -1.78, respectively) and NSE was close to 1. The validation of SDW (RRMSE = 16.7%; NSE = 0.96) and N uptake (RRMSE = 16.8%; NSE = 0.96) showed the good accuracy of GesCoN. The model under- or overestimated SDW and N uptake when higher or lower N rates and/or a more or less efficient system were used compared with the calibration trial. The in-season adjustment, using the SDWcheck procedure, greatly improved model simulations in both the calibration and the validation phases. The TFY prediction was quite good except in Florida, where a large overestimation (+16%) was linked to a different harvest index (0.53) compared with the cultivars used for model calibration and validation in the Italian areas. The soil water content at the 10-30 cm depth appears to be well simulated by the software, and GesCoN proved able to adaptively control potential yield and DW accumulation under limited soil N availability scenarios and consequently to modify fertilizer application. The DSS well simulates SDW accumulation and N uptake of different tomato genotypes grown under Mediterranean and subtropical

  15. Competition from Cu(II), Zn(II) and Cd(II) in Pb(II) binding to Suwannee River Fulvic Acid

    NARCIS (Netherlands)

    Chakraborty, P.; Chakrabarti, C.L.

    2008-01-01

    This is a study of trace metal competition in the complexation of Pb(II) by well-characterized humic substances, namely Suwannee River Fulvic Acid (SRFA) in model solutions. It was found that Cu(II) seems to compete with Pb(II) for strong binding sites of SRFA when present at the same concentration

  16. The effect of serum angiotensin II and angiotensin II type 1 receptor ...

    African Journals Online (AJOL)

    Ehab

    2012-06-18

    Jun 18, 2012 ... case-control cross sectional study which included 24 patients with pLN ..... significantly high levels (1000-fold) of Ang II .... initial validation of the Systemic Lupus International ... Fyhrquist F, Metsärinne K, Tikkanen I. Role of.

  17. DMFC performance and methanol cross-over: Experimental analysis and model validation

    Energy Technology Data Exchange (ETDEWEB)

    Casalegno, A.; Marchesi, R. [Dipartimento di Energia, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)

    2008-10-15

    A combined experimental and modelling approach is proposed to analyze methanol cross-over and its effect on DMFC performance. The experimental analysis is performed in order to allow an accurate investigation of methanol cross-over influence on DMFC performance, hence measurements were characterized in terms of uncertainty and reproducibility. The findings suggest that methanol cross-over is mainly determined by diffusion transport and affects cell performance partly via methanol electro-oxidation at the cathode. The modelling analysis is carried out to further investigate methanol cross-over phenomenon. A simple model evaluates the effectiveness of two proposed interpretations regarding methanol cross-over and its effects. The model is validated using the experimental data gathered. Both the experimental analysis and the proposed and validated model allow a substantial step forward in the understanding of the main phenomena associated with methanol cross-over. The findings confirm the possibility to reduce methanol cross-over by optimizing anode feeding. (author)

  18. Modelling and validation of electromechanical shock absorbers

    Science.gov (United States)

    Tonoli, Andrea; Amati, Nicola; Girardello Detoni, Joaquim; Galluzzi, Renato; Gasparin, Enrico

    2013-08-01

    Electromechanical vehicle suspension systems represent a promising substitute to conventional hydraulic solutions. However, the design of electromechanical devices that are able to supply high damping forces without exceeding geometric dimension and mass constraints is a difficult task. All these challenges meet in off-road vehicle suspension systems, where the power density of the dampers is a crucial parameter. In this context, the present paper outlines a particular shock absorber configuration where a suitable electric machine and a transmission mechanism are utilised to meet off-road vehicle requirements. A dynamic model is used to represent the device. Subsequently, experimental tests are performed on an actual prototype to verify the functionality of the damper and validate the proposed model.

  19. Modelling of PEM Fuel Cell Performance: Steady-State and Dynamic Experimental Validation

    Directory of Open Access Journals (Sweden)

    Idoia San Martín

    2014-02-01

    Full Text Available This paper reports on the modelling of a commercial 1.2 kW proton exchange membrane fuel cell (PEMFC), based on interrelated electrical and thermal models. The electrical model proposed is based on the integration of the thermodynamic and electrochemical phenomena taking place in the FC, whilst the thermal model is established from the FC thermal energy balance. The combination of both models makes it possible to predict the FC voltage, based on the current demanded and the ambient temperature. Furthermore, an experimental characterization is conducted and the parameters for the models associated with the FC electrical and thermal performance are obtained. The models are implemented in Matlab Simulink and validated in a number of operating environments, for steady-state and dynamic modes alike. In turn, the FC models are validated in an actual microgrid operating environment, through the series connection of four PEMFCs. The simulations of the models precisely and accurately reproduce the FC electrical and thermal performance.
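
    A hedged sketch of a static PEMFC electrical model of the kind described: the Nernst voltage minus activation (Tafel), ohmic and concentration overpotentials. The parameter values are generic textbook-style assumptions, not the identified parameters of the 1.2 kW stack.

        import numpy as np

        def cell_voltage(i, E_nernst=1.1, i0=1e-4, A=0.06, r=0.2, i_L=1.4, B=0.05):
            # i: current density (A/cm^2); all parameter values are illustrative
            activation = A * np.log(i / i0)            # Tafel activation loss
            ohmic = r * i                              # area-specific resistance, ohm cm^2
            concentration = -B * np.log(1.0 - i / i_L) # mass-transport loss near i_L
            return E_nernst - activation - ohmic - concentration

        for i in (0.1, 0.4, 0.8, 1.2):
            print(f"i = {i:.1f} A/cm^2 -> V = {cell_voltage(i):.3f} V")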

  20. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the customer and the core data suppliers. (author)

  1. Cross-Validation of Aerobic Capacity Prediction Models in Adolescents.

    Science.gov (United States)

    Burns, Ryan Donald; Hannon, James C; Brusseau, Timothy A; Eisenman, Patricia A; Saint-Maurice, Pedro F; Welk, Greg J; Mahar, Matthew T

    2015-08-01

    Cardiorespiratory endurance is a component of health-related fitness. FITNESSGRAM recommends the Progressive Aerobic Cardiovascular Endurance Run (PACER) or One Mile Run/Walk (1MRW) to assess cardiorespiratory endurance by estimating VO2 Peak. No research has cross-validated prediction models from both PACER and 1MRW, including the New PACER Model and PACER-Mile Equivalent (PACER-MEQ), using current standards. The purpose of this study was to cross-validate prediction models from PACER and 1MRW against measured VO2 Peak in adolescents. Cardiorespiratory endurance data were collected on 90 adolescents aged 13-16 years (mean = 14.7 ± 1.3 years; 32 girls, 52 boys) who completed the PACER and 1MRW in addition to a laboratory maximal treadmill test to measure VO2 Peak. Multiple correlations of the various models with measured VO2 Peak were considered moderately strong (R = 0.74-0.78), and prediction error (RMSE) ranged from 5.95 ml·kg⁻¹·min⁻¹ to 8.27 ml·kg⁻¹·min⁻¹. Criterion-referenced agreement with FITNESSGRAM's Healthy Fitness Zones was considered fair-to-good among models (Kappa = 0.31-0.62; Agreement = 75.5-89.9%; F = 0.08-0.65). In conclusion, the prediction models demonstrated moderately strong linear relationships with measured VO2 Peak, fair prediction error, and fair-to-good criterion-referenced agreement with measured VO2 Peak for FITNESSGRAM's Healthy Fitness Zones.

  2. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
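
    To make the unit-test versus model-validation-test distinction concrete, a small pytest-style sketch with a toy membrane model: the first test checks an implementation invariant of the code itself, the second checks agreement with (hypothetical) experimental data within a tolerance. Run with pytest.

        def simulate_resting_potential(g_leak=10.0, e_leak=-65.0, i_ext=0.0):
            # Toy membrane model: steady-state potential (mV) of a leak conductance
            return e_leak + i_ext / g_leak

        def test_zero_input_returns_reversal_potential():
            # Unit test: an invariant of the implementation
            assert simulate_resting_potential(i_ext=0.0) == -65.0

        def test_resting_potential_matches_observation():
            # Model validation test: agreement with experimental data
            observed_mv = -68.0    # hypothetical measured value
            tolerance_mv = 5.0     # acceptable biological variability
            assert abs(simulate_resting_potential() - observed_mv) < tolerance_mv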

  3. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in application of a code for performance assessment and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale, hydrology physical model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated

  4. Results of the project 'combustion modelling' (BKM II); Ergebnisse des Projekts 'Brennkammermodellierung' (BKM II)

    Energy Technology Data Exchange (ETDEWEB)

    Noll, B.; Rachner, M.; Frank, P.; Schmitz, G.; Geigle, K.P.; Meier, W.; Schuetz, H.; Aigner, M. [DLR Deutsches Zentrum fuer Luft- und Raumfahrt e.V., Stuttgart (Germany). Inst. fuer Verbrennungstechnik; Kessler, R. [DLR Deutsches Zentrum fuer Luft- und Raumfahrt e.V., Goettingen (Germany). Inst. fuer Aerodynamik und Stroemungstechnik; Lehmann, B. [DLR Deutsches Zentrum fuer Luft- und Raumfahrt e.V., Koeln (Germany). Inst. fuer Antriebstechnik; Forkert, T. [DLR Deutsches Zentrum fuer Luft- und Raumfahrt e.V., Koeln (Germany). Simulation und Softwaretechnik

    2002-07-01

    In 1996, the competences of several DLR institutes working in the areas of fluid dynamics, reaction kinetics, combustion, numerical methods and laser measuring techniques were brought together in the internal DLR project 'combustion chamber modelling' (BKM), in order to advance the computational simulation of combustion processes in gas turbine combustion chambers. The main goal was the development of a research code for the numerical simulation of fluid flow in real combustion chambers. Here the development of computational models of physical and chemical processes was emphasized; among other processes, the formation of soot was treated. Moreover, a worldwide outstanding database of measured data for the purpose of code validation was created within the framework of the BKM project using the DLR laboratory facilities, which are unique in Germany for the experimental investigation of the various processes in gas turbine combustion chambers. The project BKM is part of the specific DLR programme 'energy'. With the successful completion of the first phase of the project in 1998, a second three-year project phase (BKM II) was launched at the beginning of 1999. Here the work of the first phase was continued and new topics were tackled. The second phase of the project was partly funded by the DLR programme 'aeronautics'. (orig.)

  5. Wave Tank Testing and Model Validation of an Autonomous Wave Energy Converter

    Directory of Open Access Journals (Sweden)

    Bret Bosma

    2015-08-01

    Full Text Available A key component in bringing ocean wave energy converters from concept to commercialization is the building and testing of scaled prototypes to provide model validation. A one-quarter-scale prototype of an autonomous two-body heaving point absorber was modeled, built, and tested for this work. Wave tank testing results are compared with two hydrodynamic and system models—implemented in both ANSYS AQWA and MATLAB/Simulink—and show model validation over certain regions of operation. This work will serve as a guide for future developers of wave energy converter devices, providing insight in taking their design from concept to prototype stage.

  6. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Full Text Available Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  7. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point is to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closeness of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
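
    A minimal sketch of one such check, polygon planarity: fit a plane through the vertices by SVD and compare the largest out-of-plane distance with a tolerance. The tolerance value and the sample roof polygon are illustrative; CityDoctor's actual algorithms may differ.

        import numpy as np

        def is_planar(vertices, tol=1e-3):
            # Check that all vertices lie in one plane within tol (coordinate units)
            pts = np.asarray(vertices, dtype=float)
            centered = pts - pts.mean(axis=0)
            # The right singular vector with the smallest singular value is the
            # normal of the best-fit plane; projections onto it measure deviation.
            _, _, vt = np.linalg.svd(centered)
            normal = vt[-1]
            distances = centered @ normal
            return np.max(np.abs(distances)) <= tol

        roof = [(0, 0, 10.0), (5, 0, 10.0), (5, 4, 10.002), (0, 4, 10.0)]
        print(is_planar(roof, tol=1e-2))   # True: 2 mm deviation is within tolerance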

  8. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data.

    NARCIS (Netherlands)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L; Al, Maiwenn J

    2017-01-01

    The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of

  9. Experimental validation of a mathematical model for seabed liquefaction in waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen

    2011-01-01

    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and the resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d50 = 0.070 mm) in a wave flume with a soil pit. Waves with heights in the range 7.7-18 cm, a water depth of 55 cm and a wave period of 1.6 s enabled us to study pore water pressure buildup in both the liquefaction and no-liquefaction regimes. The experimental data was used to validate the model. A numerical

  10. Validity test and its consistency in the construction of patient loyalty model

    Science.gov (United States)

    Yanuar, Ferra

    2016-04-01

    The main objective of the present study is to demonstrate the estimation of validity values and their consistency based on a structural equation model. The estimation method was then applied to empirical data in the construction of a patient loyalty model. In the hypothesized model, service quality, patient satisfaction and patient loyalty were determined simultaneously, and each factor was measured by several indicator variables. The respondents involved in this study were patients who had received healthcare at Puskesmas in Padang, West Sumatera. All 394 respondents who had complete information were included in the analysis. This study found that each construct (service quality, patient satisfaction and patient loyalty) was valid, meaning that all hypothesized indicator variables were significant in measuring their corresponding latent variable. Service quality was measured best by tangibles, patient satisfaction by satisfaction with the service, and patient loyalty by good service quality. In the structural equation, this study found that patient loyalty was affected by patient satisfaction positively and directly. Service quality affected patient loyalty indirectly, with patient satisfaction as a mediator variable between the two latent variables. Both structural equations were also valid. The study also showed that the validity values obtained were consistent, based on a simulation study using a bootstrap approach.

  11. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret; Yu, Yi-Hsiang

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation is necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  12. Validation of the replica trick for simple models

    Science.gov (United States)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.

  13. Competitive adsorption of Pb(II), Cu(II), and Zn(II) ions onto hydroxyapatite-biochar nanocomposite in aqueous solutions

    Science.gov (United States)

    Wang, Yu-Ying; Liu, Yu-Xue; Lu, Hao-Hao; Yang, Rui-Qin; Yang, Sheng-Mao

    2018-05-01

    A hydroxyapatite-biochar nanocomposite (HAP-BC) was successfully fabricated and its physicochemical properties characterized. The analyses showed that HAP nanoparticles were successfully loaded on the biochar surface. The adsorption of Pb(II), Cu(II), and Zn(II) by HAP-BC was systematically studied in single and ternary metal systems. The results demonstrated that pH affects the adsorption of heavy metals onto HAP-BC. Regarding the adsorption kinetics, the pseudo-second-order model showed the best fit for all three heavy metal ions on HAP-BC. In both single and ternary metal ion systems, the adsorption isotherm of Pb(II) by HAP-BC followed the Langmuir model, while those of Cu(II) and Zn(II) fitted well with the Freundlich model. The maximum adsorption capacity for each tested metal by HAP-BC was higher than that of pristine rice straw biochar (especially for Pb(II)) or those of other reported adsorbents. Therefore, HAP-BC could be explored as a new material for future application in heavy metal removal.
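
    As a hedged illustration of the isotherm-fitting step described above (not the study's own analysis), the sketch below fits Langmuir and Freundlich models to invented equilibrium data with scipy; qmax, K, Kf, and n are the usual isotherm parameters, and every numeric value is a placeholder.

    ```python
    # Sketch: fitting Langmuir and Freundlich isotherms to equilibrium
    # adsorption data. All data values are invented for illustration.
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(C, qmax, K):
        # q = qmax * K * C / (1 + K * C)
        return qmax * K * C / (1.0 + K * C)

    def freundlich(C, Kf, n):
        # q = Kf * C**(1/n)
        return Kf * C ** (1.0 / n)

    C = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])    # mg/L, invented
    q = np.array([30.0, 55.0, 95.0, 130.0, 160.0, 180.0])  # mg/g, invented

    (qmax, K), _ = curve_fit(langmuir, C, q, p0=[200.0, 0.01])
    (Kf, n), _ = curve_fit(freundlich, C, q, p0=[10.0, 2.0])

    for name, model, pars in [("Langmuir", langmuir, (qmax, K)),
                              ("Freundlich", freundlich, (Kf, n))]:
        resid = q - model(C, *pars)
        r2 = 1.0 - np.sum(resid**2) / np.sum((q - q.mean())**2)
        print(f"{name}: parameters = {pars}, R^2 = {r2:.3f}")
    ```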

  14. On-line validation of linear process models using generalized likelihood ratios

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1981-12-01

    A real-time method for testing the validity of linear models of nonlinear processes is described and evaluated. Using generalized likelihood ratios, the model dynamics are continually monitored to see if the process has moved far enough away from the nominal linear model operating point to justify generation of a new linear model. The method is demonstrated using a seventh-order model of a natural circulation steam generator.
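
    A minimal sketch of the generalized likelihood ratio idea, assuming Gaussian residuals and a mean-shift alternative (the paper's seventh-order steam generator model is not reproduced); the noise level, window length, and decision threshold are all assumptions.

    ```python
    # Sketch: monitor the residuals of a nominal linear model with a GLR
    # test for a mean shift; declare the linearization stale when the
    # statistic crosses an (assumed) threshold.
    import numpy as np

    def glr_mean_shift(residuals, sigma, window=50):
        """Log GLR statistic for a mean jump in Gaussian residuals."""
        r = np.asarray(residuals[-window:])
        m_hat = r.mean()                  # ML estimate of the shift
        return len(r) * m_hat**2 / (2.0 * sigma**2)

    rng = np.random.default_rng(0)
    sigma = 0.1
    # Residuals of the nominal model: zero-mean at first, then the process
    # drifts away from the linearization point and a bias appears.
    resid = np.concatenate([rng.normal(0.0, sigma, 200),
                            rng.normal(0.3, sigma, 50)])

    threshold = 10.0                      # assumed decision threshold
    for t in range(60, len(resid)):
        g = glr_mean_shift(resid[:t], sigma)
        if g > threshold:
            print(f"t={t}: GLR={g:.1f} -> generate a new linear model")
            break
    ```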

  15. A primer for biomedical scientists on how to execute model II linear regression analysis.

    Science.gov (United States)

    Ludbrook, John

    2012-04-01

    1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis and I give step-by-step instructions in the Supplementary Information as to how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
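

    As a sketch of what ordinary least products (Model II) regression computes, the snippet below implements the OLP slope and intercept and bootstraps a 95% CI for the slope, mirroring conceptually what smatr does; the data are invented.

    ```python
    # Sketch: ordinary least products (Model II) regression with a
    # bootstrap 95% CI for the slope. Data are invented for illustration.
    import numpy as np

    def olp_fit(x, y):
        r = np.corrcoef(x, y)[0, 1]
        slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
        intercept = y.mean() - slope * x.mean()
        return slope, intercept

    rng = np.random.default_rng(1)
    x = rng.normal(10, 2, 40) + rng.normal(0, 0.5, 40)  # x carries error too
    y = 2.0 * x + 1.0 + rng.normal(0, 1.0, 40)

    slope, intercept = olp_fit(x, y)

    # Case-resampling bootstrap for the 95% CI of the slope
    boots = []
    for _ in range(2000):
        idx = rng.integers(0, len(x), len(x))
        boots.append(olp_fit(x[idx], y[idx])[0])
    print(f"OLP slope = {slope:.3f}, "
          f"95% CI = ({np.percentile(boots, 2.5):.3f}, "
          f"{np.percentile(boots, 97.5):.3f}), intercept = {intercept:.3f}")
    ```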

  16. Analytical thermal model validation for Cassini radioisotope thermoelectric generator

    International Nuclear Information System (INIS)

    Lin, E.I.

    1997-01-01

    The Saturn-bound Cassini spacecraft is designed to rely, without precedent, on the waste heat from its three radioisotope thermoelectric generators (RTGs) to warm the propulsion module subsystem, and the RTG end dome temperature is a key determining factor of the amount of waste heat delivered. A previously validated SINDA thermal model of the RTG was the sole guide to understanding its complex thermal behavior, but displayed large discrepancies against some initial thermal development test data. A careful revalidation effort led to significant modifications and adjustments of the model, which resulted in a doubling of the radiative heat transfer from the heat source support assemblies to the end domes and brought the end dome and flange temperature predictions to within 2 °C of the pertinent test data. The increased inboard end dome temperature has a considerable impact on thermal control of the spacecraft central body. The validation process offers an example of physically driven analytical model calibration with test data from not only an electrical simulator but also a nuclear-fueled flight unit, and has established the end dome temperatures of a flight RTG where no in-flight or ground-test data existed before.

  17. Reliability and validity of the Thai self-report version of the Yale–Brown Obsessive–Compulsive Scale-Second Edition

    Directory of Open Access Journals (Sweden)

    Hiranyatheb T

    2015-10-01

    .94, α=0.90, and α=0.89, respectively. The correlation between each item and the Y-BOCS-II-SR-T total score was strong for all items. Confirmatory factor analysis with model modification showed adequate fit for the obsession and compulsion factor models. The Y-BOCS-II-SR-T correlated strongly with the Y-BOCS-II-T and the FOCI-T (rs>0.90) and more weakly with the HAM-D, PHQ-9, and PTQL (rs<0.60), implying good convergent and divergent validity. Conclusion: The Y-BOCS-II-SR-T is a psychometrically sound and valid measure for assessing obsessive–compulsive symptoms. Keywords: Thai, obsessive–compulsive disorder, Yale–Brown Obsessive–Compulsive Scale, self-report

  18. Calibration and validation of coarse-grained models of atomic systems: application to semiconductor manufacturing

    Science.gov (United States)

    Farrell, Kathryn; Oden, J. Tinsley

    2014-07-01

    Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest has arisen in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among the major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability to predict key quantities of interest and (2), above all, the determination of the coarse-grained model itself: the characterization of the molecular architecture and the choice of interaction potentials, and thus parameters, which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized in this work by an appropriate statistical mechanics framework: canonical ensembles involving only configurational energies. The all-atom model thus supplies data for the Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data. We demonstrate the theory and
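
    A toy sketch, under strong simplifying assumptions, of the Bayesian calibration loop described above: a scalar coarse-grained parameter, a bounded uniform prior standing in for the maximum-entropy prior, a Gaussian likelihood for the parameter-to-observation error, and Metropolis sampling. The observable and all numbers are invented.

    ```python
    # Sketch: Bayesian calibration of one coarse-grained model parameter
    # against "all-atom" data via Metropolis sampling.
    import numpy as np

    rng = np.random.default_rng(2)

    def cg_prediction(theta):
        # Stand-in for a coarse-grained observable (invented form)
        return 2.0 * theta + 0.5 * theta**2

    aa_data = cg_prediction(1.3) + rng.normal(0, 0.05, 20)  # "all-atom" data
    sigma = 0.05                                            # assumed noise

    def log_post(theta):
        if not (0.0 < theta < 5.0):       # bounded uniform prior
            return -np.inf
        resid = aa_data - cg_prediction(theta)
        return -0.5 * np.sum(resid**2) / sigma**2

    theta, samples = 1.0, []
    lp = log_post(theta)
    for _ in range(20000):
        prop = theta + rng.normal(0, 0.05)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop     # accept the proposal
        samples.append(theta)
    post = np.array(samples[5000:])       # discard burn-in
    print(f"posterior mean = {post.mean():.3f}, sd = {post.std():.3f}")
    ```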

  19. Research program to develop and validate conceptual models for flow and transport through unsaturated, fractured rock

    International Nuclear Information System (INIS)

    Glass, R.J.; Tidwell, V.C.

    1991-01-01

    As part of the Yucca Mountain Project, our research program to develop and validate conceptual models for flow and transport through unsaturated fractured rock integrates fundamental physical experimentation with conceptual model formulation and mathematical modeling. Our research is directed toward developing and validating macroscopic, continuum-based models and supporting effective property models because of their widespread utility within the context of this project. Success relative to the development and validation of effective property models is predicated on a firm understanding of the basic physics governing flow through fractured media, specifically in the areas of unsaturated flow and transport in a single fracture and fracture-matrix interaction. 43 refs

  20. Research program to develop and validate conceptual models for flow and transport through unsaturated, fractured rock

    International Nuclear Information System (INIS)

    Glass, R.J.; Tidwell, V.C.

    1991-09-01

    As part of the Yucca Mountain Project, our research program to develop and validate conceptual models for flow and transport through unsaturated fractured rock integrates fundamental physical experimentation with conceptual model formulation and mathematical modeling. Our research is directed toward developing and validating macroscopic, continuum-based models and supporting effective property models because of their widespread utility within the context of this project. Success relative to the development and validation of effective property models is predicated on a firm understanding of the basic physics governing flow through fractured media, specifically in the areas of unsaturated flow and transport in a single fracture and fracture-matrix interaction

  1. Research program to develop and validate conceptual models for flow and transport through unsaturated, fractured rock

    International Nuclear Information System (INIS)

    Glass, R.J.; Tidwell, V.C.

    1991-01-01

    As part of the Yucca Mountain Project, our research program to develop and validate conceptual models for flow and transport through unsaturated fractured rock integrates fundamental physical experimentation with conceptual model formulation and mathematical modeling. Our research is directed toward developing and validating macroscopic, continuum-based models and supporting effective property models because of their widespread utility within the context of this project. Success relative to the development and validation of effective property models is predicated on a firm understanding of the basic physics governing flow through fractured media, specifically in the areas of unsaturated flow and transport in a single fracture and fracture-matrix interaction

  2. Application of neural computing paradigms for signal validation

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Eryurek, E.; Mathai, G.

    1989-01-01

    Signal validation and process monitoring problems often require the prediction of one or more process variables in a system. The feasibility of applying neural network paradigms to relate one variable to a set of other related variables is studied. The backpropagation network (BPN) is applied to develop models of signals from both a commercial power plant and the EBR-II. Modification of the BPN algorithm is studied, with emphasis on the speed of network training and the accuracy of prediction. The prediction of process variables in a Westinghouse PWR is presented in this paper
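
    A minimal sketch of the signal-validation idea, using synthetic plant signals and scikit-learn's MLP in place of the paper's BPN implementation: predict one sensor from related ones and flag samples with abnormal residuals.

    ```python
    # Sketch: train a small backpropagation network to predict one sensor
    # from related signals, then flag samples whose residual is abnormal.
    # All signals are synthetic; an artificial bias simulates sensor drift.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    X = rng.normal(size=(2000, 3))                 # related plant signals
    y = (0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2]
         + rng.normal(0, 0.02, 2000))
    y[1900:] += 0.5                                # simulated sensor drift

    scaler = StandardScaler().fit(X[:1500])
    net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    net.fit(scaler.transform(X[:1500]), y[:1500])  # train on fault-free data

    resid = y[1500:] - net.predict(scaler.transform(X[1500:]))
    limit = 4 * resid[:400].std()                  # band from clean residuals
    print("suspect samples:", 1500 + np.flatnonzero(np.abs(resid) > limit))
    ```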

  3. Approaches to Validation of Models for Low Gravity Fluid Behavior

    Science.gov (United States)

    Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

    2005-01-01

    This paper details the authors' experiences with the validation of computer models that predict low gravity fluid behavior. It reviews the literature on low gravity fluid behavior as a starting point for developing a baseline set of test cases, and examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues are that: most of the data are described by empirical correlation rather than fundamental relations; detailed measurements of the flow field have not been made; free surface shapes are observed, but through thick plastic cylinders, and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, whereas the zero-gravity time available has been only seconds.

  4. Modeling of the hERG K+ Channel Blockage Using Online Chemical Database and Modeling Environment (OCHEM).

    Science.gov (United States)

    Li, Xiao; Zhang, Yuan; Li, Huanhuan; Zhao, Yong

    2017-12-01

    The human ether-a-go-go related gene (hERG) K+ channel plays an important role in the cardiac action potential. Blockage of the hERG channel may result in long QT syndrome (LQTS) and even cause sudden cardiac death. Many drugs have been withdrawn from the market because of serious hERG-related cardiotoxicity. Therefore, it is essential to estimate the chemical blockage of hERG in the early stage of drug discovery. In this study, a diverse set of 3721 compounds with hERG inhibition data was assembled from the literature. We then made full use of the Online Chemical Modeling Environment (OCHEM), which supplies rich machine learning methods and descriptor sets, to build a series of classification models for hERG blockage. We also generated two consensus models based on the top-performing individual models. The consensus models performed much better than the individual models in both 5-fold cross-validation and external validation. In particular, consensus model II yielded a prediction accuracy of 89.5% and an MCC of 0.670 on external validation, indicating that its predictive power is stronger than that of most previously reported models. The 17 top-performing individual models, the consensus models, and the data sets used for model development are available at https://ochem.eu/article/103592. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
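
    A sketch of the consensus-model idea with scikit-learn, using placeholder descriptors rather than the hERG data set; the paper's OCHEM models are not reproduced here.

    ```python
    # Sketch: a consensus classifier built by (soft) majority vote over
    # several individual models, evaluated with 5-fold cross-validation.
    # Features and labels are random placeholders, not hERG data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=30, random_state=0)
    consensus = VotingClassifier(
        estimators=[("rf", RandomForestClassifier(random_state=0)),
                    ("lr", LogisticRegression(max_iter=1000)),
                    ("svm", SVC(probability=True, random_state=0))],
        voting="soft")
    acc = cross_val_score(consensus, X, y, cv=5).mean()
    print(f"5-fold CV accuracy of the consensus model: {acc:.3f}")
    ```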

  5. External Validation of a Prediction Model for Successful External Cephalic Version

    NARCIS (Netherlands)

    de Hundt, Marcella; Vlemmix, Floortje; Kok, Marjolein; van der Steeg, Jan W.; Bais, Joke M.; Mol, Ben W.; van der Post, Joris A.

    2012-01-01

    We sought external validation of a prediction model for the probability of a successful external cephalic version (ECV). We evaluated the performance of the prediction model with calibration and discrimination. For clinical practice, we developed a score chart to calculate the probability of a

  6. Modeling and Experimental Validation of an Islanded No-Inertia Microgrid Site

    DEFF Research Database (Denmark)

    Bonfiglio, Andrea; Delfino, Federico; Labella, Alessandro

    2018-01-01

    The paper proposes a simple but effective model for no-inertia microgrids, suitable for representing the instantaneous values of its meaningful electric variables and thereby providing a useful platform to test innovative control logics and energy management systems. The proposed model is validated against a more...

  7. Validation of Pressure Drop Models for PHWR-type Fuel Elements

    International Nuclear Information System (INIS)

    Brasnarof Daniel; Daverio, H.

    2003-01-01

    In the present work, a one-dimensional analytical pressure drop model and the COBRA code are validated with experimental data for CANDU and Atucha fuel bundles in low- and high-pressure experimental test loops. Both models show very good agreement with the experimental data, with discrepancies of less than 5%. The analytical model results were also compared with COBRA code results, showing small differences between them over a wide range of pressure, temperature, and mass flow
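
    A minimal example of the kind of one-dimensional friction pressure drop relation such analytical models build on: Darcy-Weisbach with a Blasius friction factor. The geometry and flow values are illustrative assumptions, not CANDU or Atucha data.

    ```python
    # Sketch: friction pressure drop over a heated channel,
    # Darcy-Weisbach with Blasius (smooth turbulent flow).
    rho = 740.0      # coolant density, kg/m^3 (assumed)
    mu = 9.0e-5      # dynamic viscosity, Pa.s (assumed)
    d_h = 0.008      # hydraulic diameter of the subchannel, m (assumed)
    length = 0.5     # channel length, m (assumed)
    velocity = 6.0   # coolant velocity, m/s (assumed)

    re = rho * velocity * d_h / mu
    f = 0.316 * re ** -0.25            # Blasius friction factor
    dp = f * (length / d_h) * 0.5 * rho * velocity ** 2
    print(f"Re = {re:.3e}, friction dp = {dp / 1000:.1f} kPa")
    ```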

  8. Analysis of JSI TRIGA MARK II reactor physical parameters calculated with TRIPOLI and MCNP.

    Science.gov (United States)

    Henry, R; Tiselj, I; Snoj, L

    2015-03-01

    A new computational model of the JSI TRIGA Mark II research reactor was built for the TRIPOLI computer code and compared with the existing MCNP model. The same modelling assumptions were used in order to check the differences between the mathematical models of the two Monte Carlo codes. Differences between the TRIPOLI and MCNP predictions of keff were up to 100 pcm. Further validation was performed through analyses of the normalized reaction rates and computations of kinetic parameters for various core configurations. Copyright © 2014 Elsevier Ltd. All rights reserved.
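
    The quoted pcm figure follows from the standard reactivity difference between two keff predictions; a one-line check with assumed values:

    ```python
    # Sketch: reactivity difference between two keff values, in pcm.
    # The keff values below are assumed, not the paper's results.
    k_mcnp, k_tripoli = 1.00120, 1.00020
    pcm = (k_tripoli - k_mcnp) / (k_tripoli * k_mcnp) * 1e5
    print(f"difference = {pcm:.0f} pcm")
    ```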

  9. Development and validation of a mortality risk model for pediatric sepsis

    Science.gov (United States)

    Chen, Mengshi; Lu, Xiulan; Hu, Li; Liu, Pingping; Zhao, Wenjiao; Yan, Haipeng; Tang, Liang; Zhu, Yimin; Xiao, Zhenghui; Chen, Lizhang; Tan, Hongzhuan

    2017-01-01

    Pediatric sepsis is a burdensome public health problem. Assessing the mortality risk of pediatric sepsis patients, offering effective treatment guidance, and improving prognosis to reduce mortality rates are crucial. We extracted data derived from electronic medical records of pediatric sepsis patients that were collected during the first 24 hours after admission to the pediatric intensive care unit (PICU) of the Hunan Children's Hospital from January 2012 to June 2014. A total of 788 children were randomly divided into a training (592, 75%) and a validation group (196, 25%). The risk factors for mortality among these patients were identified by conducting multivariate logistic regression in the training group. Based on the established logistic regression equation, the logit probabilities for all patients (in both groups) were calculated to verify the model's internal and external validity. According to the training group, 6 variables (brain natriuretic peptide, albumin, total bilirubin, D-dimer, lactate levels, and mechanical ventilation in 24 hours) were included in the final logistic regression model. The areas under the curves of the model were 0.854 (0.826, 0.881) and 0.844 (0.816, 0.873) in the training and validation groups, respectively. The Mortality Risk Model for Pediatric Sepsis we established in this study showed acceptable accuracy to predict the mortality risk in pediatric sepsis patients. PMID:28514310
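
    A sketch of the train/validation workflow described above, assuming synthetic stand-in data for the six admission variables: fit a logistic regression on a 75/25 split and report the area under the ROC curve for both groups.

    ```python
    # Sketch: logistic risk model with internal (training) and
    # external-style (held-out) validation via AUC. Data are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    X = rng.normal(size=(788, 6))          # six admission variables
    logit = X @ np.array([0.9, -0.7, 0.5, 0.6, 0.8, 1.1]) - 2.0
    y = rng.uniform(size=788) < 1 / (1 + np.exp(-logit))   # mortality flag

    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    model = LogisticRegression().fit(X_tr, y_tr)
    for name, Xs, ys in [("training", X_tr, y_tr), ("validation", X_va, y_va)]:
        auc = roc_auc_score(ys, model.predict_proba(Xs)[:, 1])
        print(f"{name} AUC = {auc:.3f}")
    ```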

  10. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    Science.gov (United States)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.

  11. Development and validation of a viscoelastic and nonlinear liver model for needle insertion

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, Yo [Waseda University, Consolidated Research Institute for Advanced Science and Medical Care, Shinjuku, Tokyo (Japan); Onishi, Akinori; Hoshi, Takeharu; Kawamura, Kazuya [Waseda University, Graduate School of Science and Engineering, Shinjuku (Japan); Hashizume, Makoto [Kyushu University Hospital, Center for the Integration of Advanced Medicine and Innovative Technology, Fukuoka (Japan); Fujie, Masakatsu G. [Waseda University, Graduate School of Science and Engineering, Faculty of Science and Engineering, Shinjuku (Japan)

    2009-01-15

    The objective of our work is to develop and validate a viscoelastic and nonlinear physical liver model for organ model-based needle insertion, in which the deformation of an organ is estimated and predicted, and the needle path is determined with organ deformation taken into consideration. First, an overview is given of the development of the physical liver model. The material properties of the liver considering viscoelasticity and nonlinearity are modeled based on the measured data collected from a pig's liver. The method to develop the liver model using FEM is also shown. Second, the experimental method to validate the model is explained. Both in vitro and in vivo experiments that made use of a pig's liver were conducted for comparison with the simulation using the model. Results of the in vitro experiment showed that the model reproduces nonlinear and viscoelastic response of displacement at an internally located point with high accuracy. For a force up to 0.45 N, the maximum error is below 1 mm. Results of the in vivo experiment showed that the model reproduces the nonlinear increase of load upon the needle during insertion. Based on these results, the liver model developed and validated in this work reproduces the physical response of a liver in both in vitro and in vivo situations. (orig.)

  12. Modeling the distribution of Mg II absorbers around galaxies using background galaxies and quasars

    Energy Technology Data Exchange (ETDEWEB)

    Bordoloi, R.; Lilly, S. J. [Institute for Astronomy, ETH Zürich, Wolfgang-Pauli-Strasse 27, 8093 Zürich (Switzerland); Kacprzak, G. G. [Swinburne University of Technology, Victoria 3122 (Australia); Churchill, C. W., E-mail: rongmonb@phys.ethz.ch [New Mexico State University, Las Cruces, NM 88003 (United States)

    2014-04-01

    We present joint constraints on the distribution of Mg II absorption around high redshift galaxies obtained by combining two orthogonal probes, the integrated Mg II absorption seen in stacked background galaxy spectra and the distribution of parent galaxies of individual strong Mg II systems as seen in the spectra of background quasars. We present a suite of models that can be used to predict, for different two- and three-dimensional distributions, how the projected Mg II absorption will depend on a galaxy's apparent inclination, the impact parameter b and the azimuthal angle between the projected vector to the line of sight and the projected minor axis. In general, we find that variations in the absorption strength with azimuthal angles provide much stronger constraints on the intrinsic geometry of the Mg II absorption than the dependence on the inclination of the galaxies. In addition to the clear azimuthal dependence in the integrated Mg II absorption that we reported earlier in Bordoloi et al., we show that strong equivalent width Mg II absorbers (W_r(2796) ≥ 0.3 Å) are also asymmetrically distributed in azimuth around their host galaxies: 72% of the absorbers in Kacprzak et al., and 100% of the close-in absorbers within 35 kpc of the center of their host galaxies, are located within 50° of the host galaxy's projected semi-minor axis. It is shown that either composite models consisting of a simple bipolar component plus a spherical or disk component, or a single highly softened bipolar distribution, can well represent the azimuthal dependencies observed in both the stacked spectrum and quasar absorption-line data sets within 40 kpc. Simultaneously fitting both data sets, we find that in the composite model the bipolar cone has an opening angle of ∼100° (i.e., confined to within 50° of the disk axis) and contains about two-thirds of the total Mg II absorption in the system. The single softened cone model has an exponential fall off with

  13. Validation of fuel performance codes at the NRI Rez plc for Temelin and Dukovany NPPs fuel safety evaluations and operation support

    International Nuclear Information System (INIS)

    Valach, M.; Hejna, J.; Zymak, J.

    2003-05-01

    The report summarises the first phase of the FUMEX II related work performed in the period September 2002 - May 2003. An inventory of the PIN and FRAS code families, used and developed during previous years, was made in light of their applicability (validity) to the high burn-up domain and the FUMEX II Project experimental database. KOLA data were chosen as appropriate for the first step of adapting both codes (both originally tuned for VVER fuel). The modern requirements, namely adoption of the UO2 conductivity degradation model from the OECD HRP, implementation of RIM and athermal FGR modelling in the PIN code, and a diffusion FGR model planned for embedding in this code, allow us to keep close contact with top-quality models such as TRANSURANUS, COPERNIC, CYRANO, FEMAXI, FRAPCON3 and ENIGMA. Testing and validation runs with the prepared KOLA input deck were made. The FUMEX II exercise proposes LOCA- and RIA-like transients, so we started development of a coupling of the two codes, denominated the PIN2FRAS code. Principles of the interface were tested, and benchmarking on tentative RIA pulses on highly burned KOLA fuel is presented as the first achievement of our work. (author)

  14. Development and validation of a two-dimensional fast-response flood estimation model

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV OF UTAK

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimates of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
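
    To illustrate the upwind differencing idea on the simplest possible relative of this problem, the sketch below applies it to one-dimensional linear advection rather than the full shallow water system; grid, speed, and initial condition are assumptions.

    ```python
    # Sketch: explicit upwind scheme for 1D linear advection. Information
    # is taken from the upstream side, keeping the scheme stable for
    # CFL = c*dt/dx <= 1 (here 0.8).
    import numpy as np

    nx, dx, dt, c = 200, 1.0, 0.4, 2.0
    h = np.where(np.arange(nx) < 20, 2.0, 0.5)   # initial "flood front"

    for _ in range(100):
        h[1:] = h[1:] - c * dt / dx * (h[1:] - h[:-1])  # upwind for c > 0
    print("front position ~", np.argmax(h < 1.0), "cells")
    ```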

  15. Monte Carlo Modelling of Mammograms : Development and Validation

    International Nuclear Information System (INIS)

    Spyrou, G.; Panayiotakis, G.; Bakas, A.; Tzanakos, G.

    1998-01-01

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered, along with a software phantom of the compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors)

  16. Modeling transducer impulse responses for predicting calibrated pressure pulses with the ultrasound simulation program Field II

    DEFF Research Database (Denmark)

    Bæk, David; Jensen, Jørgen Arendt; Willatzen, Morten

    2010-01-01

    FIELD II is simulation software capable of predicting the field pressure in front of transducers having any complicated geometry. A calibrated prediction with this program is, however, dependent on an exact voltage-to-surface acceleration impulse response of the transducer. Such an impulse response... is not calculated by FIELD II. This work investigates the usability of combining a one-dimensional multilayer transducer modeling principle with the FIELD II software. Multilayer here refers to a transducer composed of several material layers. Measurements of pressure and current from Pz27 piezoceramic disks... transducer model and the FIELD II software in combination give good agreement with measurements...

  17. Validation of Nonlinear Bipolar Transistor Model by Small-Signal Measurements

    DEFF Research Database (Denmark)

    Vidkjær, Jens; Porra, V.; Zhu, J.

    1992-01-01

    A new method for the validity analysis of nonlinear transistor models is presented, based on DC and small-signal S-parameter measurements and realistic consideration of the measurement and de-embedding errors and singularities of the small-signal equivalent circuit. As an example, some analysis... results for an extended Gummel-Poon model are presented in the case of a UHF bipolar power transistor...

  18. Understanding Dynamic Model Validation of a Wind Turbine Generator and a Wind Power Plant: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard; Zhang, Ying Chen; Gevorgian, Vahan; Kosterev, Dmitry

    2016-09-01

    Regional reliability organizations require power plants to validate the dynamic models that represent them to ensure that power system studies are performed with the best representation of the installed components. In the process of validating a wind power plant (WPP), one must be cognizant of the parameter settings of the wind turbine generators (WTGs) and the operational settings of the WPP. Validation of the dynamic model of a WPP must be performed periodically, because the control parameters of the WTGs and of the other supporting components within a WPP may be modified to comply with new grid codes, or upgraded with new capabilities developed by the turbine manufacturers or requested by the plant owners or operators. The diversity within a WPP affects the way we represent it in a model. Diversity within a WPP may be found in the way the WTGs are controlled, the wind resource, the layout of the WPP (electrical diversity), and the types of WTGs used. Each group of WTGs constitutes a significant portion of the output power of the WPP, and their unique and salient behaviors should be represented individually. The objective of this paper is to illustrate the process of dynamic model validation for WTGs and WPPs, the recorded data that must be screened before use in the validations, and the assumptions made in the dynamic models of the WTG and WPP that must be understood. Without an understanding of the correct process, the validations may lead to incorrect representations of the modeled WTG and WPP.

  19. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    Science.gov (United States)

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be

  20. A validated dynamic model of the first marine molten carbonate fuel cell

    International Nuclear Information System (INIS)

    Ovrum, E.; Dimopoulos, G.

    2012-01-01

    In this work we present a modular, dynamic and multi-dimensional model of a molten carbonate fuel cell (MCFC) serving as an auxiliary power unit onboard the offshore supply vessel “Viking Lady”. The model is able to capture detailed thermodynamic, heat transfer and electrochemical reaction phenomena within the fuel cell layers, and has been calibrated and validated with measured performance data from a prototype installation onboard the vessel. The calibration process included parameter identification, sensitivity analysis to identify the critical model parameters, and iterative calibration of these to minimize the overall prediction error. The calibrated model has a low prediction error of 4% over the operating range of the cell, exhibiting at the same time physically sound qualitative behavior in terms of thermodynamic, heat transfer and electrochemical phenomena, in both steady-state and transient operation. The developed model is suitable for a wide range of studies covering thermal efficiency, performance, operability, safety and endurance/degradation, which are necessary to introduce fuel cells in ships. The aim of this MCFC model is to aid the introduction, design, concept approval and verification of environmentally friendly marine applications such as fuel cells, in a cost-effective, fast and safe manner. - Highlights: ► We model the first marine molten carbonate fuel cell auxiliary power unit. ► The model is spatially distributed and models both steady state and transients. ► The model is validated against experimental data. ► The paper illustrates how the model can be used in safety and reliability studies.

  1. Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model

    Energy Technology Data Exchange (ETDEWEB)

    Berk, Alexander [Spectral Sciences, Inc., Burlington, MA (United States); Hawes, Frederick [Spectral Sciences, Inc., Burlington, MA (United States); Fox, Marsha [Spectral Sciences, Inc., Burlington, MA (United States)

    2016-03-15

    Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single-band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, models do not exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single-scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a polarimetric version, yet to be developed, of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation field

  2. Antisocial Personality Disorder Subscale (Chinese Version) of the Structured Clinical Interview for the DSM-IV Axis II disorders: validation study in Cantonese-speaking Hong Kong Chinese.

    Science.gov (United States)

    Tang, D Y Y; Liu, A C Y; Leung, M H T; Siu, B W M

    2013-06-01

    OBJECTIVE. Antisocial personality disorder (ASPD) is a risk factor for violence and is associated with poor treatment response when it is a co-morbid condition with substance abuse. It is an under-recognised clinical entity in the local Hong Kong setting, for which there are only a few available Chinese-language diagnostic instruments. None has been tested for its psychometric properties in the Cantonese-speaking population in Hong Kong. This study therefore aimed to assess the reliability and validity of the Chinese version of the ASPD subscale of the Structured Clinical Interview for the DSM-IV Axis II Disorders (SCID-II) in Hong Kong Chinese. METHODS. This assessment tool was modified according to dialectal differences between Mainland China and Hong Kong. Inpatients in Castle Peak Hospital, Hong Kong, who were designated for priority follow-up based on their assessed propensity for violence and who fulfilled the inclusion criteria for the study, were recruited. To assess the level of agreement, best-estimate diagnosis made by a multidisciplinary team was compared with diagnostic status determined by the SCID-II ASPD subscale. The internal consistency, sensitivity, and specificity of the subscale were also calculated. RESULTS. The internal consistency of the subscale was acceptable at 0.79, whereas the test-retest reliability and inter-rater reliability showed an excellent and good agreement of 0.90 and 0.86, respectively. Best-estimate clinical diagnosis-SCID diagnosis agreement was acceptable at 0.76. The sensitivity, specificity, positive and negative predictive values were 0.91, 0.86, 0.83, and 0.93, respectively. CONCLUSION. The Chinese version of the SCID-II ASPD subscale is reliable and valid for diagnosing ASPD in a Cantonese-speaking clinical population.

  3. Transient Model Validation of Fixed-Speed Induction Generator Using Wind Farm Measurements

    DEFF Research Database (Denmark)

    Rogdakis, Georgios; Garcia-Valle, Rodrigo; Arana Aristi, Iván

    2012-01-01

    In this paper, an electromagnetic transient model for fixed-speed wind turbines equipped with induction generators is developed and implemented in PSCAD/EMTDC. The model comprises an induction generator, an aerodynamic rotor, and a two-mass representation of the shaft system. Model validation...

  4. New LUX and PandaX-II results illuminating the simplest Higgs-portal dark matter models

    International Nuclear Information System (INIS)

    He, Xiao-Gang; Tandean, Jusak

    2016-01-01

    Direct searches for dark matter (DM) by the LUX and PandaX-II Collaborations employing xenon-based detectors have recently come up with the most stringent limits to date on the spin-independent elastic scattering of DM off nucleons. For Higgs-portal scalar DM models, the new results have precluded any possibility of accommodating low-mass DM as suggested by the DAMA and CDMS II Si experiments utilizing other target materials, even after invoking isospin-violating DM interactions with nucleons. In the simplest model, SM+D, which is the standard model plus a real singlet scalar named darkon acting as the DM candidate, the LUX and PandaX-II limits rule out DM masses roughly from 4 to 450 GeV, except a small range around the resonance point at half of the Higgs mass where the interaction cross-section is near the neutrino-background floor. In the THDM II+D, which is the type-II two-Higgs-doublet model combined with a darkon, the region excluded in the SM+D by the direct searches can be recovered due to suppression of the DM effective interactions with nucleons at some values of the ratios of Higgs couplings to the up and down quarks, making the interactions significantly isospin-violating. However, in either model, if the 125-GeV Higgs boson is the portal between the dark and SM sectors, DM masses less than 50 GeV or so are already ruled out by the LHC constraint on the Higgs invisible decay. In the THDM II+D, if the heavier CP-even Higgs boson is the portal, theoretical restrictions from perturbativity, vacuum stability, and unitarity requirements turn out to be important instead and exclude much of the region below 100 GeV. For larger DM masses, the THDM II+D has plentiful parameter space that corresponds to interaction cross-sections under the neutrino-background floor and therefore is likely to be beyond the reach of future direct searches without directional sensitivity.

  5. A GLOBAL TWO-TEMPERATURE CORONA AND INNER HELIOSPHERE MODEL: A COMPREHENSIVE VALIDATION STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Jin, M.; Manchester, W. B.; Van der Holst, B.; Gruesbeck, J. R.; Frazin, R. A.; Landi, E.; Toth, G.; Gombosi, T. I. [Atmospheric Oceanic and Space Sciences, University of Michigan, Ann Arbor, MI 48109 (United States); Vasquez, A. M. [Instituto de Astronomia y Fisica del Espacio (CONICET-UBA) and FCEN (UBA), CC 67, Suc 28, Ciudad de Buenos Aires (Argentina); Lamy, P. L.; Llebaria, A.; Fedorov, A., E-mail: jinmeng@umich.edu [Laboratoire d' Astrophysique de Marseille, Universite de Provence, Marseille (France)

    2012-01-20

    The recent solar minimum with very low activity provides us a unique opportunity for validating solar wind models. During CR2077 (2008 November 20 through December 17), the number of sunspots was near the absolute minimum of solar cycle 23. For this solar rotation, we perform a multi-spacecraft validation study for the recently developed three-dimensional, two-temperature, Alfven-wave-driven global solar wind model (a component within the Space Weather Modeling Framework). By using in situ observations from the Solar Terrestrial Relations Observatory (STEREO) A and B, Advanced Composition Explorer (ACE), and Venus Express, we compare the observed proton state (density, temperature, and velocity) and magnetic field of the heliosphere with that predicted by the model. Near the Sun, we validate the numerical model with the electron density obtained from the solar rotational tomography of Solar and Heliospheric Observatory/Large Angle and Spectrometric Coronagraph C2 data in the range of 2.4 to 6 solar radii. Electron temperature and density are determined from differential emission measure tomography (DEMT) of STEREO A and B Extreme Ultraviolet Imager data in the range of 1.035 to 1.225 solar radii. The electron density and temperature derived from the Hinode/Extreme Ultraviolet Imaging Spectrometer data are also used to compare with the DEMT as well as the model output. Moreover, for the first time, we compare ionic charge states of carbon, oxygen, silicon, and iron observed in situ with the ACE/Solar Wind Ion Composition Spectrometer with those predicted by our model. The validation results suggest that most of the model outputs for CR2077 can fit the observations very well. Based on this encouraging result, we therefore expect great improvement for the future modeling of coronal mass ejections (CMEs) and CME-driven shocks.

  6. Explicit validation of a surface shortwave radiation balance model over snow-covered complex terrain

    Science.gov (United States)

    Helbig, N.; Löwe, H.; Mayer, B.; Lehning, M.

    2010-09-01

    A model that computes the surface radiation balance for all sky conditions in complex terrain is presented. The spatial distribution of direct and diffuse sky radiation is determined from observations of incident global radiation, air temperature, and relative humidity at a single measurement location. Incident radiation under cloudless sky is spatially derived from a parameterization of the atmospheric transmittance. Direct and diffuse sky radiation for all sky conditions are obtained by decomposing the measured global radiation value. Spatial incident radiation values under all atmospheric conditions are computed by adjusting the spatial radiation values obtained from the parametric model with the radiation components obtained from the decomposition model at the measurement site. Topographic influences such as shading are accounted for. The radiosity approach is used to compute anisotropic terrain reflected radiation. Validations of the shortwave radiation balance model are presented in detail for a day with cloudless sky. For a day with overcast sky a first validation is presented. Validation of a section of the horizon line as well as of individual radiation components is performed with high-quality measurements. A new measurement setup was designed to determine terrain reflected radiation. There is good agreement between the measurements and the modeled terrain reflected radiation values as well as with incident radiation values. A comparison of the model with a fully three-dimensional radiative transfer Monte Carlo model is presented. That validation reveals a good agreement between modeled radiation values.

  7. Comparative calculations and validation studies with atmospheric dispersion models

    International Nuclear Information System (INIS)

    Paesler-Sauer, J.

    1986-11-01

    This report presents the results of an intercomparison of different mesoscale dispersion models and measured data from tracer experiments. The types of models taking part in the intercomparison are Gaussian-type, numerical Eulerian, and Lagrangian dispersion models, all suited for calculating the atmospheric transport of radionuclides released from a nuclear installation. For the model intercomparison, artificial meteorological situations were defined and corresponding computational problems were formulated. For the purpose of model validation, real dispersion situations from tracer experiments were used as input data for model calculations; in these cases, calculated and measured time-integrated concentrations close to the ground are compared. Finally, an evaluation of the models' performance in solving the problems is carried out with the aid of objective methods. (orig./HP)

  8. Groundwater Model Validation for the Project Shoal Area, Corrective Action Unit 447

    Energy Technology Data Exchange (ETDEWEB)

    Hassan, Ahmed [Desert Research Inst. (DRI), Las Vegas, NV (United States). Division of Hydrologic Sciences; Chapman, Jenny [Desert Research Inst. (DRI), Las Vegas, NV (United States). Division of Hydrologic Sciences; Lyles, Brad [Desert Research Inst. (DRI), Las Vegas, NV (United States). Division of Hydrologic Sciences

    2008-05-19

    Stoller has examined newly collected water level data in multiple wells at the Shoal site. On the basis of these data and information presented in the report, we are currently unable to confirm that the model is successfully validated. Most of our concerns regarding the model stem from two findings: (1) measured water level data do not provide clear evidence of a prevailing lateral flow direction; and (2) the groundwater flow system has been and continues to be in a transient state, which contrasts with assumed steady-state conditions in the model. The results of DRI's model validation efforts and observations made regarding water level behavior are discussed in the following sections. A summary of our conclusions and recommendations for a path forward are also provided in this letter report.

  9. MT3DMS: Model use, calibration, and validation

    Science.gov (United States)

    Zheng, C.; Hill, Mary C.; Cao, G.; Ma, R.

    2012-01-01

    MT3DMS is a three-dimensional multi-species solute transport model for solving advection, dispersion, and chemical reactions of contaminants in saturated groundwater flow systems. MT3DMS interfaces directly with the U.S. Geological Survey finite-difference groundwater flow model MODFLOW for the flow solution and supports the hydrologic and discretization features of MODFLOW. MT3DMS contains multiple transport solution techniques in one code, which can often be important, including in model calibration. Since its first release in 1990 as MT3D for single-species mass transport modeling, MT3DMS has been widely used in research projects and practical field applications. This article provides a brief introduction to MT3DMS and presents recommendations about calibration and validation procedures for field applications of MT3DMS. The examples presented suggest the need to consider alternative processes as models are calibrated and suggest opportunities and difficulties associated with using groundwater age in transport model calibration.

  10. Application of the Think-Pair-Share Model to Improve Writing Skills in Grade II at SDN 3 Banjar Jawa

    Directory of Open Access Journals (Sweden)

    Ningsi Soisana Lakilaf

    2017-12-01

    Full Text Available This study aims to improve students' writing skills through the application of the picture-mediated Think-Pair-Share learning model to grade II, semester I students of SD Negeri 3 Banjar Jawa in the 2017/2018 school year. The study was conducted as classroom action research (CAR) carried out in two cycles, each cycle consisting of two meetings and comprising the stages of (1) planning, (2) implementation, (3) observation, and (4) reflection. The subjects were the teacher and the grade II students of SD Negeri 3 Banjar Jawa; data were collected using test and non-test techniques. The results show that with the picture-mediated Think-Pair-Share learning model, student learning mastery improved: the proportion of students mastering written description was 27% before the intervention, 77% in cycle I, and 90% in cycle II. Learning with the picture-mediated Think-Pair-Share model can thus improve writing skills. The study concludes that applying the picture-mediated Think-Pair-Share model improves the writing skills of grade II students at SD Negeri 3 Banjar Jawa. It is recommended that teachers be more active and creative in implementing innovative and enjoyable learning. Keywords: writing skills, Think-Pair-Share model

  11. Clinical prediction models for bronchopulmonary dysplasia: a systematic review and external validation study

    NARCIS (Netherlands)

    Onland, Wes; Debray, Thomas P.; Laughon, Matthew M.; Miedema, Martijn; Cools, Filip; Askie, Lisa M.; Asselin, Jeanette M.; Calvert, Sandra A.; Courtney, Sherry E.; Dani, Carlo; Durand, David J.; Marlow, Neil; Peacock, Janet L.; Pillow, J. Jane; Soll, Roger F.; Thome, Ulrich H.; Truffert, Patrick; Schreiber, Michael D.; van Reempts, Patrick; Vendettuoli, Valentina; Vento, Giovanni; van Kaam, Anton H.; Moons, Karel G.; Offringa, Martin

    2013-01-01

    Bronchopulmonary dysplasia (BPD) is a common complication of preterm birth. Very different models using clinical parameters at an early postnatal age to predict BPD have been developed with little extensive quantitative validation. The objective of this study is to review and validate clinical

  12. Validation of a risk prediction model for Barrett's esophagus in an Australian population.

    Science.gov (United States)

    Ireland, Colin J; Gordon, Andrea L; Thompson, Sarah K; Watson, David I; Whiteman, David C; Reed, Richard L; Esterman, Adrian

    2018-01-01

    Esophageal adenocarcinoma is a disease with a high mortality rate, and its only known precursor is Barrett's esophagus (BE). While screening for BE is not cost-effective at the population level, targeted screening might be beneficial. We have developed a risk prediction model to identify people with BE, and here we present the external validation of this model. A cohort study was undertaken to validate a risk prediction model for BE. Individuals with endoscopy- and histopathology-proven BE completed a questionnaire containing variables previously identified as risk factors for this condition. Their responses were combined with data from a population sample for analysis. Risk scores were derived for each participant. Overall performance of the risk prediction model in terms of calibration and discrimination was assessed. Scores from 95 individuals with BE and 636 individuals from the general population were analyzed. The Brier score was 0.118, suggesting reasonable overall performance. The area under the receiver operating characteristic curve was 0.83 (95% CI 0.78-0.87). The Hosmer-Lemeshow statistic was p = 0.14. Minimizing false positives and false negatives, the model achieved a sensitivity of 74% and a specificity of 73%. This study has validated a risk prediction model for BE that has a higher sensitivity than previous models.
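
    The performance measures quoted above can be computed from predicted risks and observed outcomes as sketched below; the arrays are placeholders, not the study's cohort.

    ```python
    # Sketch: external-validation metrics (Brier score, AUC, and the
    # sensitivity/specificity at a Youden-index cut-point) on placeholder
    # predictions for a cohort of roughly 95 cases and 636 controls.
    import numpy as np
    from sklearn.metrics import brier_score_loss, roc_auc_score, roc_curve

    rng = np.random.default_rng(6)
    y = rng.uniform(size=731) < 0.13                 # outcome labels
    p = np.clip(0.13 + 0.25 * (y - 0.13)             # predicted risks
                + rng.normal(0, 0.08, 731), 0.01, 0.99)

    print(f"Brier score = {brier_score_loss(y, p):.3f}")
    print(f"AUC         = {roc_auc_score(y, p):.2f}")
    fpr, tpr, _ = roc_curve(y, p)
    best = np.argmax(tpr - fpr)                      # Youden index cut-point
    print(f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
    ```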

  13. First results of GERDA Phase II and consistency with background models

    Science.gov (United States)

    Agostini, M.; Allardt, M.; Bakalyarov, A. M.; Balata, M.; Barabanov, I.; Baudis, L.; Bauer, C.; Bellotti, E.; Belogurov, S.; Belyaev, S. T.; Benato, G.; Bettini, A.; Bezrukov, L.; Bode1, T.; Borowicz, D.; Brudanin, V.; Brugnera, R.; Caldwell, A.; Cattadori, C.; Chernogorov, A.; D'Andrea, V.; Demidova, E. V.; Di Marco, N.; Domula, A.; Doroshkevich, E.; Egorov, V.; Falkenstein, R.; Frodyma, N.; Gangapshev, A.; Garfagnini, A.; Gooch, C.; Grabmayr, P.; Gurentsov, V.; Gusev, K.; Hakenmüller, J.; Hegai, A.; Heisel, M.; Hemmer, S.; Hofmann, W.; Hult, M.; Inzhechik, L. V.; Janicskó Csáthy, J.; Jochum, J.; Junker, M.; Kazalov, V.; Kihm, T.; Kirpichnikov, I. V.; Kirsch, A.; Kish, A.; Klimenko, A.; Kneißl, R.; Knöpfle, K. T.; Kochetov, O.; Kornoukhov, V. N.; Kuzminov, V. V.; Laubenstein, M.; Lazzaro, A.; Lebedev, V. I.; Lehnert, B.; Liao, H. Y.; Lindner, M.; Lippi, I.; Lubashevskiy, A.; Lubsandorzhiev, B.; Lutter, G.; Macolino, C.; Majorovits, B.; Maneschg, W.; Medinaceli, E.; Miloradovic, M.; Mingazheva, R.; Misiaszek, M.; Moseev, P.; Nemchenok, I.; Palioselitis, D.; Panas, K.; Pandola, L.; Pelczar, K.; Pullia, A.; Riboldi, S.; Rumyantseva, N.; Sada, C.; Salamida, F.; Salathe, M.; Schmitt, C.; Schneider, B.; Schönert, S.; Schreiner, J.; Schulz, O.; Schütz, A.-K.; Schwingenheuer, B.; Selivanenko, O.; Shevzik, E.; Shirchenko, M.; Simgen, H.; Smolnikov, A.; Stanco, L.; Vanhoefer, L.; Vasenko, A. A.; Veresnikova, A.; von Sturm, K.; Wagner, V.; Wegmann, A.; Wester, T.; Wiesinger, C.; Wojcik, M.; Yanovich, E.; Zhitnikov, I.; Zhukov, S. V.; Zinatulina, D.; Zuber, K.; Zuzel, G.

    2017-01-01

    GERDA (GERmanium Detector Array) is an experiment for the search of neutrinoless double beta decay (0νββ) in 76Ge, located at the Laboratori Nazionali del Gran Sasso of INFN (Italy). GERDA operates bare high-purity germanium detectors submerged in liquid argon (LAr). Phase II of data taking started in December 2015 and is currently ongoing. In Phase II, 35 kg of germanium detectors enriched in 76Ge, including thirty newly produced Broad Energy Germanium (BEGe) detectors, are operated to reach an exposure of 100 kg·yr within about three years of data taking. The design goal of Phase II is to reduce the background by one order of magnitude, yielding a sensitivity of T_1/2^0ν = O(10^26) yr. To achieve the necessary background reduction, the setup was complemented with a LAr veto. Analysis of the Phase II background spectrum demonstrates consistency with the background models, and the 226Ra and 232Th contamination levels are consistent with screening results. In the first Phase II data release we found no hint of a 0νββ decay signal and place a limit on this process of T_1/2^0ν > 5.3 × 10^25 yr (90% C.L., sensitivity 4.0 × 10^25 yr). First results of GERDA Phase II are presented.
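
    The relation between exposure and half-life sensitivity follows from a standard counting argument: for a nearly background-free search, T_1/2^0ν > ln 2 · (N_A/W) · a · ε · E / N_up, where E is the exposure and N_up the upper limit on signal counts. A back-of-the-envelope sketch, with molar mass, abundance, and efficiency set to illustrative values rather than official GERDA numbers:

    ```python
    # Back-of-the-envelope 0νββ half-life sensitivity from exposure:
    #   T_1/2 > ln(2) * (N_A / W) * a * eps * E / N_up
    # All detector-specific numbers below are illustrative assumptions,
    # not official GERDA values.
    import math

    N_A = 6.022e23           # Avogadro's number [1/mol]
    W = 75.7                 # molar mass of enriched Ge [g/mol], assumed
    a = 0.87                 # 76Ge fraction in enriched material, assumed
    eps = 0.6                # total signal efficiency, assumed
    exposure_g_yr = 100e3    # 100 kg·yr design exposure, in g·yr
    n_up = 2.44              # 90% C.L. upper limit on counts, zero background

    t_half = math.log(2) * (N_A / W) * a * eps * exposure_g_yr / n_up
    print(f"T_1/2 sensitivity ~ {t_half:.1e} yr")   # of order 10^26 yr
    ```

    With these assumed inputs the result lands at roughly 10^26 yr, consistent with the Phase II design goal quoted above.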

  14. Inference of RNA polymerase II transcription dynamics from chromatin immunoprecipitation time course data.

    Directory of Open Access Journals (Sweden)

    Ciira wa Maina

    2014-05-01

    Gene transcription mediated by RNA polymerase II (pol-II) is a key step in gene expression. The dynamics of pol-II moving along the transcribed region influence the rate and timing of gene expression. In this work, we present a probabilistic model of transcription dynamics which is fitted to pol-II occupancy time course data measured using ChIP-Seq. The model can be used to estimate transcription speed and to infer the temporal pol-II activity profile at the gene promoter. Model parameters are estimated using either maximum likelihood estimation or via Bayesian inference using Markov chain Monte Carlo sampling. The Bayesian approach provides confidence intervals for parameter estimates and allows the use of priors that capture domain knowledge, e.g. the expected range of transcription speeds, based on previous experiments. The model describes the movement of pol-II down the gene body and can be used to identify the time of induction for transcriptionally engaged genes. By clustering the inferred promoter activity time profiles, we are able to determine which genes respond quickly to stimuli and group genes that share activity profiles and may therefore be co-regulated. We apply our methodology to biological data obtained using ChIP-seq to measure pol-II occupancy genome-wide when MCF-7 human breast cancer cells are treated with estradiol (E2). The transcription speeds we obtain agree with those obtained previously for smaller numbers of genes with the advantage that our approach can be applied genome-wide. We validate the biological significance of the pol-II promoter activity clusters by investigating cluster-specific transcription factor binding patterns and determining canonical pathway enrichment. We find that rapidly induced genes are enriched for both estrogen receptor alpha (ERα) and FOXA1 binding in their proximal promoter regions.
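
    As a much-simplified illustration of the idea (not the authors' probabilistic model): if pol-II is released at the promoter at time t0 and moves down the gene body at constant speed v, the occupancy peak at position x arrives at t0 + x/v, so a straight-line fit of peak arrival time against position recovers both the speed and the induction time. A toy sketch with synthetic data:

    ```python
    # Toy estimate of transcription speed from pol-II ChIP-seq time courses.
    # The pol-II wave released at t0 and moving at speed v peaks at position
    # x at time t = t0 + x / v; a line fit recovers v and t0. This is a
    # simplification of the paper's model, not its implementation.
    import numpy as np

    positions_bp = np.array([0, 10_000, 20_000, 40_000])  # probe positions
    true_v = 2_000.0                                       # bp per minute
    rng = np.random.default_rng(1)
    arrival_min = 5.0 + positions_bp / true_v + rng.normal(0, 1, 4)  # synthetic

    # Least squares fit: arrival = t0 + (1/v) * x
    slope, t0 = np.polyfit(positions_bp, arrival_min, 1)
    print(f"speed ~ {1/slope:.0f} bp/min, induction at ~{t0:.1f} min")
    ```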

  15. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
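
    A common functional form for a POD curve is a logistic function of concentration. A minimal sketch of fitting one to observed detection fractions, with illustrative data and a parameterization that may differ in detail from the article's POD model:

    ```python
    # Sketch: fit a logistic POD curve, POD(c) = 1 / (1 + exp(-(c - c50)/s)),
    # to detection fractions observed at several concentrations.
    # Illustrative data; not the harmonized model's exact parameterization.
    import numpy as np
    from scipy.optimize import curve_fit

    conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])    # analyte level (e.g. CFU/g)
    n_trials = np.array([12, 12, 12, 12, 12])     # replicates per level
    n_detected = np.array([1, 4, 7, 11, 12])      # positive results
    pod_obs = n_detected / n_trials

    def pod(c, c50, s):
        return 1.0 / (1.0 + np.exp(-(c - c50) / s))

    (c50, s), _ = curve_fit(pod, conc, pod_obs, p0=[1.0, 0.5])
    print(f"concentration at 50% detection ~ {c50:.2f}, slope scale ~ {s:.2f}")
    ```

    The fitted curve is the graphical representation of method response the abstract describes, and fitting it per laboratory supports the repeatability and reproducibility comparisons mentioned.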

  16. Validation of models that predict Cesarean section after induction of labor

    NARCIS (Netherlands)

    Verhoeven, C. J. M.; Oudenaarden, A.; Hermus, M. A. A.; Porath, M. M.; Oei, S. G.; Mol, B. W. J.

    2009-01-01

    Objective: Models for the prediction of Cesarean delivery after induction of labor can be used to improve clinical decision-making. The objective of this study was to validate two existing models, published by Peregrine et al. and Rane et al., for the prediction of Cesarean section after induction of

  17. How much certainty is enough? Validation of a nutrient retention model for prioritizing watershed conservation in North Carolina

    Science.gov (United States)

    Hamel, P.; Chaplin-Kramer, R.; Benner, R.

    2013-12-01

    Context: Quantifying ecosystem services, nature's benefits to people, is an area of active research in water resource management. Increasingly, water utilities and basin management authorities are interested in optimizing watershed-scale conservation strategies to mitigate the economic and environmental impacts of land-use and hydrological changes. While many models are available to represent hydrological processes in a spatially explicit way, large uncertainties remain associated with i) the biophysical outputs of these models (e.g., nutrient concentration at a given location), and ii) the service valuation method used to support specific decisions (e.g., targeting conservation areas based on their contribution to retaining nutrients). Better understanding these uncertainties and their impact on the decision process is critical for establishing the credibility of such models in a planning context. Methods: To address this issue in an emerging payments for watershed services program in the Cape Fear watershed, North Carolina, USA, we tested and validated the use of a nutrient retention model (InVEST) for targeting conservation activities. Specifically, we modeled water yield and nutrient transport throughout the watershed and valued the retention service provided by forested areas. Observed flow and water quality data at multiple locations allowed calibration of the model at the watershed level as well as the subwatershed level. By comparing the results from each model parameterization, we were able to assess the uncertainties related to both the model structure and parameter estimation. Finally, we assessed the use of the model for climate scenario simulation by characterizing its ability to represent inter-annual variability. Results and discussion: The spatial analyses showed that the two calibration approaches could yield distinct parameter sets, both for the water yield and the nutrient model. These results imply a difference in the absolute nutrient concentration
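
    Comparisons between such calibration approaches are typically quantified with goodness-of-fit statistics, such as RMSE and Nash-Sutcliffe efficiency, computed against the observed flow and water quality records. A generic sketch of that comparison with placeholder data (this is not InVEST code):

    ```python
    # Sketch: score two model parameterizations against observations with
    # RMSE and Nash-Sutcliffe efficiency (NSE), common goodness-of-fit
    # statistics in hydrologic calibration. Arrays are placeholders.
    import numpy as np

    def rmse(obs, sim):
        return float(np.sqrt(np.mean((obs - sim) ** 2)))

    def nse(obs, sim):
        # 1 = perfect; <= 0 means no better than the observed mean
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    obs = np.array([1.2, 0.8, 1.5, 2.1, 0.9])               # observed loads
    sim_a = np.array([1.0, 0.9, 1.4, 1.8, 1.1])             # watershed-level fit
    sim_b = np.array([1.3, 0.7, 1.6, 2.0, 0.8])             # subwatershed-level fit

    for name, sim in [("watershed", sim_a), ("subwatershed", sim_b)]:
        print(f"{name}: RMSE={rmse(obs, sim):.2f}, NSE={nse(obs, sim):.2f}")
    ```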

  18. The theoretical and computational models of the GASFLOW-II code

    International Nuclear Information System (INIS)

    Travis, J.R.

    1999-01-01

    GASFLOW-II is a finite-volume computer code that solves the time-dependent compressible Navier-Stokes equations for multiple gas species in a dispersed liquid-water two-phase medium. The fluid-dynamics algorithm is coupled to the chemical kinetics of combusting gases to simulate diffusion or propagating flames in the complex geometries of nuclear containments. GASFLOW-II is therefore able to predict gaseous distributions and the thermal and pressure loads on containment structures and safety-related equipment in the event that combustion occurs. Current developments of GASFLOW-II focus on hydrogen distribution, mitigation measures including carbon dioxide inerting, and possible combustion events in nuclear reactor containments. Fluid turbulence is calculated to enhance the transport and mixing of gases in rooms and volumes that may be connected by a ventilation system. Condensation, vaporization, and heat transfer to walls, floors, ceilings, and internal structures, as well as within the fluid, are calculated to model the appropriate mass and energy sinks. (author)
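
    As a loose illustration of the conservative species-transport step such a finite-volume solver performs, drastically simplified to one dimension with constant velocity and no turbulence, chemistry, or phase change:

    ```python
    # Minimal 1-D finite-volume upwind advection of a gas species mass
    # fraction -- a highly simplified illustration of conservative species
    # transport, not GASFLOW-II's actual 3-D two-phase scheme.
    import numpy as np

    nx, L = 100, 10.0
    dx = L / nx
    u = 1.0                      # constant velocity [m/s]
    dt = 0.5 * dx / u            # CFL-limited time step
    Y = np.zeros(nx)             # hydrogen mass fraction per cell
    Y[10:20] = 1.0               # initial slug of hydrogen

    for _ in range(100):
        flux = u * Y             # upwind face flux for u > 0
        Y[1:] -= dt / dx * (flux[1:] - flux[:-1])   # conservative update
        Y[0] = 0.0               # inflow boundary: no hydrogen entering

    print(f"integrated mass fraction: {Y.sum() * dx:.3f}")  # slug advected
    ```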

  19. Regulatory perspectives on model validation in high-level radioactive waste management programs: A joint NRC/SKI white paper

    Energy Technology Data Exchange (ETDEWEB)

    Wingefors, S.; Andersson, J.; Norrby, S. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden). Office of Nuclear Waste Safety; Eisenberg, N.A.; Lee, M.P.; Federline, M.V. [U.S. Nuclear Regulatory Commission, Washington, DC (United States). Office of Nuclear Material Safety and Safeguards; Sagar, B.; Wittmeyer, G.W. [Center for Nuclear Waste Regulatory Analyses, San Antonio, TX (United States)

    1999-03-01

    Validation (or confidence building) should be an important aspect of the regulatory uses of mathematical models in the safety assessments of geologic repositories for the disposal of spent nuclear fuel and other high-level radioactive wastes (HLW). A substantial body of literature exists indicating the manner in which scientific validation of models is usually pursued. Because models for a geologic repository performance assessment cannot be tested over the spatial scales of interest and the long time periods for which the models will make estimates of performance, the usual avenue for model validation, that is, comparison of model estimates with actual data at the space-time scales of interest, is precluded. Further complicating the model validation process in HLW programs are the uncertainties inherent in describing the geologic complexities of potential disposal sites, and their interactions with the engineered system, with a limited set of generally imprecise data, making it difficult to discriminate between model discrepancy and inadequacy of input data. A successful strategy for model validation, therefore, should attempt to recognize these difficulties, address their resolution, and document the resolution in a careful manner. The end result of validation efforts should be a documented enhancement of confidence in the model to an extent that the model's results can aid in regulatory decision-making. The level of validation needed should be determined by the intended uses of these models, rather than by the ideal of validation of a scientific theory. This white paper presents a model validation strategy that can be implemented in a regulatory environment. It was prepared jointly by staff members of the U.S. Nuclear Regulatory Commission and the Swedish Nuclear Power Inspectorate (SKI). This document should not be viewed as, and is not intended to be, formal guidance or a staff position on this matter. Rather, based on a review of the literature and previous

  20. Regulatory perspectives on model validation in high-level radioactive waste management programs: A joint NRC/SKI white paper

    International Nuclear Information System (INIS)

    Wingefors, S.; Andersson, J.; Norrby, S.

    1999-03-01

    Validation (or confidence building) should be an important aspect of the regulatory uses of mathematical models in the safety assessments of geologic repositories for the disposal of spent nuclear fuel and other high-level radioactive wastes (HLW). A substantial body of literature exists indicating the manner in which scientific validation of models is usually pursued. Because models for a geologic repository performance assessment cannot be tested over the spatial scales of interest and the long time periods for which the models will make estimates of performance, the usual avenue for model validation, that is, comparison of model estimates with actual data at the space-time scales of interest, is precluded. Further complicating the model validation process in HLW programs are the uncertainties inherent in describing the geologic complexities of potential disposal sites, and their interactions with the engineered system, with a limited set of generally imprecise data, making it difficult to discriminate between model discrepancy and inadequacy of input data. A successful strategy for model validation, therefore, should attempt to recognize these difficulties, address their resolution, and document the resolution in a careful manner. The end result of validation efforts should be a documented enhancement of confidence in the model to an extent that the model's results can aid in regulatory decision-making. The level of validation needed should be determined by the intended uses of these models, rather than by the ideal of validation of a scientific theory. This white paper presents a model validation strategy that can be implemented in a regulatory environment. It was prepared jointly by staff members of the U.S. Nuclear Regulatory Commission and the Swedish Nuclear Power Inspectorate (SKI). This document should not be viewed as, and is not intended to be, formal guidance or a staff position on this matter. Rather, based on a review of the literature and previous