WorldWideScience

Sample records for validated model developed

  1. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982 DEVELOPMENT OF A CONSERVATIVE MODEL VALIDATION APPROACH FOR RELIABLE ... obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the ... 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  2. Developing rural palliative care: validating a conceptual model.

    Science.gov (United States)

    Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary

    2011-01-01

    The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services according to four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to: lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in 7 rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model as the model was able to theoretically predict and explain the experiences of the 7 rural communities that participated in the study. New emerging themes from the data elaborated existing elements in the model and informed the requirement for minor revisions. The model was validated and slightly revised, as suggested by the data. The model was confirmed as being a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.

  3. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort toward ER process optimization is made by using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and often problematic, the simulation results from these models are inherently uncertain.

  4. Development and validation of a mass casualty conceptual model.

    Science.gov (United States)

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
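
    The consensus rules quoted in this record are mechanical enough to sketch in code. A minimal illustration, assuming 7-point Likert ratings and treating "agreement" as the share of ratings of 5 or higher — the abstract does not define the agreement cut-off, so that threshold, like the function names, is an assumption:

    ```python
    def interquartile_range(ratings):
        """IQR of 7-point Likert ratings (simple quartile positions)."""
        s = sorted(ratings)
        n = len(s)
        q1 = s[n // 4]
        q3 = s[(3 * n) // 4]
        return q3 - q1

    def consensus_reached(ratings, prev_agree, agree_threshold=5):
        """Apply the study's three criteria to one indicator item:
        IQR of no more than 1 scale point, response-distribution shift
        below 15% between rounds, and agreement of 70% or greater."""
        agree = sum(1 for r in ratings if r >= agree_threshold) / len(ratings)
        stable = abs(agree - prev_agree) < 0.15   # <15% change between rounds
        return interquartile_range(ratings) <= 1 and stable and agree >= 0.70
    ```

    Items failing any criterion would be carried into a further Delphi round, which matches the abstract's report that two rounds sufficed.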

  5. Monte Carlo Modelling of Mammograms: Development and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Spyrou, G; Panayiotakis, G [University of Patras, School of Medicine, Medical Physics Department, 265 00 Patras (Greece); Bakas, A [Technological Educational Institution of Athens, Department of Radiography, 122 10 Athens (Greece); Tzanakos, G [University of Athens, Department of Physics, Division of Nuclear and Particle Physics, 157 71 Athens (Greece)

    1999-12-31

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors) 16 refs, 4 figs
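
    As a toy illustration of the Monte Carlo machinery such a simulator rests on (not the actual package, whose geometry and physics are far richer), one can sample photon free paths from the exponential attenuation law and count un-collided transmission through a homogeneous slab; the attenuation coefficient and thickness used below are placeholder values:

    ```python
    import math
    import random

    def transmitted_fraction(mu_cm, thickness_cm, n_photons=100_000, seed=1):
        """Fraction of photons crossing a homogeneous slab without interacting.

        Free paths are drawn from p(x) = mu * exp(-mu * x) by inverse-CDF
        sampling; a photon is transmitted un-collided if its sampled path
        exceeds the slab thickness.
        """
        rng = random.Random(seed)
        crossed = 0
        for _ in range(n_photons):
            # 1 - random() lies in (0, 1], avoiding log(0)
            path = -math.log(1.0 - rng.random()) / mu_cm
            if path > thickness_cm:
                crossed += 1
        return crossed / n_photons
    ```

    For mu = 0.5 cm⁻¹ and a 4 cm slab the estimate converges on exp(-2) ≈ 0.135, the Beer–Lambert prediction — exactly the kind of analytical benchmark used when demonstrating the validity of such simulations.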

  6. Monte Carlo Modelling of Mammograms: Development and Validation

    International Nuclear Information System (INIS)

    Spyrou, G.; Panayiotakis, G.; Bakas, A.; Tzanakos, G.

    1998-01-01

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors)

  7. Developing a model for validation and prediction of bank customer ...

    African Journals Online (AJOL)

    Credit risk is the most important risk faced by banks. The bank's main approaches to reducing credit risk are correct validation using the final status and the validation model parameters. The high level of bank reserves, and the lost or outstanding facilities of banks, indicate the lack of appropriate validation models in the banking network.

  8. Condensation of steam in horizontal pipes: model development and validation

    International Nuclear Information System (INIS)

    Szijarto, R.

    2015-01-01

    This thesis, submitted to the Swiss Federal Institute of Technology ETH in Zurich, presents the development and validation of a model for the condensation of steam in horizontal pipes. Condensation models were introduced and developed particularly for application in the emergency cooling system of a Gen-III+ boiling water reactor. Such an emergency cooling system consists of slightly inclined horizontal pipes, which are immersed in a cold water tank. The pipes are connected to the reactor pressure vessel. They are responsible for a fast depressurization of the reactor core in the event of an accident. Condensation in horizontal pipes was investigated with both a one-dimensional system code (RELAP5) and three-dimensional computational fluid dynamics software (ANSYS FLUENT). The performance of the RELAP5 code was not sufficient for transient condensation processes. Therefore, a mechanistic model was developed and implemented. Four models were tested on the LAOKOON facility, which analysed direct contact condensation in a horizontal duct.

  9. Development and Validation of a 3-Dimensional CFB Furnace Model

    Science.gov (United States)

    Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

    At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory- and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out in parallel with the development of three-dimensional process modeling, providing a chain of knowledge that is fed back into phenomenon research. Knowledge gathered in model validation studies, together with up-to-date parameter databases, is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to modeling the combustion and the formation of char and volatiles for various fuel types under CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and biofuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window into fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace; they are used, together with lateral temperature profiles at the bed and in the upper parts of the furnace, to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed by the lower-furnace design and by the mixing characteristics of fuel and combustion air, which affect the formation of the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents

  10. High Turbidity Solis Clear Sky Model: Development and Validation

    Directory of Open Access Journals (Sweden)

    Pierre Ineichen

    2018-03-01

    The Solis clear sky model is a spectral scheme based on radiative transfer calculations and the Lambert–Beer relation. Its broadband version is a simplified, fast analytical version; it is limited to broadband aerosol optical depths lower than 0.45, which is a weakness when applied in countries with very high turbidity such as China or India. In order to extend the use of the original simplified version of the model to high turbidity values, we developed a new version of the broadband Solis model based on radiative transfer calculations, valid for turbidity values up to 7, for the three components (global, beam, and diffuse) and for the four aerosol types defined by Shettle and Fenn. A validation against low-turbidity data acquired in Geneva shows slightly better results than the previous version. On data acquired at sites with higher turbidity, the bias stays within ±4% for the beam and global irradiances, and the standard deviation stays around 5% for clean, stable conditions and around 12% for questionable data and variable sky conditions.
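
    The Lambert–Beer backbone of such broadband clear-sky schemes can be written down compactly. In the sketch below, the fitted parameters (an enhanced extraterrestrial irradiance I0', a total optical depth tau, and a fitting exponent b) would, in the actual model, come from the radiative transfer calculations; the numbers used here are placeholders, not Solis coefficients:

    ```python
    import math

    def beam_irradiance(i0_prime, tau, b, elevation_deg):
        """Modified Lambert-Beer relation: B = I0' * exp(-tau / sin(h)**b),
        where h is the solar elevation angle. With b = 1 this reduces to
        the classical Beer-Lambert attenuation along the slant path."""
        sin_h = math.sin(math.radians(elevation_deg))
        return i0_prime * math.exp(-tau / sin_h ** b)
    ```

    At high turbidity the fitted tau and b grow, which is the regime the extended model targets; attenuation also strengthens rapidly as the sun drops toward the horizon, since the slant path 1/sin(h) lengthens.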

  11. Non-Linear Slosh Damping Model Development and Validation

    Science.gov (United States)

    Yang, H. Q.; West, Jeff

    2015-01-01

    Propellant tank slosh dynamics are typically represented by a mechanical spring-mass-damper model. This mechanical model is then included in the equation of motion of the entire vehicle for Guidance, Navigation and Control (GN&C) analysis. For a partially filled smooth-wall propellant tank, the critical damping based on classical empirical correlation is as low as 0.05%. Due to this low value of damping, propellant slosh is a potential source of disturbance critical to the stability of launch and space vehicles. It is postulated that the commonly quoted slosh damping is valid only in the linear regime, where the slosh amplitude is small. With increasing slosh amplitude, the damping value should also increase. If this nonlinearity can be verified and validated, the slosh stability margin can be significantly improved, and the level of conservatism maintained in the GN&C analysis can be lessened. The purpose of this study is to explore and quantify the dependence of slosh damping on slosh amplitude. Accurately predicting the extremely low damping value of a smooth-wall tank is very challenging for any Computational Fluid Dynamics (CFD) tool: one must resolve the thin boundary layers near the wall and limit numerical damping to a minimum. This computational study demonstrates that, with proper grid resolution, CFD can indeed accurately predict the low damping from smooth walls in the linear regime. Comparisons of extracted damping values with experimental data for different tank sizes show very good agreement. Numerical simulations confirm that slosh damping is indeed a function of slosh amplitude. When the slosh amplitude is low, the damping ratio is essentially constant, which is consistent with the empirical correlation. Once the amplitude reaches a critical value, the damping ratio becomes a linearly increasing function of the slosh amplitude. A follow-on experiment validated the developed nonlinear damping relationship. 
This discovery can
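
    The amplitude dependence this record describes lends itself to a simple piecewise fit: constant damping in the linear regime, linear growth beyond a critical amplitude. A hedged sketch — the baseline damping ratio, critical amplitude, and slope below are illustrative placeholders, not the study's fitted numbers:

    ```python
    def damping_ratio(amplitude, zeta0=0.0005, a_crit=0.1, slope=0.004):
        """Piecewise slosh damping ratio vs. slosh amplitude: constant in
        the small-amplitude (linear) regime, then increasing linearly once
        the amplitude exceeds a critical value. All parameter values are
        placeholders chosen only to make the shape of the model concrete."""
        if amplitude <= a_crit:
            return zeta0
        return zeta0 + slope * (amplitude - a_crit)
    ```

    Because the function is continuous at the critical amplitude, a GN&C analysis using it would recover the conservative constant value for small slosh and claim extra margin only where the amplitude is genuinely large.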

  12. Utilizing Chamber Data for Developing and Validating Climate Change Models

    Science.gov (United States)

    Monje, Oscar

    2012-01-01

    Controlled environment chambers (e.g. growth chambers, SPAR chambers, or open-top chambers) are useful for measuring plant ecosystem responses to climatic variables and CO2 that affect plant water relations. However, data from chambers were found to overestimate the responses of C fluxes to CO2 enrichment. Chamber data may be confounded by numerous artifacts (e.g. sidelighting, edge effects, increased temperature and VPD), and this limits what can be measured accurately. Chambers can be used to measure canopy-level energy balance under controlled conditions, and plant transpiration responses to CO2 concentration can be elucidated. However, these measurements cannot be used directly in model development or validation. The response of stomatal conductance to CO2 will be the same as in the field, but the measured response must be recalculated to account for differences in aerodynamic conductance, temperature and VPD between the chamber and the field.

  13. Developing and Validating a Predictive Model for Stroke Progression

    Directory of Open Access Journals (Sweden)

    L.E. Craig

    2011-12-01

    Background: Progression is believed to be a common and important complication in acute stroke, and has been associated with increased mortality and morbidity. Reliable identification of predictors of early neurological deterioration could potentially benefit routine clinical care. The aim of this study was to identify predictors of early stroke progression using two independent patient cohorts. Methods: Two patient cohorts were used for this study – the first cohort formed the training data set, which included consecutive patients admitted to an urban teaching hospital between 2000 and 2002, and the second cohort formed the test data set, which included patients admitted to the same hospital between 2003 and 2004. A standard definition of stroke progression was used. The first cohort (n = 863) was used to develop the model. Variables that were statistically significant (p < 0.1) were entered into the model and removed in turn if they became non-significant (p > 0.1). The second cohort (n = 216) was used to test the performance of the model. The performance of the predictive model was assessed in terms of both calibration and discrimination. Multiple imputation methods were used for dealing with the missing values. Results: Variables shown to be significant predictors of stroke progression were conscious level, history of coronary heart disease, presence of hyperosmolarity, CT lesion, living alone on admission, Oxfordshire Community Stroke Project classification, presence of pyrexia and smoking status. The model appears to have reasonable discriminative properties [the median receiver-operating characteristic curve value was 0.72 (range 0.72–0.73)] and to fit well with the observed data, which is indicated by the high goodness-of-fit p value [the median p value from the Hosmer-Lemeshow test was 0.90 (range 0.50–0.92)]. Conclusion: The predictive model developed in this study contains variables that can be easily collected in practice, therefore increasing its usability in clinical practice. Using this analysis approach, the

  14. Developing and validating a predictive model for stroke progression.

    Science.gov (United States)

    Craig, L E; Wu, O; Gilmour, H; Barber, M; Langhorne, P

    2011-01-01

    Progression is believed to be a common and important complication in acute stroke, and has been associated with increased mortality and morbidity. Reliable identification of predictors of early neurological deterioration could potentially benefit routine clinical care. The aim of this study was to identify predictors of early stroke progression using two independent patient cohorts. Two patient cohorts were used for this study - the first cohort formed the training data set, which included consecutive patients admitted to an urban teaching hospital between 2000 and 2002, and the second cohort formed the test data set, which included patients admitted to the same hospital between 2003 and 2004. A standard definition of stroke progression was used. The first cohort (n = 863) was used to develop the model. Variables that were statistically significant (p < 0.1) were entered into the model and removed in turn if they became non-significant (p > 0.1). The second cohort (n = 216) was used to test the performance of the model. The performance of the predictive model was assessed in terms of both calibration and discrimination. Multiple imputation methods were used for dealing with the missing values. Variables shown to be significant predictors of stroke progression were conscious level, history of coronary heart disease, presence of hyperosmolarity, CT lesion, living alone on admission, Oxfordshire Community Stroke Project classification, presence of pyrexia and smoking status. The model appears to have reasonable discriminative properties [the median receiver-operating characteristic curve value was 0.72 (range 0.72-0.73)] and to fit well with the observed data, which is indicated by the high goodness-of-fit p value [the median p value from the Hosmer-Lemeshow test was 0.90 (range 0.50-0.92)]. The predictive model developed in this study contains variables that can be easily collected in practice, therefore increasing its usability in clinical practice. 
Using this analysis approach, the discrimination and calibration of the predictive model appear
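
    The discrimination figure quoted in these records (a receiver-operating characteristic curve value of 0.72) is the c-statistic, which can be computed directly from predicted risks and observed outcomes. A minimal, self-contained sketch with made-up data, not the study's:

    ```python
    def c_statistic(risks, outcomes):
        """Area under the ROC curve, computed as the probability that a
        randomly chosen progressing patient (outcome 1) is assigned a
        higher predicted risk than a randomly chosen non-progressing
        patient (outcome 0); ties count as one half."""
        pos = [r for r, y in zip(risks, outcomes) if y == 1]
        neg = [r for r, y in zip(risks, outcomes) if y == 0]
        wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
                   for p in pos for q in neg)
        return wins / (len(pos) * len(neg))
    ```

    A value of 0.5 means the model discriminates no better than chance and 1.0 means perfect ranking, so the reported 0.72 sits in the "reasonable" range the authors describe; calibration (the Hosmer-Lemeshow test they also report) is a separate check that predicted risks match observed event rates.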

  15. Developing and Validating a Predictive Model for Stroke Progression

    Science.gov (United States)

    Craig, L.E.; Wu, O.; Gilmour, H.; Barber, M.; Langhorne, P.

    2011-01-01

    Background Progression is believed to be a common and important complication in acute stroke, and has been associated with increased mortality and morbidity. Reliable identification of predictors of early neurological deterioration could potentially benefit routine clinical care. The aim of this study was to identify predictors of early stroke progression using two independent patient cohorts. Methods Two patient cohorts were used for this study – the first cohort formed the training data set, which included consecutive patients admitted to an urban teaching hospital between 2000 and 2002, and the second cohort formed the test data set, which included patients admitted to the same hospital between 2003 and 2004. A standard definition of stroke progression was used. The first cohort (n = 863) was used to develop the model. Variables that were statistically significant (p < 0.1) were entered into the model and removed in turn if they became non-significant (p > 0.1). The second cohort (n = 216) was used to test the performance of the model. The performance of the predictive model was assessed in terms of both calibration and discrimination. Multiple imputation methods were used for dealing with the missing values. Results Variables shown to be significant predictors of stroke progression were conscious level, history of coronary heart disease, presence of hyperosmolarity, CT lesion, living alone on admission, Oxfordshire Community Stroke Project classification, presence of pyrexia and smoking status. The model appears to have reasonable discriminative properties [the median receiver-operating characteristic curve value was 0.72 (range 0.72–0.73)] and to fit well with the observed data, which is indicated by the high goodness-of-fit p value [the median p value from the Hosmer-Lemeshow test was 0.90 (range 0.50–0.92)]. Conclusion The predictive model developed in this study contains variables that can be easily collected in practice, therefore increasing its usability in clinical practice. Using this analysis approach, the discrimination and

  16. Development and validation of models for bubble coalescence and breakup

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Yixiang

    2013-10-08

    A generalized model for bubble coalescence and breakup has been developed, based on a comprehensive survey of existing theories and models. One important feature of the model is that all important mechanisms leading to bubble coalescence and breakup in a turbulent gas-liquid flow are considered. The new model is tested extensively in a 1D Test Solver and in the 3D CFD code ANSYS CFX for the case of vertical gas-liquid pipe flow under adiabatic conditions. Two kinds of extensions of the standard multi-fluid model, i.e. the discrete population model and the inhomogeneous MUSIG (multiple-size group) model, are available in the two solvers, respectively. These extensions, with suitable closure models such as those for coalescence and breakup, are able to predict the evolution of the bubble size distribution in dispersed flows and to overcome the mono-dispersed flow limitation of the standard multi-fluid model. For the validation of the model, the high-quality database of the TOPFLOW L12 experiments for air-water flow in a vertical pipe was employed. A wide range of test points, covering bubbly flow, turbulent-churn flow and the transition regime, is involved in the simulations. The comparison between simulated results such as bubble size distribution, gas velocity and volume fraction and the measured ones indicates generally good agreement for all selected test points. As the superficial gas velocity increases, the bubble size distribution evolves through coalescence-dominant regimes first, then breakup-dominant regimes, and finally turns into a bimodal distribution. The tendency of this evolution is well reproduced by the model. However, the tendency is almost always overestimated, i.e. there is too much coalescence in the coalescence-dominant cases and too much breakup in the breakup-dominant ones. The reason for this problem is discussed by studying the contribution of each coalescence and breakup mechanism at different test points. 
The redistribution of the

  17. Development and validation of models for bubble coalescence and breakup

    International Nuclear Information System (INIS)

    Liao, Yixiang

    2013-01-01

    A generalized model for bubble coalescence and breakup has been developed, based on a comprehensive survey of existing theories and models. One important feature of the model is that all important mechanisms leading to bubble coalescence and breakup in a turbulent gas-liquid flow are considered. The new model is tested extensively in a 1D Test Solver and in the 3D CFD code ANSYS CFX for the case of vertical gas-liquid pipe flow under adiabatic conditions. Two kinds of extensions of the standard multi-fluid model, i.e. the discrete population model and the inhomogeneous MUSIG (multiple-size group) model, are available in the two solvers, respectively. These extensions, with suitable closure models such as those for coalescence and breakup, are able to predict the evolution of the bubble size distribution in dispersed flows and to overcome the mono-dispersed flow limitation of the standard multi-fluid model. For the validation of the model, the high-quality database of the TOPFLOW L12 experiments for air-water flow in a vertical pipe was employed. A wide range of test points, covering bubbly flow, turbulent-churn flow and the transition regime, is involved in the simulations. The comparison between simulated results such as bubble size distribution, gas velocity and volume fraction and the measured ones indicates generally good agreement for all selected test points. As the superficial gas velocity increases, the bubble size distribution evolves through coalescence-dominant regimes first, then breakup-dominant regimes, and finally turns into a bimodal distribution. The tendency of this evolution is well reproduced by the model. However, the tendency is almost always overestimated, i.e. there is too much coalescence in the coalescence-dominant cases and too much breakup in the breakup-dominant ones. The reason for this problem is discussed by studying the contribution of each coalescence and breakup mechanism at different test points. 
The redistribution of the

  18. NAIRAS aircraft radiation model development, dose climatology, and initial validation

    Science.gov (United States)

    Mertens, Christopher J.; Meier, Matthias M.; Brown, Steven; Norman, Ryan B.; Xu, Xiaojing

    2013-10-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low. Nevertheless, analysis suggests

  19. NAIRAS aircraft radiation model development, dose climatology, and initial validation.

    Science.gov (United States)

    Mertens, Christopher J; Meier, Matthias M; Brown, Steven; Norman, Ryan B; Xu, Xiaojing

    2013-10-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low. Nevertheless, analysis

  20. Research program to develop and validate conceptual models for flow and transport through unsaturated, fractured rock

    International Nuclear Information System (INIS)

    Glass, R.J.; Tidwell, V.C.

    1991-09-01

    As part of the Yucca Mountain Project, our research program to develop and validate conceptual models for flow and transport through unsaturated fractured rock integrates fundamental physical experimentation with conceptual model formulation and mathematical modeling. Our research is directed toward developing and validating macroscopic, continuum-based models and supporting effective property models because of their widespread utility within the context of this project. Success relative to the development and validation of effective property models is predicated on a firm understanding of the basic physics governing flow through fractured media, specifically in the areas of unsaturated flow and transport in a single fracture and fracture-matrix interaction.

  1. Research program to develop and validate conceptual models for flow and transport through unsaturated, fractured rock

    International Nuclear Information System (INIS)

    Glass, R.J.; Tidwell, V.C.

    1991-01-01

    As part of the Yucca Mountain Project, our research program to develop and validate conceptual models for flow and transport through unsaturated fractured rock integrates fundamental physical experimentation with conceptual model formulation and mathematical modeling. Our research is directed toward developing and validating macroscopic, continuum-based models and supporting effective property models because of their widespread utility within the context of this project. Success relative to the development and validation of effective property models is predicated on a firm understanding of the basic physics governing flow through fractured media, specifically in the areas of unsaturated flow and transport in a single fracture and fracture-matrix interaction. 43 refs

  3. Radiation Background and Attenuation Model Validation and Development

    Energy Technology Data Exchange (ETDEWEB)

    Peplow, Douglas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Santiago, Claudio P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-05

    This report describes the initial results of a study being conducted as part of the Urban Search Planning Tool project. The study is comparing the Urban Scene Simulator (USS), a one-dimensional (1D) radiation transport model developed at LLNL, with the three-dimensional (3D) radiation transport model from ORNL using the MCNP, SCALE/ORIGEN and SCALE/MAVRIC simulation codes. In this study, we have analyzed the differences between the two approaches at every step, from source term representation, to estimating flux and detector count rates at a fixed distance from a simple surface (slab), and at points throughout more complex 3D scenes.

  4. Validation of HEDR models

    International Nuclear Information System (INIS)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not, by themselves, establish that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.

  5. Cost prediction following traumatic brain injury: model development and validation.

    Science.gov (United States)

    Spitz, Gershon; McKenzie, Dean; Attwood, David; Ponsford, Jennie L

    2016-02-01

    The ability to predict costs following a traumatic brain injury (TBI) would assist in planning treatment and support services by healthcare providers, insurers and other agencies. The objective of the current study was to develop predictive models of hospital, medical, paramedical, and long-term care (LTC) costs for the first 10 years following a TBI. The sample comprised 798 participants with TBI, the majority of whom were male and aged between 15 and 34 at time of injury. Costing information was obtained for hospital, medical, paramedical, and LTC costs up to 10 years postinjury. Demographic and injury-severity variables were collected at the time of admission to the rehabilitation hospital. Duration of posttraumatic amnesia (PTA) was the most important single predictor for each cost type. The final models predicted 44% of hospital costs, 26% of medical costs, 23% of paramedical costs, and 34% of LTC costs. Greater costs were incurred, depending on cost type, for individuals with longer PTA duration, sustaining a limb or chest injury, a lower Glasgow Coma Scale (GCS) score, older age at injury, not being married or in a de facto relationship prior to injury, living in metropolitan areas, and those reporting premorbid excessive or problem alcohol use. This study has provided a comprehensive analysis of factors predicting various types of costs following TBI, with the combination of injury-related and demographic variables predicting 23-44% of costs. PTA duration was the strongest predictor across all cost categories. These factors may be used for the planning and case management of individuals following TBI. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  6. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    Science.gov (United States)

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid
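    The MDPE and MDAPE metrics used to test the models above are medians of the (absolute) percentage prediction errors. A minimal Python sketch, with invented serum levels rather than the study's data:

```python
import statistics

def mdpe(predicted, observed):
    """Median Prediction Error (%): the signed bias of the predictions."""
    return statistics.median(100.0 * (p - o) / o for p, o in zip(predicted, observed))

def mdape(predicted, observed):
    """Median Absolute Prediction Error (%): typical error magnitude."""
    return statistics.median(100.0 * abs(p - o) / o for p, o in zip(predicted, observed))

# illustrative serum levels (mg/L), not patient data
obs  = [2.0, 4.0, 8.0, 1.0]
pred = [1.9, 4.2, 8.4, 1.1]
print(round(mdpe(pred, obs), 2))   # -> 5.0
print(round(mdape(pred, obs), 2))  # -> 5.0
```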

  7. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    Directory of Open Access Journals (Sweden)

    Anna Gomes

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to

  8. Development and Validation of a Predictive Model for Functional Outcome After Stroke Rehabilitation: The Maugeri Model.

    Science.gov (United States)

    Scrutinio, Domenico; Lanzillo, Bernardo; Guida, Pietro; Mastropasqua, Filippo; Monitillo, Vincenzo; Pusineri, Monica; Formica, Roberto; Russo, Giovanna; Guarnaschelli, Caterina; Ferretti, Chiara; Calabrese, Gianluigi

    2017-12-01

    Prediction of outcome after stroke rehabilitation may help clinicians in decision-making and planning rehabilitation care. We developed and validated a predictive tool to estimate the probability of achieving improvement in physical functioning (model 1) and a level of independence requiring no more than supervision (model 2) after stroke rehabilitation. The models were derived from 717 patients admitted for stroke rehabilitation. We used multivariable logistic regression analysis to build each model. Then, each model was prospectively validated in 875 patients. Model 1 included age, time from stroke occurrence to rehabilitation admission, admission motor and cognitive Functional Independence Measure scores, and neglect. Model 2 included age, male gender, time since stroke onset, and admission motor and cognitive Functional Independence Measure score. Both models demonstrated excellent discrimination. In the derivation cohort, the area under the curve was 0.883 (95% confidence interval, 0.858-0.910) for model 1 and 0.913 (95% confidence interval, 0.884-0.942) for model 2. The Hosmer-Lemeshow χ2 was 4.12 (P=0.249) and 1.20 (P=0.754), respectively. In the validation cohort, the area under the curve was 0.866 (95% confidence interval, 0.840-0.892) for model 1 and 0.850 (95% confidence interval, 0.815-0.885) for model 2. The Hosmer-Lemeshow χ2 was 8.86 (P=0.115) and 34.50 (P=0.001), respectively. Both improvement in physical functioning (hazard ratio, 0.43; 0.25-0.71; P=0.001) and a level of independence requiring no more than supervision (hazard ratio, 0.32; 0.14-0.68; P=0.004) were independently associated with improved 4-year survival. A calculator is freely available for download at https://goo.gl/fEAp81. This study provides researchers and clinicians with an easy-to-use, accurate, and validated predictive tool for potential application in rehabilitation research and stroke management. © 2017 American Heart Association, Inc.
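    The discrimination statistic reported for both models, the area under the ROC curve, can be computed directly from predicted probabilities via the rank-sum (Mann-Whitney) formulation. A small Python sketch with toy data, not the study's patients:

```python
def auc(scores, labels):
    """Area under the ROC curve: the fraction of positive/negative
    pairs in which the positive case receives the higher score."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy predicted probabilities of achieving the outcome
probs  = [0.9, 0.8, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0]
print(round(auc(probs, labels), 3))  # 5 of 6 pairs ranked correctly
```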

  9. Development and validation of a tokamak skin effect transformer model

    International Nuclear Information System (INIS)

    Romero, J.A.; Moret, J.-M.; Coda, S.; Felici, F.; Garrido, I.

    2012-01-01

    A lumped parameter, state space model for a tokamak transformer including the slow flux penetration in the plasma (skin effect transformer model) is presented. The model does not require detailed or explicit information about plasma profiles or geometry. Instead, this information is lumped in system variables, parameters and inputs. The model has an exact mathematical structure built from energy and flux conservation theorems, predicting the evolution and non-linear interaction of plasma current and internal inductance as functions of the primary coil currents, plasma resistance, non-inductive current drive and the loop voltage at a specific location inside the plasma (equilibrium loop voltage). Loop voltage profile in the plasma is substituted by a three-point discretization, and ordinary differential equations are used to predict the equilibrium loop voltage as a function of the boundary and resistive loop voltages. This provides a model for equilibrium loop voltage evolution, which is reminiscent of the skin effect. The order and parameters of this differential equation are determined empirically using system identification techniques. Fast plasma current modulation experiments with random binary signals have been conducted in the TCV tokamak to generate the required data for the analysis. Plasma current was modulated under ohmic conditions between 200 and 300 kA with 30 ms rise time, several times faster than its time constant L/R ≈ 200 ms. A second-order linear differential equation for equilibrium loop voltage is sufficient to describe the plasma current and internal inductance modulation with 70% and 38% fit parameters, respectively. The model explains the most salient features of the plasma current transients, such as the inverse correlation between plasma current ramp rates and internal inductance changes, without requiring detailed or explicit information about resistivity profiles. This proves that a lumped parameter modelling approach can be used to

  10. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and validity analysis of the simulation model is still lacking. The developed validation paradigm has a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, even though they are ultimately intended to represent real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data-handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, then the simulation is deemed "not invalid". If the simulation model fails to meet the criteria, the model is deemed invalid, and model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model and defined the limits of application. The tested simulation model is found to be acceptable but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis, but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time- and cost-efficient simulation projects with
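    The pass/fail logic of declaring a simulation "not invalid" against pre-declared criteria can be sketched as follows; the signal, tolerance, and values are illustrative, not taken from the thesis:

```python
def validate_maneuver(sim, test, rel_tol=0.10):
    """Deem a simulation 'not invalid' for one test maneuver if every
    sample stays within a declared relative tolerance of measurement."""
    errors = [abs(s - t) / max(abs(t), 1e-9) for s, t in zip(sim, test)]
    return max(errors) <= rel_tol, max(errors)

sim  = [0.10, 0.21, 0.30]   # e.g. simulated lateral acceleration (g)
test = [0.10, 0.20, 0.31]   # measured values for the same maneuver
ok, worst = validate_maneuver(sim, test)
print(ok)  # True: worst-case error ~5%, within the declared 10%
```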

  11. Development and validation of an HIV risk scorecard model

    OpenAIRE

    Wilbert Sibanda; Philip Pretorius

    2013-01-01

    This research paper covers the development of an HIV risk scorecard using SAS Enterprise Miner™. The HIV risk scorecard was developed using the 2007 South African annual antenatal HIV and syphilis seroprevalence data. Antenatal data contain various demographic characteristics for each pregnant woman, such as the pregnant woman's age, male sexual partner's age, race, level of education, gravidity, parity, and HIV and syphilis status. The purpose of this research was to use a scorecard to rank the ef...

  12. Model-based wear measurements in total knee arthroplasty : development and validation of novel radiographic techniques

    NARCIS (Netherlands)

    IJsseldijk, van E.A.

    2016-01-01

    The primary aim of this work was to develop novel model-based minimum joint space width (mJSW) measurement methods using a 3D reconstruction and to compare the accuracy and precision of these methods to conventional mJSW measurement. This thesis contributed to the development, validation and clinical application of model-based

  13. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  14. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  15. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model

  16. Development of Learning Models Based on Problem Solving and Meaningful Learning Standards by Expert Validity for Animal Development Course

    Science.gov (United States)

    Lufri, L.; Fitri, R.; Yogica, R.

    2018-04-01

    The purpose of this study was to produce a learning model based on problem-solving and meaningful-learning standards, validated by expert assessment, for the Animal Development course. This is development research whose product is a learning model consisting of two sub-products: the syntax of the learning model and student worksheets. All of these products were standardized through expert validation. The research data are the validity levels of all sub-products, obtained using questionnaires completed by validators from various fields of expertise (subject matter, learning strategy, and language). Data were analysed using descriptive statistics. The results show that the problem-solving and meaningful-learning model has been produced, and the sub-products declared appropriate by the experts include the syntax of the learning model and the student worksheet.

  17. Development and validation of a viscoelastic and nonlinear liver model for needle insertion

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, Yo [Waseda University, Consolidated Research Institute for Advanced Science and Medical Care, Shinjuku, Tokyo (Japan); Onishi, Akinori; Hoshi, Takeharu; Kawamura, Kazuya [Waseda University, Graduate School of Science and Engineering, Shinjuku (Japan); Hashizume, Makoto [Kyushu University Hospital, Center for the Integration of Advanced Medicine and Innovative Technology, Fukuoka (Japan); Fujie, Masakatsu G. [Waseda University, Graduate School of Science and Engineering, Faculty of Science and Engineering, Shinjuku (Japan)

    2009-01-15

    The objective of our work is to develop and validate a viscoelastic and nonlinear physical liver model for organ model-based needle insertion, in which the deformation of an organ is estimated and predicted, and the needle path is determined with organ deformation taken into consideration. First, an overview is given of the development of the physical liver model. The material properties of the liver considering viscoelasticity and nonlinearity are modeled based on the measured data collected from a pig's liver. The method to develop the liver model using FEM is also shown. Second, the experimental method to validate the model is explained. Both in vitro and in vivo experiments that made use of a pig's liver were conducted for comparison with the simulation using the model. Results of the in vitro experiment showed that the model reproduces nonlinear and viscoelastic response of displacement at an internally located point with high accuracy. For a force up to 0.45 N, the maximum error is below 1 mm. Results of the in vivo experiment showed that the model reproduces the nonlinear increase of load upon the needle during insertion. Based on these results, the liver model developed and validated in this work reproduces the physical response of a liver in both in vitro and in vivo situations. (orig.)
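    Viscoelastic organ models of this kind are built from spring-damper elements. As a minimal illustration, a Kelvin-Voigt element (a spring in parallel with a damper); the parameter values below are invented, not the measured pig-liver properties:

```python
def kelvin_voigt_force(x, v, k=120.0, c=4.0):
    """Force of a Kelvin-Voigt element: spring (k, N/m) in parallel
    with a damper (c, N*s/m); x is displacement (m), v velocity (m/s)."""
    return k * x + c * v

# a 5 mm displacement held still: the damper contributes nothing at rest
print(round(kelvin_voigt_force(0.005, 0.0), 3))  # -> 0.6 (spring term only)
```

Real liver models additionally make k and c nonlinear functions of strain, which is what the FEM model in the record captures.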

  18. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  19. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    Science.gov (United States)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2015-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and are also compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations to be completed quickly compared with more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and will in the future support NASA mission planning for Stirling-based power systems.

  20. Individualized prediction of perineural invasion in colorectal cancer: development and validation of a radiomics prediction model.

    Science.gov (United States)

    Huang, Yanqi; He, Lan; Dong, Di; Yang, Caiyun; Liang, Cuishan; Chen, Xin; Ma, Zelan; Huang, Xiaomei; Yao, Su; Liang, Changhong; Tian, Jie; Liu, Zaiyi

    2018-02-01

    To develop and validate a radiomics prediction model for individualized prediction of perineural invasion (PNI) in colorectal cancer (CRC). After computed tomography (CT) radiomics features extraction, a radiomics signature was constructed in derivation cohort (346 CRC patients). A prediction model was developed to integrate the radiomics signature and clinical candidate predictors [age, sex, tumor location, and carcinoembryonic antigen (CEA) level]. Apparent prediction performance was assessed. After internal validation, independent temporal validation (separate from the cohort used to build the model) was then conducted in 217 CRC patients. The final model was converted to an easy-to-use nomogram. The developed radiomics nomogram that integrated the radiomics signature and CEA level showed good calibration and discrimination performance [Harrell's concordance index (c-index): 0.817; 95% confidence interval (95% CI): 0.811-0.823]. Application of the nomogram in validation cohort gave a comparable calibration and discrimination (c-index: 0.803; 95% CI: 0.794-0.812). Integrating the radiomics signature and CEA level into a radiomics prediction model enables easy and effective risk assessment of PNI in CRC. This stratification of patients according to their PNI status may provide a basis for individualized auxiliary treatment.
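    A radiomics nomogram of the kind described combines a radiomics signature and a clinical predictor through a logistic model. A hypothetical sketch in Python; the coefficients are placeholders, not the published model:

```python
import math

def pni_risk(radiomics_score, cea_level, b0=-2.0, b1=1.5, b2=0.4):
    """Logistic combination of a radiomics signature and CEA level.
    The intercept and weights here are illustrative placeholders."""
    z = b0 + b1 * radiomics_score + b2 * math.log(cea_level)
    return 1.0 / (1.0 + math.exp(-z))

r = pni_risk(radiomics_score=1.2, cea_level=5.0)
print(0.0 < r < 1.0)  # the output is a probability of PNI
```

A nomogram is simply a graphical rendering of such a linear predictor, so each variable's contribution can be read off as points.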

  2. Development and validation of multivariable models to predict mortality and hospitalization in patients with heart failure

    NARCIS (Netherlands)

    Voors, Adriaan A.; Ouwerkerk, Wouter; Zannad, Faiez; van Veldhuisen, Dirk J.; Samani, Nilesh J.; Ponikowski, Piotr; Ng, Leong L.; Metra, Marco; ter Maaten, Jozine M.; Lang, Chim C.; Hillege, Hans L.; van der Harst, Pim; Filippatos, Gerasimos; Dickstein, Kenneth; Cleland, John G.; Anker, Stefan D.; Zwinderman, Aeilko H.

    2017-01-01

    Introduction From a prospective multicentre multicountry clinical trial, we developed and validated risk models to predict prospective all-cause mortality and hospitalizations because of heart failure (HF) in patients with HF. Methods and results BIOSTAT-CHF is a research programme designed to

  3. Developing and Validating a Survival Prediction Model for NSCLC Patients Through Distributed Learning Across 3 Countries.

    Science.gov (United States)

    Jochems, Arthur; Deist, Timo M; El Naqa, Issam; Kessler, Marc; Mayo, Chuck; Reeves, Jackson; Jolly, Shruti; Matuszak, Martha; Ten Haken, Randall; van Soest, Johan; Oberije, Cary; Faivre-Finn, Corinne; Price, Gareth; de Ruysscher, Dirk; Lambin, Philippe; Dekker, Andre

    2017-10-01

    Tools for survival prediction for non-small cell lung cancer (NSCLC) patients treated with chemoradiation or radiation therapy are of limited quality. In this work, we developed a predictive model of survival at 2 years. The model is based on a large volume of historical patient data and serves as a proof of concept to demonstrate the distributed learning approach. Clinical data from 698 lung cancer patients, treated with curative intent with chemoradiation or radiation therapy alone, were collected and stored at 2 different cancer institutes (559 patients at Maastro Clinic [Netherlands] and 139 at the University of Michigan [United States]). The model was further validated on 196 patients originating from The Christie (United Kingdom). A Bayesian network model was adapted for distributed learning (the animation can be viewed at https://www.youtube.com/watch?v=ZDJFOxpwqEA). Two-year posttreatment survival was chosen as the endpoint. The Maastro Clinic cohort data are publicly available at https://www.cancerdata.org/publication/developing-and-validating-survival-prediction-model-nsclc-patients-through-distributed, and the developed models can be found at www.predictcancer.org. Variables included in the final model were T and N category, age, performance status, and total tumor dose. The model has an area under the curve (AUC) of 0.66 on the external validation set and an AUC of 0.62 on 5-fold cross validation. A model based only on the T and N category performed with an AUC of 0.47 on the validation set, significantly worse than our model. Learning the model in a centralized or distributed fashion yields a minor difference in the probabilities of the conditional probability tables (0.6%); the discriminative performance of the models on the validation set is similar (P=.26). Distributed learning from federated databases allows learning of predictive models on data originating from multiple institutions while avoiding many of the data-sharing barriers. We believe that

  4. Development and validation of a 10-year-old child ligamentous cervical spine finite element model.

    Science.gov (United States)

    Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H

    2013-12-01

    Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive related crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties based on review of literature in conjunction with scaling were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was validated in flexion and extension against the child experimental data in three segments, C0-C2, C4-C5 and C6-C7. Other model predictions were found to be consistent with the experimental responses scaled from adult data. The whole cervical spine model was also validated in tension, flexion and extension against the child experimental data. This study provided methods for developing a child ligamentous cervical spine FE model and to predict soft tissue failures in tension.

  5. Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model

    Energy Technology Data Exchange (ETDEWEB)

    Berk, Alexander [Spectral Sciences, Inc., Burlington, MA (United States); Hawes, Frederick [Spectral Sciences, Inc., Burlington, MA (United States); Fox, Marsha [Spectral Sciences, Inc., Burlington, MA (United States)

    2016-03-15

    Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single-band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, no models exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single-scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to-be-developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation field

  6. Developing a model for hospital inherent safety assessment: Conceptualization and validation.

    Science.gov (United States)

    Yari, Saeed; Akbari, Hesam; Gholami Fesharaki, Mohammad; Khosravizadeh, Omid; Ghasemi, Mohammad; Barsam, Yalda; Akbari, Hamed

    2018-01-01

    Paying attention to the safety of hospitals, as the most crucial institutions for providing medical and health services, wherein a bundle of facilities, equipment, and human resources exists, is of significant importance. The present research aims at developing a model for assessing hospital safety based on principles of inherent safety design. Face validity (30 experts), content validity (20 experts), construct validity (268 samples), convergent validity, and divergent validity were employed to validate the prepared questionnaire; item analysis, Cronbach's alpha, the ICC test (to measure test reliability), and the composite reliability coefficient were used to measure primary reliability. The relationship between variables and factors was confirmed at the 0.05 significance level by conducting confirmatory factor analysis (CFA) and structural equation modeling (SEM) with Smart-PLS. R-square and factor loading values, which were higher than 0.67 and 0.300 respectively, indicated a strong fit. Moderation (0.970), simplification (0.959), substitution (0.943), and minimization (0.5008) carried the greatest weights in determining the inherent safety of a hospital, respectively. Moderation, simplification, and substitution, among the other dimensions, have more weight on inherent safety, while minimization has the least weight, which could be due to its definition as minimizing risk.

  7. Water loss in table grapes: model development and validation under dynamic storage conditions

    Directory of Open Access Journals (Sweden)

    Ericsem PEREIRA

    2017-09-01

    Water loss is a critical problem affecting the quality of table grapes. Temperature and relative humidity (RH) are essential factors in this process. Although mathematical modelling can quantify the impact of constant temperature and RH, variations in storage conditions are normally encountered in the cold chain. This study proposed a methodology to develop a weight loss model for table grapes and validate its predictions under the non-constant conditions of a domestic refrigerator. Grapes were maintained under controlled conditions and the weight loss was measured to calibrate the model. The model described the water loss process adequately and the validation tests confirmed its predictive ability. Delayed cooling tests showed that estimated transpiration rates in the subsequent continuous temperature treatment were not significantly influenced by prior exposure conditions, suggesting that this model may be useful for estimating the weight loss consequences of interruptions in the cold chain.
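Transpiration-driven weight loss of the kind described above is commonly modelled as proportional to the water vapour pressure deficit between the (assumed saturated) fruit surface and the surrounding air. A generic sketch under that assumption; the Magnus formula is standard, but the transpiration coefficient k and the functional form are illustrative, not the fitted model from this study:

```python
import math

def saturation_vp(t_c):
    """Saturation water vapour pressure (kPa) at t_c degrees C,
    Magnus-type approximation."""
    return 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))

def weight_loss(mass_g, t_c, rh, hours, k=8e-4):
    """Cumulative weight loss (g) over `hours` at constant conditions,
    proportional to the vapour pressure deficit; k (g per g per h per
    kPa) is a hypothetical transpiration coefficient."""
    vpd = saturation_vp(t_c) * (1.0 - rh)   # deficit vs. saturated fruit surface
    return mass_g * k * vpd * hours

# Warmer, drier air has a larger deficit and hence a larger loss.
loss_cold = weight_loss(500, 5, 0.90, 24)    # domestic-fridge-like conditions
loss_warm = weight_loss(500, 20, 0.60, 24)   # cold-chain interruption
print(loss_warm > loss_cold)
```

Integrating such a rate over a measured temperature/RH time series is one way to estimate the consequences of cold-chain interruptions.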

  8. Development and validation of logistic prognostic models by predefined SAS-macros

    Directory of Open Access Journals (Sweden)

    Ziegler, Christoph

    2006-02-01

    In medical decision making about therapies or diagnostic procedures, prognosis of the course or severity of a disease plays a relevant role. Besides the subjective judgement of the clinician, mathematical models can help in providing such prognoses. Such models are mostly multivariate regression models; in the case of a dichotomous outcome, the logistic model is applied as the standard model. In this paper we describe SAS macros for the development of such a model, for examination of its prognostic performance, and for model validation. The rationale for this approach to prognostic modelling and the description of the macros can only be given briefly in this paper; much more detail is given in the cited reference. These 14 SAS macros are a tool for setting up the whole process of deriving a prognostic model. In particular, the possibility of validating the model with a standardized software tool offers an opportunity that is generally not exploited in published prognostic models. Therefore, this can help to develop new models with good prognostic performance for use in medical applications.
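The standard logistic model these macros automate can be sketched outside SAS. The following pure-Python gradient-descent fit stands in for a logistic regression procedure (the covariates and outcome are synthetic, and the trainer is deliberately minimal):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit P(y=1) = sigmoid(b0 + b.x) by batch gradient descent.
    w[0] is the intercept; a stand-in for a real logistic procedure."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)
    for _ in range(epochs):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            err = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))) - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    """Predicted probability of the outcome for covariate vector xi."""
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Synthetic derivation data: the outcome is driven mostly by the first covariate.
random.seed(0)
X = [[random.random(), random.random()] for _ in range(200)]
y = [1 if x[0] + 0.2 * random.random() > 0.6 else 0 for x in X]
w = fit_logistic(X, y)
print(predict(w, [0.9, 0.5]) > predict(w, [0.1, 0.5]))  # higher x[0] -> higher risk
```

Examining prognostic performance then amounts to computing discrimination (e.g. AUC) and calibration of `predict` on a held-out validation set, which is what the macros standardize.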

  9. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions requires developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence-building and long-term iterative process (Hassan, 2004a); model validation should be viewed as a process, not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein, and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). A hierarchical approach is proposed to make this determination. This approach is based on computing five measures or metrics and following a decision tree to determine whether a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study, assuming field data either consistent with the model or significantly different from the model results. In both cases it is shown how the two measures lead to the appropriate decision about model performance. Standard statistical tests are used to evaluate these measures, with the results indicating they are appropriate measures for evaluating model realizations. The use of validation

  10. Empirical model development and validation with dynamic learning in the recurrent multilayer perceptron

    International Nuclear Information System (INIS)

    Parlos, A.G.; Chong, K.T.; Atiya, A.F.

    1994-01-01

    A nonlinear multivariable empirical model is developed for a U-tube steam generator using the recurrent multilayer perceptron network as the underlying model structure. The recurrent multilayer perceptron is a dynamic neural network, very effective in the input-output modeling of complex process systems. A dynamic gradient descent learning algorithm is used to train the recurrent multilayer perceptron, resulting in an order-of-magnitude improvement in convergence speed over static learning algorithms. In developing the U-tube steam generator empirical model, the effects of actuator, process, and sensor noise on the training and testing sets are investigated. Learning and prediction both appear very effective, despite the presence of training and testing set noise, respectively. The recurrent multilayer perceptron appears to learn the deterministic part of a stochastic training set, and it predicts approximately a moving-average response. Extensive model validation studies indicate that the empirical model can substantially generalize (extrapolate), though online learning becomes necessary for tracking transients significantly different from the ones included in the training set and for slowly varying U-tube steam generator dynamics. In view of the satisfactory modeling accuracy and the associated short development time, neural-network-based empirical models in some cases appear to provide a serious alternative to first-principles models. Caution, however, must be exercised, because extensive on-line validation of these models is still warranted.
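What makes a recurrent perceptron "dynamic" is that the hidden state at each time step feeds back into the next, so the network's output depends on input history, not just the current input. A one-hidden-unit sketch of the forward pass (the weights are illustrative, not a trained steam generator model):

```python
import math

def rmlp_forward(inputs, w_in, w_rec, w_out, h0=0.0):
    """Forward pass of a single-hidden-unit recurrent perceptron:
    h[t] = tanh(w_in*u[t] + w_rec*h[t-1]), y[t] = w_out*h[t].
    The recurrent term w_rec*h[t-1] carries process memory."""
    h, outputs = h0, []
    for u in inputs:
        h = math.tanh(w_in * u + w_rec * h)   # recurrent hidden update
        outputs.append(w_out * h)             # linear read-out
    return outputs

# A step input: the recurrent connection makes the response settle
# gradually rather than jump, mimicking first-order plant dynamics.
y = rmlp_forward([1.0] * 20, w_in=0.8, w_rec=0.5, w_out=1.0)
print(y[0] < y[-1])
```

Training such a network requires propagating error gradients back through this time recursion, which is why dynamic gradient-based algorithms (as in the abstract) outperform static ones on dynamic data.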

  11. Development and validation of a prediction model for loss of physical function in elderly hemodialysis patients.

    Science.gov (United States)

    Fukuma, Shingo; Shimizu, Sayaka; Shintani, Ayumi; Kamitani, Tsukasa; Akizawa, Tadao; Fukuhara, Shunichi

    2017-09-05

    Among aging hemodialysis patients, loss of physical function has become a major issue. We developed and validated a model predicting loss of physical function among elderly hemodialysis patients. We conducted a cohort study involving maintenance hemodialysis patients ≥65 years of age from the Dialysis Outcomes and Practice Patterns Study in Japan. The derivation cohort included 593 early-phase (1996-2004) patients and the temporal validation cohort included 447 late-phase (2005-12) patients. The main outcome was the incidence of loss of physical function, defined as the 12-item Short Form Health Survey physical function score decreasing to 0 within a year. Using backward stepwise logistic regression by Akaike's information criterion, six predictors (age, gender, dementia, mental health, moderate activity, and ascending stairs) were selected for the final model. Points were assigned based on the regression coefficients and the total score was calculated by summing the points for each predictor. In total, 65 (11.0%) and 53 (11.9%) hemodialysis patients lost physical function within 1 year in the derivation and validation cohorts, respectively. The model has good predictive performance as quantified by both discrimination and calibration. The proportion losing physical function increased sequentially through the low-, middle-, and high-score categories of the model (2.5%, 11.7% and 22.3% in the validation cohort, respectively). Loss of physical function was strongly associated with 1-year mortality [adjusted odds ratio 2.48 (95% confidence interval 1.26-4.91)]. We developed and validated a risk prediction model with good predictive performance for loss of physical function in elderly hemodialysis patients. Our simple prediction model may help physicians and patients make more informed decisions for healthy longevity. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA.
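"Points were assigned based on the regression coefficients" typically means each coefficient is scaled by the smallest one and rounded to an integer, so bedside scoring needs no arithmetic beyond addition. A sketch of that recipe; the predictor names and coefficient values below are hypothetical, not the paper's fitted values:

```python
def assign_points(coefficients, base=None):
    """Convert regression coefficients to integer points: scale each
    coefficient by the smallest absolute one and round. A common
    points-based scoring recipe, not the authors' exact procedure."""
    base = base or min(abs(b) for b in coefficients.values())
    return {name: round(b / base) for name, b in coefficients.items()}

# Hypothetical coefficients for the six predictors named in the abstract.
betas = {"age_75plus": 0.45, "female": 0.30, "dementia": 0.90,
         "poor_mental_health": 0.60, "no_moderate_activity": 0.35,
         "cannot_climb_stairs": 1.20}
points = assign_points(betas)
# A patient's total score is the sum of points for present predictors.
total = points["dementia"] + points["cannot_climb_stairs"]
print(points["cannot_climb_stairs"], total)
```

Score cutpoints then define the low-, middle-, and high-risk categories whose observed event rates (2.5%, 11.7%, 22.3% above) demonstrate calibration.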

  12. Development and validation of risk models and molecular diagnostics to permit personalized management of cancer.

    Science.gov (United States)

    Pu, Xia; Ye, Yuanqing; Wu, Xifeng

    2014-01-01

    Despite the advances made in cancer management over the past few decades, improvements in cancer diagnosis and prognosis remain modest, highlighting the need for individualized strategies. Toward this goal, risk prediction models and molecular diagnostic tools have been developed, tailoring each step of risk assessment, from diagnosis to treatment and clinical outcomes, to the individual's clinical, epidemiological, and molecular profiles. These approaches hold increasing promise for delivering a new paradigm to maximize the efficiency of cancer surveillance and the efficacy of treatment. However, they require stringent study design, methodology development, comprehensive assessment of biomarkers and risk factors, and extensive validation to ensure their overall usefulness for clinical translation. In the current study, the authors conducted a systematic review, using breast cancer as an example, and provide general guidelines for risk prediction models and molecular diagnostic tools, covering development, assessment, and validation. © 2013 American Cancer Society.

  13. Development and prospective validation of a model estimating risk of readmission in cancer patients.

    Science.gov (United States)

    Schmidt, Carl R; Hefner, Jennifer; McAlearney, Ann S; Graham, Lisa; Johnson, Kristen; Moffatt-Bruce, Susan; Huerta, Timothy; Pawlik, Timothy M; White, Susan

    2018-02-26

    Hospital readmissions among cancer patients are common. While several models estimating readmission risk exist, models specific to cancer patients are lacking. A logistic regression model estimating the risk of unplanned 30-day readmission was developed using inpatient admission data from a 2-year period (n = 18 782) at a tertiary cancer hospital. Readmission risk estimates derived from the model were then calculated prospectively over a 10-month period (n = 8616 admissions) and compared with the actual incidence of readmission. There were 2478 (13.2%) unplanned readmissions. Model factors associated with readmission included: emergency department visit within 30 days, >1 admission within 60 days, non-surgical admission, solid malignancy, gastrointestinal cancer, emergency admission, length of stay >5 days, and abnormal sodium, hemoglobin, or white blood cell count. The c-statistic for the model was 0.70. During the 10-month prospective evaluation, readmission estimates from the model tracked actual readmission incidence, which ranged from 20.7% in the highest risk category to 9.6% in the lowest. An unplanned readmission risk model developed specifically for cancer patients performs well when validated prospectively. The specificity of the model for cancer patients, its EMR incorporation, and its prospective validation justify use of the model in future studies designed to reduce and prevent readmissions. © 2018 Wiley Periodicals, Inc.
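The prospective check described above amounts to banding admissions by predicted risk and comparing each band's observed readmission rate against its predictions. A minimal sketch (the risk cutpoints and data are illustrative, not the study's):

```python
def calibration_by_category(pred_risks, outcomes, cutpoints=(0.10, 0.20)):
    """Group cases into low/medium/high predicted-risk bands and report
    the observed event rate in each band; hypothetical cutpoints."""
    bands = {"low": [], "medium": [], "high": []}
    for p, y in zip(pred_risks, outcomes):
        if p < cutpoints[0]:
            bands["low"].append(y)
        elif p < cutpoints[1]:
            bands["medium"].append(y)
        else:
            bands["high"].append(y)
    return {k: (sum(v) / len(v) if v else None) for k, v in bands.items()}

# Toy prospective cohort: predicted 30-day risks vs. actual readmissions.
preds   = [0.05, 0.08, 0.12, 0.15, 0.25, 0.30, 0.22, 0.07]
readmit = [0,    0,    0,    1,    1,    0,    1,    0]
print(calibration_by_category(preds, readmit))
```

A well-calibrated model shows observed rates rising monotonically across the bands, as in the 9.6%-to-20.7% gradient reported above.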

  14. U-tube steam generator empirical model development and validation using neural networks

    International Nuclear Information System (INIS)

    Parlos, A.G.; Chong, K.T.; Atiya, A.

    1992-01-01

    Empirical modeling techniques that use model structures motivated by neural networks research have proven effective in identifying complex process dynamics. A recurrent multilayer perceptron (RMLP) network was developed as a nonlinear state-space model structure, along with a static learning algorithm for estimating the parameters associated with it. The methods developed were demonstrated by identifying two submodels of a U-tube steam generator (UTSG), each valid around an operating power level. A significant drawback of this approach is the long off-line training times required for the development of even a simplified model of a UTSG. Subsequently, a dynamic gradient-descent-based learning algorithm was developed as an accelerated alternative for training an RMLP network for use in empirical modeling of power plants. The two main advantages of this learning algorithm are its ability to retain past error gradient information for future use and the two forward passes associated with its implementation. The enhanced learning capabilities provided by the dynamic gradient-descent-based learning algorithm were demonstrated via a case study of a simple steam boiler power plant. In this paper, the dynamic gradient-descent-based learning algorithm is used for the development and validation of a complete UTSG empirical model.

  15. Development and validation of a predictive model for excessive postpartum blood loss: A retrospective, cohort study.

    Science.gov (United States)

    Rubio-Álvarez, Ana; Molina-Alarcón, Milagros; Arias-Arias, Ángel; Hernández-Martínez, Antonio

    2018-03-01

    Postpartum haemorrhage is one of the leading causes of maternal morbidity and mortality worldwide. Despite the use of uterotonic agents as a preventive measure, it remains a challenge to identify those women who are at increased risk of postpartum bleeding. The aim was to develop and validate a predictive model to assess the risk of excessive bleeding in women with vaginal birth, in a retrospective cohort study at "Mancha-Centro Hospital" (Spain). The predictive model was built on a derivation cohort of 2336 women delivering between 2009 and 2011; for validation, a prospective cohort of 953 women (2013-2014) was employed. Women with antenatal fetal demise, multiple pregnancies, and gestations under 35 weeks were excluded. We used multivariate analysis with binary logistic regression, ridge regression, and areas under the receiver operating characteristic curves to determine the predictive ability of the proposed model. There were 197 (8.43%) women with excessive bleeding in the derivation cohort and 63 (6.61%) in the validation cohort. Predictive factors in the final model were: maternal age, primiparity, duration of the first and second stages of labour, neonatal birth weight, and antepartum haemoglobin level. The predictive ability of this model was 0.90 (95% CI: 0.85-0.93) in the derivation cohort and 0.83 (95% CI: 0.74-0.92) in the validation cohort. The model thus showed excellent predictive ability in the derivation cohort and good predictive ability on validation in a later population; it can be employed to identify women with a higher risk of postpartum haemorrhage. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. An integrated development environment for PMESII model authoring, integration, validation, and debugging

    Science.gov (United States)

    Pioch, Nicholas J.; Lofdahl, Corey; Sao Pedro, Michael; Krikeles, Basil; Morley, Liam

    2007-04-01

    To foster shared battlespace awareness in Air Operations Centers supporting the Joint Forces Commander and Joint Force Air Component Commander, BAE Systems is developing a Commander's Model Integration and Simulation Toolkit (CMIST), an Integrated Development Environment (IDE) for model authoring, integration, validation, and debugging. CMIST is built on the versatile Eclipse framework, a widely used open development platform comprised of extensible frameworks that enable development of tools for building, deploying, and managing software. CMIST provides two distinct layers: 1) a Commander's IDE for supporting staff to author models spanning the Political, Military, Economic, Social, Infrastructure, Information (PMESII) taxonomy; integrate multiple native (third-party) models; validate model interfaces and outputs; and debug the integrated models via intuitive controls and time series visualization, and 2) a PMESII IDE for modeling and simulation developers to rapidly incorporate new native simulation tools and models to make them available for use in the Commander's IDE. The PMESII IDE provides shared ontologies and repositories for world state, modeling concepts, and native tool characterization. CMIST includes extensible libraries for 1) reusable data transforms for semantic alignment of native data with the shared ontology, and 2) interaction patterns to synchronize multiple native simulations with disparate modeling paradigms, such as continuous-time system dynamics, agent-based discrete event simulation, and aggregate solution methods such as Monte Carlo sampling over dynamic Bayesian networks. This paper describes the CMIST system architecture, our technical approach to addressing these semantic alignment and synchronization problems, and initial results from integrating Political-Military-Economic models of post-war Iraq spanning multiple modeling paradigms.

  17. Modelling of the activity system - development of an evaluation method for integrated system validation

    International Nuclear Information System (INIS)

    Norros, Leena; Savioja, Paula

    2004-01-01

    In this paper we present our recent research, which focuses on creating an evaluation method for human-system interfaces of complex systems. The method is intended for use in the validation of modernised nuclear power plant (NPP) control rooms and other complex systems with high reliability requirements. The task in validation is to determine whether the human-system combination functions safely and effectively. This question can be operationalized as the selection of relevant operational features and their appropriate acceptance criteria. Thus, there is a need to ensure that the results of the evaluation can be generalized so that they serve the purpose of integrated system validation. The definition of appropriate acceptance criteria provides the basis for judging the appropriateness of system performance. We propose that the operational situations and the acceptance criteria should be defined based on modelling of NPP operation, comprehended as an activity system. We developed a new core-task modelling framework: a formative modelling approach that combines causal, functional, and understanding-based explanations of system performance. In this paper we reason how modelling can be used as a medium to determine the validity of the emerging control room system. (Author)

  18. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis : Model development and validation of existing models

    NARCIS (Netherlands)

    Gomes, Anna; van der Wijk, Lars; Proost, Johannes H; Sinha, Bhanu; Touw, Daan J

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for

  19. Development and Validation of the Faceted Inventory of the Five-Factor Model (FI-FFM).

    Science.gov (United States)

    Watson, David; Nus, Ericka; Wu, Kevin D

    2017-06-01

    The Faceted Inventory of the Five-Factor Model (FI-FFM) is a comprehensive hierarchical measure of personality. The FI-FFM was created across five phases of scale development. It includes five facets apiece for neuroticism, extraversion, and conscientiousness; four facets within agreeableness; and three facets for openness. We present reliability and validity data obtained from three samples. The FI-FFM scales are internally consistent and highly stable over 2 weeks (retest rs ranged from .64 to .82, median r = .77). They show strong convergent and discriminant validity vis-à-vis the NEO, the Big Five Inventory, and the Personality Inventory for DSM-5. Moreover, self-ratings on the scales show moderate to strong agreement with corresponding ratings made by informants (rs ranged from .26 to .66, median r = .42). Finally, in joint analyses with the NEO Personality Inventory-3, the FI-FFM neuroticism facet scales display significant incremental validity in predicting indicators of internalizing psychopathology.

  20. Model development and experimental validation of capnophilic lactic fermentation and hydrogen synthesis by Thermotoga neapolitana.

    Science.gov (United States)

    Pradhan, Nirakar; Dipasquale, Laura; d'Ippolito, Giuliana; Fontana, Angelo; Panico, Antonio; Pirozzi, Francesco; Lens, Piet N L; Esposito, Giovanni

    2016-08-01

    The aim of the present study was to develop a kinetic model for a recently proposed novel metabolic process, the capnophilic (CO2-requiring) lactic fermentation (CLF) pathway in Thermotoga neapolitana. The model was based on Monod kinetics, and mathematical expressions were developed to enable simulation of biomass growth, substrate consumption, and product formation. The calibrated kinetic parameters, namely the maximum specific uptake rate (k), the half-saturation constant (kS), the biomass yield coefficient (Y), and the endogenous decay rate (kd), were 1.30 h(-1), 1.42 g/L, 0.1195, and 0.0205 h(-1), respectively. A high correlation (>0.98) was obtained between the experimental data and model predictions in both the model validation and cross-validation processes. An increase in lactate production in the range of 40-80% was obtained through the CLF pathway compared to the classic dark fermentation model. The proposed kinetic model is the first mechanistically based model for the CLF pathway. It provides useful information to improve understanding of how acetate and CO2 are recycled by Thermotoga neapolitana to produce lactate without compromising the overall hydrogen yield. Copyright © 2016 Elsevier Ltd. All rights reserved.
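Monod-type growth with endogenous decay, as calibrated above, can be integrated numerically with a few lines of code. This forward-Euler sketch uses the quoted parameter values but is a generic single-substrate Monod model, not the authors' full multi-product CLF model:

```python
def simulate_monod(s0=20.0, x0=0.1, k=1.30, ks=1.42, y=0.1195,
                   kd=0.0205, dt=0.01, t_end=10.0):
    """Forward-Euler integration of Monod kinetics with decay:
      dX/dt = (Y*mu - kd) * X,   dS/dt = -mu * X,
      mu = k*S/(ks+S)  (specific substrate uptake rate, 1/h).
    Parameter values are the calibrated ones quoted in the abstract;
    the model structure here is a generic single-substrate sketch."""
    s, x, t = s0, x0, 0.0
    while t < t_end:
        mu = k * s / (ks + s)          # Monod uptake term
        x += (y * mu - kd) * x * dt    # growth minus endogenous decay
        s = max(s - mu * x * dt, 0.0)  # substrate cannot go negative
        t += dt
    return s, x

s_end, x_end = simulate_monod()
print(s_end < 20.0 and x_end > 0.1)    # substrate consumed, biomass grew
```

With these parameters, growth (Y·k·S/(ks+S) ≈ 0.145 per hour at S = 20 g/L) comfortably exceeds the decay rate kd, so biomass increases while substrate is drawn down.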

  1. Development and validation of multivariable predictive model for thromboembolic events in lymphoma patients.

    Science.gov (United States)

    Antic, Darko; Milic, Natasa; Nikolovski, Srdjan; Todorovic, Milena; Bila, Jelena; Djurdjevic, Predrag; Andjelic, Bosko; Djurasinovic, Vladislava; Sretenovic, Aleksandra; Vukovic, Vojin; Jelicic, Jelena; Hayman, Suzanne; Mihaljevic, Biljana

    2016-10-01

    Lymphoma patients are at increased risk of thromboembolic events, but thromboprophylaxis in these patients is largely underused. We sought to develop and validate a simple model, based on individual clinical and laboratory patient characteristics, that would designate lymphoma patients at risk for thromboembolic events. The study population included 1,820 lymphoma patients who were treated in the Lymphoma Departments at the Clinic of Hematology, Clinical Center of Serbia, and Clinical Center Kragujevac. The model was developed using data from a derivation cohort (n = 1,236) and further assessed in a validation cohort (n = 584). Sixty-five patients (5.3%) in the derivation cohort and 34 (5.8%) patients in the validation cohort developed thromboembolic events. The variables independently associated with risk for thromboembolism were: previous venous and/or arterial events, mediastinal involvement, BMI >30 kg/m(2), reduced mobility, extranodal localization, development of neutropenia, and low hemoglobin level. For patients classified at risk (intermediate and high-risk scores), the model produced a negative predictive value of 98.5%, a positive predictive value of 25.1%, a sensitivity of 75.4%, and a specificity of 87.5%. A high-risk score had a positive predictive value of 65.2%. The diagnostic performance measures retained similar values in the validation cohort. The developed prognostic Thrombosis Lymphoma (ThroLy) score is more specific for lymphoma patients than any other available score targeting thrombosis in cancer patients. Am. J. Hematol. 91:1014-1019, 2016. © 2016 Wiley Periodicals, Inc.

  2. Development, Validation and Parametric study of a 3-Year-Old Child Head Finite Element Model

    Science.gov (United States)

    Cui, Shihai; Chen, Yue; Li, Haiyan; Ruan, ShiJie

    2015-12-01

Traumatic brain injury caused by falls and traffic accidents is an important cause of death and disability in children. Recently, computer finite element (FE) head models have been developed to investigate brain injury mechanisms and biomechanical responses. Based on CT data of a healthy 3-year-old child head, an FE head model with detailed anatomical structure was developed. Deep brain structures, such as the white matter, gray matter, cerebral ventricles, and hippocampus, were created for the first time in this FE model. The FE model was validated by reconstructing child and adult cadaver experiments and comparing the simulation results with the corresponding cadaver experimental data. In addition, the effects of skull stiffness on the dynamic responses of the child head were further investigated. All the simulation results confirmed the good biofidelity of the FE model.

  3. Development and Validation of a Prediction Model to Estimate Individual Risk of Pancreatic Cancer.

    Science.gov (United States)

    Yu, Ami; Woo, Sang Myung; Joo, Jungnam; Yang, Hye-Ryung; Lee, Woo Jin; Park, Sang-Jae; Nam, Byung-Ho

    2016-01-01

    There is no reliable screening tool to identify people with high risk of developing pancreatic cancer even though pancreatic cancer represents the fifth-leading cause of cancer-related death in Korea. The goal of this study was to develop an individualized risk prediction model that can be used to screen for asymptomatic pancreatic cancer in Korean men and women. Gender-specific risk prediction models for pancreatic cancer were developed using the Cox proportional hazards model based on an 8-year follow-up of a cohort study of 1,289,933 men and 557,701 women in Korea who had biennial examinations in 1996-1997. The performance of the models was evaluated with respect to their discrimination and calibration ability based on the C-statistic and Hosmer-Lemeshow type χ2 statistic. A total of 1,634 (0.13%) men and 561 (0.10%) women were newly diagnosed with pancreatic cancer. Age, height, BMI, fasting glucose, urine glucose, smoking, and age at smoking initiation were included in the risk prediction model for men. Height, BMI, fasting glucose, urine glucose, smoking, and drinking habit were included in the risk prediction model for women. Smoking was the most significant risk factor for developing pancreatic cancer in both men and women. The risk prediction model exhibited good discrimination and calibration ability, and in external validation it had excellent prediction ability. Gender-specific risk prediction models for pancreatic cancer were developed and validated for the first time. The prediction models will be a useful tool for detecting high-risk individuals who may benefit from increased surveillance for pancreatic cancer.
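The C-statistic used above to assess discrimination generalizes concordance to time-to-event data: among comparable pairs, how often does the model assign the higher risk to the subject whose event occurs earlier? A toy implementation for illustration only; it handles the basic right-censoring logic but omits the tie-handling refinements of a production Harrell's C:

```python
def concordance_index(time, event, risk):
    """Harrell's C-statistic: fraction of comparable pairs in which the
    higher predicted risk corresponds to the earlier observed event."""
    concordant = comparable = 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            # Pair (i, j) is comparable if subject i's event precedes time[j].
            if event[i] and time[i] < time[j]:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

# Toy data: three subjects, risks perfectly ordered with event times.
t = [2.0, 5.0, 8.0]
e = [1, 1, 0]          # third subject is censored
r = [0.9, 0.5, 0.1]
print(concordance_index(t, e, r))  # 1.0 for perfectly ordered risks
```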

  4. Development and validity of a new model for assessing pressure redistribution properties of support surfaces.

    Science.gov (United States)

    Matsuo, Junko; Sugama, Junko; Sanada, Hiromi; Okuwa, Mayumi; Nakatani, Toshio; Konya, Chizuko; Sakamoto, Jirou

    2011-05-01

    Pressure ulcers are a common problem, especially in older patients. In Japan, most institutionalized older people are malnourished and show extreme bony prominence (EBP). EBP is a significant factor in the development of pressure ulcers due to increased interface pressure concentrated at the skin surface over the EBP. The use of support surfaces is recommended for the prophylaxis of pressure ulcers. However, the present equivocal criteria for evaluating the pressure redistribution of support surfaces are inadequate. Since pressure redistribution is influenced by physique and posture, evaluations using human subjects are limited. For this reason, models that can substitute for humans are necessary. We developed a new EBP model based on the anthropometric measurements, including pelvic inclination, of 100 bedridden elderly people. A comparison between the pressure distribution charts of our model and bedridden elderly subjects demonstrated that maximum contact pressure values, buttock contact pressure values, and bone prominence rates corresponded closely. This indicates that the model provides a good approximation of the features of elderly people with EBP. We subsequently examined the validity of the model through quantitative assessment of pressure redistribution functions consisting of immersion, envelopment, and contact area change. The model was able to detect differences in the hardness of urethane foam, differences in the internal pressure of an air mattress, and sequential changes during the pressure switching mode. These results demonstrate the validity of our new buttock model in evaluating pressure redistribution for a variety of surfaces. Copyright © 2010 Tissue Viability Society. Published by Elsevier Ltd. All rights reserved.

  5. Development and validation of a nursing professionalism evaluation model in a career ladder system.

    Science.gov (United States)

    Kim, Yeon Hee; Jung, Young Sun; Min, Ja; Song, Eun Young; Ok, Jung Hui; Lim, Changwon; Kim, Kyunghee; Kim, Ji-Su

    2017-01-01

The clinical ladder system categorizes degrees of nursing professionalism with corresponding rewards, and is an important human resource tool for nursing management. We developed a model to evaluate nursing professionalism, which determines the clinical ladder system levels, and verified its validity. Data were collected using a clinical competence tool developed in this study, together with existing methods, such as the nursing professionalism evaluation tool, peer reviews, and face-to-face interviews used to evaluate promotions, in order to verify the presented content in a medical institution. Reliability and convergent and discriminant validity of the clinical competence evaluation tool were verified using SmartPLS software. The validity of the model for evaluating overall nursing professionalism was also analyzed. Clinical competence was determined by five dimensions of nursing practice: scientific, technical, ethical, aesthetic, and existential. The structural model explained 66% of the variance. Clinical competence scales, peer reviews, and face-to-face interviews directly determined nursing professionalism levels. The evaluation system can be used to evaluate nurses' professionalism in actual medical institutions from a nursing practice perspective. A conceptual framework for establishing a human resources management system for nurses and a tool for evaluating nursing professionalism at medical institutions are provided.

  6. Development and validation of a Markov microsimulation model for the economic evaluation of treatments in osteoporosis.

    Science.gov (United States)

    Hiligsmann, Mickaël; Ethgen, Olivier; Bruyère, Olivier; Richy, Florent; Gathon, Henry-Jean; Reginster, Jean-Yves

    2009-01-01

Markov models are increasingly used in economic evaluations of treatments for osteoporosis. Most of the existing evaluations are cohort-based Markov models lacking comprehensive memory management and versatility. In this article, we describe and validate an original Markov microsimulation model to accurately assess the cost-effectiveness of prevention and treatment of osteoporosis. We developed a Markov microsimulation model with a lifetime horizon and a direct health-care cost perspective. The patient history was recorded and used in calculations of transition probabilities, utilities, and costs. To test the internal consistency of the model, we carried out an example calculation for alendronate therapy. Then, external consistency was investigated by comparing absolute lifetime risk of fracture estimates with epidemiologic data. For women at age 70 years with a twofold increase in the fracture risk of the average population, the costs per quality-adjusted life-year gained for alendronate therapy versus no treatment were estimated at €9,105 and €15,325 under full and realistic adherence assumptions, respectively. All the sensitivity analyses in terms of model parameters and modeling assumptions were coherent with expected conclusions, and absolute lifetime risk of fracture estimates were within the range of previous estimates, which confirmed both the internal and external consistency of the model. Microsimulation models present major advantages over cohort-based models, increasing the reliability of the results while remaining largely compatible with the existing state-of-the-art, evidence-based literature. The developed model appears to be a valid model for use in economic evaluations in osteoporosis.
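The defining feature of a microsimulation, as opposed to a cohort Markov model, is that each simulated patient carries an individual history that feeds back into later transition probabilities. A minimal sketch with hypothetical annual probabilities (not the paper's calibrated values):

```python
import random

# Hypothetical annual probabilities for illustration only.
P_FRACTURE = 0.02   # baseline annual fracture probability
RR_PRIOR = 2.0      # relative risk after a first fracture (history effect)
P_DEATH = 0.03      # annual death probability

def simulate_patient(rng, years=30):
    """Follow one patient year by year; the recorded fracture history
    modifies subsequent transition probabilities."""
    fractures = 0
    for _ in range(years):
        if rng.random() < P_DEATH:
            break
        p_frac = P_FRACTURE * (RR_PRIOR if fractures > 0 else 1.0)
        if rng.random() < p_frac:
            fractures += 1
    return fractures

rng = random.Random(42)
lifetime = [simulate_patient(rng) for _ in range(10000)]
print("lifetime fracture risk:", sum(f > 0 for f in lifetime) / len(lifetime))
```

Aggregating many such individual trajectories recovers cohort-level quantities (here, an absolute lifetime fracture risk) while preserving history dependence.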

  7. The development and validation of a thermal model for the cabin of a vehicle

    International Nuclear Information System (INIS)

    Marcos, David; Pino, Francisco J.; Bordons, Carlos; Guerra, José J.

    2014-01-01

Energy management in modern vehicles is a crucial issue, especially in electric vehicles (EVs) and hybrid vehicles (HVs), in which different energy sources and loads must be considered in the operation of the vehicle. Air conditioning is an important load that must be thoroughly analysed because it can constitute a considerable percentage of the energy demand. In this paper, a simplified dynamic thermal model for the cabin of a vehicle is proposed and validated. The developed model can be used for the design and testing of the heating, ventilation, and air conditioning (HVAC) system of a vehicle and for the study of its effects on the performance and fuel consumption of vehicles such as EVs and HVs. The model is based on theoretical heat transfer, thermal inertia, and radiation treatment equations. The model results obtained from simulations are compared with the cabin air temperature of a vehicle under different conditions, and this comparison demonstrates the agreement between the simulated and measured results. - Highlights: •A thermal model of a vehicle cabin with two thermal inertias is developed. •The model is validated with experimental data. •The simulation results fit the experimental data
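A simplified dynamic cabin model with two thermal inertias can be sketched as a two-node lumped-capacitance system (cabin air plus interior mass) integrated explicitly; all parameter values below are assumed for illustration and are not taken from the paper:

```python
def simulate_cabin(t_amb=35.0, t0=20.0, q_hvac=-800.0, dt=1.0, steps=600):
    """Two-node lumped cabin model: node 1 is cabin air, node 2 is the
    interior mass (seats, panels). Forward-Euler integration of
    C * dT/dt = sum of heat flows. All parameters are illustrative."""
    C_air, C_mass = 50e3, 400e3    # heat capacities [J/K] (assumed)
    UA_amb, UA_int = 40.0, 120.0   # conductances [W/K] (assumed)
    T_air, T_mass = t0, t0
    for _ in range(steps):
        q_air = UA_amb * (t_amb - T_air) + UA_int * (T_mass - T_air) + q_hvac
        q_mass = UA_int * (T_air - T_mass)
        T_air += dt * q_air / C_air
        T_mass += dt * q_mass / C_mass
    return T_air, T_mass

# Hot ambient, HVAC cooling at 800 W: air cools quickly, mass lags behind.
T_air, T_mass = simulate_cabin()
print(round(T_air, 1), round(T_mass, 1))
```

The second, slower inertia is what makes the cabin response non-exponential: the air node reaches a quasi-steady temperature within minutes while the interior mass keeps pulling it back for much longer.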

  8. Development and validation of a chronic copper biotic ligand model for Ceriodaphnia dubia

    International Nuclear Information System (INIS)

    Schwartz, Melissa L.; Vigneault, Bernard

    2007-01-01

A biotic ligand model (BLM) to predict chronic Cu toxicity to Ceriodaphnia dubia was developed and tested. The effects of cationic competition, pH, and complexation of Cu by natural organic matter were examined to develop the model. There was no effect of cationic competition with increasing Ca and Na concentrations in our exposures. We did, however, see a significant regression of decreasing toxicity (measured as the IC25, the concentration at which there was a 25% inhibition of reproduction) as Mg concentration increased. Nevertheless, taking into account the actual variability of the IC25, and since the relative increase in IC25 due to additional Mg was small (1.5-fold), Mg competition was not included in the model. Changes in pH had a significant effect on the Cu IC25, which is consistent with proton competition as often suggested for acute BLMs. Finally, natural organic matter (NOM) was added to exposures, resulting in significant decreases in toxicity. Therefore, our predictive model for chronic Cu toxicity to C. dubia includes the effects of pH and NOM complexation. The model was validated with Cu IC25 data generated in six natural surface waters collected from across Canada. Using WHAM VI, we calculated Cu speciation in each natural water and, using our model, we generated 'predicted' IC25 data. We successfully predicted all Cu IC25 within a factor of 3 for the six waters used for validation

  9. The Johns Hopkins model of psychological first aid (RAPID-PFA): curriculum development and content validation.

    Science.gov (United States)

    Everly, George S; Barnett, Daniel J; Links, Jonathan M

    2012-01-01

    There appears to be virtual universal endorsement of the need for and value of acute "psychological first aid" (PFA) in the wake of trauma and disasters. In this paper, we describe the development of the curriculum for The Johns Hopkins RAPID-PFA model of psychological first aid. We employed an adaptation of the basic framework for the development of a clinical science as recommended by Millon which entailed: historical review, theoretical development, and content validation. The process of content validation of the RAPID-PFA curriculum entailed the assessment of attitudes (confidence in the application of PFA interventions, preparedness in the application of PFA); knowledge related to the application of immediate mental health interventions; and behavior (the ability to recognize clinical markers in the field as assessed via a videotape recognition exercise). Results of the content validation phase suggest the six-hour RAPID-PFA curriculum, initially based upon structural modeling analysis, can improve confidence in the application of PFA interventions, preparedness in the application of PFA, knowledge related to the application of immediate mental health interventions, and the ability to recognize clinical markers in the field as assessed via a videotape recognition exercise.

  10. Development and validation of the 3-D CFD model for CANDU-6 moderator temperature predictions

    International Nuclear Information System (INIS)

    Yoon, Churl; Rhee, Bo Wook; Min, Byung Joo

    2003-03-01

A computational fluid dynamics model for predicting the moderator circulation inside the CANada Deuterium Uranium (CANDU) reactor vessel has been developed to estimate the local subcooling of the moderator in the vicinity of the Calandria tubes. The buoyancy effect induced by internal heating is accounted for by the Boussinesq approximation. The standard κ-ε turbulence model with logarithmic wall treatment is applied to predict the turbulent jet flows from the inlet nozzles. The matrix of Calandria tubes in the core region is simplified to a porous medium, in which anisotropic hydraulic impedance is modeled using an empirical correlation for the frictional pressure loss. The governing equations are solved by CFX-4.4, a commercial CFD code developed by AEA Technology. The CFD model has been successfully verified and validated against experimental data obtained at Stern Laboratories Inc. (SLI) in Hamilton, Ontario

  11. Development and validation of a two-dimensional fast-response flood estimation model

    Energy Technology Data Exchange (ETDEWEB)

Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV OF UTAH

    2009-01-01

A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimates of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
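The upwind differencing idea at the core of such a scheme is easiest to see in one dimension. The sketch below applies first-order upwinding to linear advection of a dam-break-like front; the actual model solves the full two-dimensional shallow water equations, so this is only an illustration of the differencing stencil and its CFL stability limit:

```python
def upwind_advect(u, c, dx, dt, steps):
    """First-order upwind update for u_t + c * u_x = 0 with c > 0:
    each cell takes its flux difference from the upwind (left) neighbor."""
    assert c * dt / dx <= 1.0, "CFL condition for stability"
    u = list(u)
    for _ in range(steps):
        # Left boundary cell is held fixed (inflow condition).
        u = [u[0]] + [u[i] - c * dt / dx * (u[i] - u[i - 1])
                      for i in range(1, len(u))]
    return u

# Advect a step-shaped front to the right; with c*dt/dx = 1 the upwind
# scheme translates the front exactly, 10 cells in 10 steps.
u0 = [1.0] * 10 + [0.0] * 40
u1 = upwind_advect(u0, c=1.0, dx=1.0, dt=1.0, steps=10)
print(u1[:25])
```

At Courant numbers below 1 the front still propagates stably but is smeared by numerical diffusion, which is the usual trade-off accepted for the robustness and speed that fast-response flood estimation needs.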

  12. Development and Validation of a Constitutive Model for Dental Composites during the Curing Process

    Science.gov (United States)

    Wickham Kolstad, Lauren

Debonding is a critical failure mode of dental composites used for dental restorations. Debonding of a dental composite can be assessed by comparing its shrinkage stress to the debonding strength of the adhesive that bonds it to the tooth surface. It is difficult to measure shrinkage stress experimentally. In this study, finite element analysis is used to predict the stress in the composite during cure. A new constitutive law is presented that will allow composite developers to evaluate composite shrinkage stress at early stages in material development. Shrinkage stress and shrinkage strain experimental data were gathered for three dental resins: Z250, Z350, and P90. The experimental data were used to develop a constitutive model for the Young's modulus of the dental composite as a function of time during cure. A Maxwell model, a spring and dashpot in series, was used to simulate the composite. The compliance of the shrinkage stress device was also taken into account by including a spring in series with the Maxwell model. A coefficient of thermal expansion was also determined for internal loading of the composite by dividing shrinkage strain by time. Three FEA models are presented. A spring-disk model validates that the constitutive law is self-consistent. A quarter cuspal deflection model uses separate experimental data to verify that the constitutive law is valid. Finally, an axisymmetric tooth model is used to predict interfacial stresses in the composite. These stresses are compared to the debonding strength to check whether the composite debonds. The new constitutive model accurately predicted cuspal deflection data. Predictions for interfacial bond stress in the tooth model compare favorably with debonding characteristics observed in practice for dental resins.
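The base element of this constitutive law, a spring and dashpot in series, has a closed-form stress response under a constant shrinkage strain rate. A sketch with illustrative (not measured) parameters; note that the paper's full model additionally makes the modulus a function of cure time, which this constant-property element does not capture:

```python
import math

def maxwell_stress(E, eta, strain_rate, t):
    """Stress in a Maxwell element (spring E in series with dashpot eta)
    driven at a constant strain rate:
    sigma(t) = eta * rate * (1 - exp(-t / tau)), with tau = eta / E."""
    tau = eta / E  # relaxation time
    return eta * strain_rate * (1.0 - math.exp(-t / tau))

# Illustrative parameters for a curing resin (assumed, not fitted).
E, eta, rate = 2.0e9, 4.0e11, 1.0e-5   # Pa, Pa*s, 1/s
for t in (0.0, 200.0, 2000.0):
    print(t, round(maxwell_stress(E, eta, rate, t) / 1e6, 2), "MPa")
```

The stress saturates at eta * rate rather than growing without bound, which is the viscoelastic relaxation behavior that distinguishes this element from a purely elastic shrinkage-stress estimate.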

  13. Development and Initial Validation of the Five-Factor Model Adolescent Personality Questionnaire (FFM-APQ).

    Science.gov (United States)

    Rogers, Mary E; Glendon, A Ian

    2018-01-01

    This research reports on the 4-phase development of the 25-item Five-Factor Model Adolescent Personality Questionnaire (FFM-APQ). The purpose was to develop and determine initial evidence for validity of a brief adolescent personality inventory using a vocabulary that could be understood by adolescents up to 18 years old. Phase 1 (N = 48) consisted of item generation and expert (N = 5) review of items; Phase 2 (N = 179) involved item analyses; in Phase 3 (N = 496) exploratory factor analysis assessed the underlying structure; in Phase 4 (N = 405) confirmatory factor analyses resulted in a 25-item inventory with 5 subscales.

  14. Development of regional scale soil erosion and sediment transport model; its calibration and validations

    International Nuclear Information System (INIS)

    Rehman, M.H.; Akhtar, M.N.

    2005-01-01

Although many soil erosion models have been developed over the past five decades, including empirically based models such as USLE and RUSLE and many process-based soil erosion and sediment transport models such as WEPP, EUROSEM and SHETRAN, the applicability of these models at regional scales has remained questionable. To address the problem, a process-based soil erosion and sediment transport model has been developed to estimate soil erosion, deposition, transport and sediment yield at the regional scale. The soil erosion processes are modeled as the detachment of soil by raindrop impact over the entire grid and the detachment of soil by overland flow only within the equivalent channels, whereas sediment is routed to the forward grid considering the transport capacity of the flow. The loss of heterogeneity in the spatial information of the topography due to the slope averaging effect is reproduced by adopting a Fractal analysis approach. The model has been calibrated for the Nan river basin (N.13A) and validated on the Yom river basin (Y.6) and Nam Mae Klang river basin (P.24A) of Thailand; simulated results show good agreement with the observed sediment discharge data. The developed model, with a few new components, can also be applied to predicting the sediment discharges of the river Indus. (author)

  15. Development and validation of a septoplasty training model using 3-dimensional printing technology.

    Science.gov (United States)

    AlReefi, Mahmoud A; Nguyen, Lily H P; Mongeau, Luc G; Haq, Bassam Ul; Boyanapalli, Siddharth; Hafeez, Nauman; Cegarra-Escolano, Francois; Tewfik, Marc A

    2017-04-01

Providing alternative training modalities may improve trainees' ability to perform septoplasty. Three-dimensional printing has been shown to be a powerful tool in surgical training. The objectives of this study were to explain the development of our 3-dimensional (3D) printed septoplasty training model, to assess its face and content validity, and to present evidence supporting its ability to distinguish between levels of surgical proficiency. Imaging data of a patient with a nasal septal deviation was selected for printing. Printing materials reproducing the mechanical properties of human tissues were selected based on literature review and prototype testing. Eight expert rhinologists, 6 senior residents, and 6 junior residents performed endoscopic septoplasties on the model and completed a postsimulation survey. Performance metrics in quality (final product analysis), efficiency (time), and safety (eg, perforation length, nares damage) were recorded and analyzed in a study-blind manner. The model was judged to be anatomically correct and the steps performed realistic, with scores of 4.05 ± 0.82 and 4.2 ± 1, respectively, on a 5-point Likert scale. Ninety-two percent of residents desired the simulator to be integrated into their teaching curriculum. Performance differed significantly between the expert and resident groups, supporting the model's ability to distinguish levels of surgical proficiency. Few simulator training models exist for septoplasty; our model incorporates 2 different materials mixed into the 3 relevant consistencies necessary to simulate septoplasty. Our findings provide evidence supporting the validity of the model. © 2016 ARS-AAOA, LLC.

  16. Readmissions and death after ICU discharge: development and validation of two predictive models.

    Directory of Open Access Journals (Sweden)

    Omar Badawi

Full Text Available INTRODUCTION: Early discharge from the ICU is desirable because it shortens time in the ICU and reduces care costs, but can also increase the likelihood of ICU readmission and post-discharge unanticipated death if patients are discharged before they are stable. We postulated that, using eICU® Research Institute (eRI) data from >400 ICUs, we could develop robust models predictive of post-discharge death and readmission that may be incorporated into future clinical information systems (CIS) to assist ICU discharge planning. METHODS: Retrospective, multi-center, exploratory cohort study of ICU survivors within the eRI database between 1/1/2007 and 3/31/2011. EXCLUSION CRITERIA: DNR or care limitations at ICU discharge and discharge to a location external to the hospital. Patients were randomized (2:1) to development and validation cohorts. Multivariable logistic regression was performed on a broad range of variables including: patient demographics, ICU admission diagnosis, admission severity of illness, laboratory values and physiologic variables present during the last 24 hours of the ICU stay. Multiple imputation was used to address missing data. The primary outcomes were the areas under the receiver operator characteristic curves (auROC) in the validation cohorts for the models predicting readmission and death within 48 hours of ICU discharge. RESULTS: 469,976 and 234,987 patients representing 219 hospitals were in the development and validation cohorts, respectively. Early ICU readmission and death were experienced by 2.54% and 0.92% of all patients, respectively. The relationship between predictors and outcomes (death vs readmission) differed, justifying the need for separate models. The models for early readmission and death produced auROCs of 0.71 and 0.92, respectively. Both models calibrated well across risk groups. CONCLUSIONS: Our models for death and readmission after ICU discharge showed good to excellent discrimination and good calibration.
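The auROC figures reported above have a simple rank interpretation: the probability that a randomly chosen patient who experienced the outcome received a higher predicted risk than a randomly chosen patient who did not. A minimal sketch using that Mann-Whitney formulation, with toy scores rather than any values from the study:

```python
def auroc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney interpretation:
    the probability a random positive outranks a random negative,
    counting score ties as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: the model separates outcome (1) from no-outcome (0) imperfectly.
y = [1, 1, 0, 0, 0]
s = [0.9, 0.4, 0.5, 0.2, 0.1]
print(auroc(y, s))
```

An auROC of 0.5 corresponds to chance ranking and 1.0 to perfect separation, which is why the mortality model's 0.92 above counts as excellent discrimination and the readmission model's 0.71 as good.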

  17. Development of a new model to predict indoor daylighting: Integration in CODYRUN software and validation

    Energy Technology Data Exchange (ETDEWEB)

    Fakra, A.H., E-mail: fakra@univ-reunion.f [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France); Miranville, F.; Boyer, H.; Guichard, S. [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France)

    2011-07-15

Research highlights: • This study presents a new model capable of simulating indoor daylighting. • The model was introduced into research software called CODYRUN. • The code was validated against a large number of test cases. -- Abstract: Many models exist in the scientific literature for determining indoor daylighting values. They are classified in three categories: numerical, simplified and empirical models. Nevertheless, none of these categories of models is convenient for every application. Indeed, numerical models require high calculation times; the conditions of use of simplified models are limited; and experimental models need not only important financial resources but also perfect control of the experimental devices (e.g. scale models), as well as of the climatic characteristics of the location (e.g. for in situ experiments). In this article, a new model based on a combination of multiple simplified models is established, with the objective of improving this category of model. The originality of our paper lies in the coupling of several simplified models of indoor daylighting calculation. The accuracy of the simulation code, introduced into the CODYRUN software to correctly simulate indoor illuminance, is then verified. The software is a numerical building simulation code developed in the Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT) at the University of Reunion. Initially dedicated to thermal, airflow and hydrous phenomena in buildings, the software has been extended to the calculation of indoor daylighting. New models and algorithms, which rely on a semi-detailed approach, are presented in this paper. To validate the accuracy of the integrated models, and to prove that the new model properly simulates indoor illuminance, many test cases have been considered: analytical tests, inter-software comparisons and experimental comparisons.

  18. Power-based electric vehicle energy consumption model: Model development and validation

    International Nuclear Information System (INIS)

    Fiori, Chiara; Ahn, Kyoungho; Rakha, Hesham A.

    2016-01-01

Highlights: • The study developed an instantaneous energy consumption model (VT-CPEM) for EVs. • The model captures instantaneous braking energy regeneration. • The model can be used for transportation modeling and vehicle applications (e.g. eco-routing). • The proposed model can be easily calibrated using publicly available EV data. • Use of the air conditioning and heating systems reduces EV energy consumption by up to 10% and 24%, respectively. - Abstract: The limited drive range (the maximum distance an EV can travel) of Electric Vehicles (EVs) is one of the major challenges that EV manufacturers are attempting to overcome. To this end, a simple, accurate, and efficient energy consumption model is needed to develop real-time eco-driving and eco-routing systems that can enhance the energy efficiency of EVs and thus extend their travel range. Although numerous publications have focused on the modeling of EV energy consumption levels, these studies are limited to measuring the energy consumption of an EV's control algorithm, macro-project evaluations, or simplified well-to-wheels analyses. Consequently, this paper addresses this need by developing a simple EV energy model that computes an EV's instantaneous energy consumption using second-by-second vehicle speed, acceleration and roadway grade data as input variables. In doing so, the model estimates the instantaneous braking energy regeneration. The proposed model can be easily implemented in the following applications: in-vehicle and smartphone eco-driving, eco-routing, and transportation simulation software, to quantify the network-wide energy consumption levels for a fleet of EVs. One of the main advantages of EVs is their ability to recover energy while braking using a regenerative braking system. State-of-the-art vehicle energy consumption models consider an average constant regenerative braking energy efficiency or regenerative braking factors.
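The power-based structure described above can be sketched as wheel power computed from a force balance (inertia, grade, and speed-dependent resistance), with negative wheel power recovered at a lower regenerative efficiency. All coefficients here are illustrative placeholders, not VT-CPEM's calibrated values:

```python
# Illustrative vehicle parameters (assumed, not VT-CPEM's calibration).
M, G = 1500.0, 9.81              # mass [kg], gravity [m/s^2]
C0, C1, C2 = 150.0, 5.0, 0.35    # resistance terms [N], [N*s/m], [N*s^2/m^2]
ETA_DRIVE, ETA_REGEN = 0.90, 0.60  # drivetrain and regenerative efficiencies

def battery_power(v, a, grade=0.0):
    """Instantaneous battery power [W] from speed v [m/s], acceleration
    a [m/s^2] and road grade. Positive wheel power is drawn through the
    drivetrain efficiency; negative wheel power (braking) is recovered
    at the lower regenerative efficiency."""
    force = M * a + M * G * grade + C0 + C1 * v + C2 * v * v
    p_wheel = force * v
    return p_wheel / ETA_DRIVE if p_wheel >= 0 else p_wheel * ETA_REGEN

print(round(battery_power(20.0, 0.5), 1))   # accelerating: draws power
print(round(battery_power(20.0, -1.5), 1))  # hard braking: negative (regen)
```

Integrating this power over second-by-second speed traces yields trip energy, which is how such a model plugs into eco-routing or network-level simulation.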

  19. Developing and Validating the Socio-Technical Model in Ontology Engineering

    Science.gov (United States)

    Silalahi, Mesnan; Indra Sensuse, Dana; Giri Sucahyo, Yudho; Fadhilah Akmaliah, Izzah; Rahayu, Puji; Cahyaningsih, Elin

    2018-03-01

This paper describes results from an attempt to develop a model in ontology engineering methodology and a way to validate the model. The approach to methodology in ontology engineering is from the point of view of socio-technical system theory. Qualitative research synthesis, using meta-ethnography, is employed to build the model. In order to ensure the objectivity of the measurement, an inter-rater reliability method was applied using multi-rater Fleiss' kappa. The results show the accordance of the research output with the diamond model in socio-technical system theory, as evidenced by the interdependency of the four socio-technical variables, namely people, technology, structure and task.
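Fleiss' kappa, the multi-rater agreement statistic used above, can be computed directly from a subject-by-category table of rater counts. A compact sketch:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a table ratings[i][j] = number of raters who
    assigned subject i to category j (equal number of raters per subject).
    kappa = (P_bar - P_e) / (1 - P_e)."""
    N = len(ratings)            # subjects
    n = sum(ratings[0])         # raters per subject
    k = len(ratings[0])         # categories
    # Marginal proportion of assignments falling in each category.
    p_j = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    # Per-subject observed agreement.
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings]
    P_bar = sum(P_i) / N
    P_e = sum(p * p for p in p_j)  # chance agreement
    return (P_bar - P_e) / (1 - P_e)

# Perfect agreement: 4 raters, 3 subjects, 2 categories.
print(fleiss_kappa([[4, 0], [0, 4], [4, 0]]))  # 1.0
```

Kappa is 1 for perfect agreement, 0 when agreement is at chance level, and negative when raters agree less often than chance would predict.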

  20. Development and Validation of a Mathematical Model for Olive Oil Oxidation

    Science.gov (United States)

    Rahmouni, K.; Bouhafa, H.; Hamdi, S.

    2009-03-01

A mathematical model describing the stability, or susceptibility to oxidation, of extra virgin olive oil (EVOO) has been developed. The model was solved iteratively using a finite difference method and validated with experimental data on EVOO oxidation. EVOO stability was tested using a Rancimat at four temperatures (60, 70, 80 and 90 °C) until peroxide accumulation reached 20 meq/kg. Peroxide formation is relatively slow and fits a zero-order reaction, with linear regression coefficients ranging from 0.98 to 0.99. The mathematical model was used to predict the shelf life of bulk-conditioned olive oil, describing peroxide accumulation inside a container with excess oxygen as a function of time at various positions from the air/oil interface. Good correlations were obtained between theoretical and experimental values.
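Zero-order kinetics make the shelf-life calculation linear: the peroxide value grows as PV(t) = PV0 + k·t, and rate constants measured at elevated Rancimat temperatures can be extrapolated to storage temperature with an Arrhenius law. A sketch with assumed (not fitted) parameters, chosen only to produce a plausible order of magnitude:

```python
import math

R = 8.314  # gas constant [J/(mol*K)]

def arrhenius_k(T_celsius, A, Ea):
    """Zero-order rate constant at temperature T via k = A * exp(-Ea / RT)."""
    return A * math.exp(-Ea / (R * (T_celsius + 273.15)))

def shelf_life_days(pv0, k, pv_limit=20.0):
    """Zero-order accumulation PV(t) = PV0 + k*t, so the time to reach the
    peroxide limit (20 meq/kg, as in the study) is (limit - PV0) / k."""
    return (pv_limit - pv0) / k

# Assumed Arrhenius parameters for illustration (not the paper's values):
A, Ea = 3.0e12, 80e3   # pre-exponential [meq/kg/day], activation energy [J/mol]
k25 = arrhenius_k(25.0, A, Ea)
print(round(shelf_life_days(pv0=8.0, k=k25)), "days at 25 C")
```

The same two functions cover the accelerated tests: evaluating arrhenius_k at 60 to 90 °C gives the much larger rate constants seen in the Rancimat runs.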

  1. Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads

    Directory of Open Access Journals (Sweden)

    Jae Sang Moon

    2017-12-01

Full Text Available Wind turbines within an array do not experience free-stream undisturbed flow fields. Rather, the flow fields on internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines only considers a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field-related parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the simulated wind fields based on the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study's overall objective is to offer efficient and validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.

  2. Infant bone age estimation based on fibular shaft length: model development and clinical validation

    International Nuclear Information System (INIS)

    Tsai, Andy; Stamoulis, Catherine; Bixby, Sarah D.; Breen, Micheal A.; Connolly, Susan A.; Kleinman, Paul K.

    2016-01-01

    Bone age in infants (<1 year old) is generally estimated using hand/wrist or knee radiographs, or by counting ossification centers. The accuracy and reproducibility of these techniques are largely unknown. To develop and validate an infant bone age estimation technique using fibular shaft length and compare it to conventional methods. We retrospectively reviewed negative skeletal surveys of 247 term-born low-risk-of-abuse infants (no persistent child protection team concerns) from July 2005 to February 2013, and randomized them into two datasets: (1) model development (n = 123) and (2) model testing (n = 124). Three pediatric radiologists measured all fibular shaft lengths. An ordinary linear regression model was fitted to dataset 1, and the model was evaluated using dataset 2. Readers also estimated infant bone ages in dataset 2 using (1) the hemiskeleton method of Sontag, (2) the hemiskeleton method of Elgenmark, (3) the hand/wrist atlas of Greulich and Pyle, and (4) the knee atlas of Pyle and Hoerr. For validation, we selected lower-extremity radiographs of 114 normal infants with no suspicion of abuse. Readers measured the fibulas and also estimated bone ages using the knee atlas. Bone age estimates from the proposed method were compared to the other methods. The proposed method outperformed all other methods in accuracy and reproducibility. Its accuracy was similar for the testing and validating datasets, with root-mean-square error of 36 days and 37 days; mean absolute error of 28 days and 31 days; and error variability of 22 days and 20 days, respectively. This study provides strong support for an infant bone age estimation technique based on fibular shaft length as a more accurate alternative to conventional methods. (orig.)
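
The core of the technique is an ordinary linear regression of bone age on fibular shaft length, evaluated on a held-out set with the same RMSE and mean-absolute-error metrics the authors report. A sketch on synthetic, noise-free data (the regression coefficients are illustrative, not the study's):

```python
import numpy as np

# Fit bone age (days) against fibular shaft length (mm) on a development set,
# then compute RMSE and MAE on a test set, mirroring the paper's metrics.
rng = np.random.default_rng(1)
length_dev = rng.uniform(50, 110, 30)   # development set, mm (synthetic)
age_dev = 3.0 * length_dev - 100        # noise-free illustrative ages

slope, intercept = np.polyfit(length_dev, age_dev, 1)

length_test = np.array([60.0, 80.0, 100.0])
age_true = 3.0 * length_test - 100
age_pred = slope * length_test + intercept

rmse = np.sqrt(np.mean((age_pred - age_true) ** 2))
mae = np.mean(np.abs(age_pred - age_true))
print(round(rmse, 6), round(mae, 6))  # ~0.0 on noise-free data
```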

  3. Infant bone age estimation based on fibular shaft length: model development and clinical validation

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Andy; Stamoulis, Catherine; Bixby, Sarah D.; Breen, Micheal A.; Connolly, Susan A.; Kleinman, Paul K. [Boston Children' s Hospital, Harvard Medical School, Department of Radiology, Boston, MA (United States)

    2016-03-15

    Bone age in infants (<1 year old) is generally estimated using hand/wrist or knee radiographs, or by counting ossification centers. The accuracy and reproducibility of these techniques are largely unknown. To develop and validate an infant bone age estimation technique using fibular shaft length and compare it to conventional methods. We retrospectively reviewed negative skeletal surveys of 247 term-born low-risk-of-abuse infants (no persistent child protection team concerns) from July 2005 to February 2013, and randomized them into two datasets: (1) model development (n = 123) and (2) model testing (n = 124). Three pediatric radiologists measured all fibular shaft lengths. An ordinary linear regression model was fitted to dataset 1, and the model was evaluated using dataset 2. Readers also estimated infant bone ages in dataset 2 using (1) the hemiskeleton method of Sontag, (2) the hemiskeleton method of Elgenmark, (3) the hand/wrist atlas of Greulich and Pyle, and (4) the knee atlas of Pyle and Hoerr. For validation, we selected lower-extremity radiographs of 114 normal infants with no suspicion of abuse. Readers measured the fibulas and also estimated bone ages using the knee atlas. Bone age estimates from the proposed method were compared to the other methods. The proposed method outperformed all other methods in accuracy and reproducibility. Its accuracy was similar for the testing and validating datasets, with root-mean-square error of 36 days and 37 days; mean absolute error of 28 days and 31 days; and error variability of 22 days and 20 days, respectively. This study provides strong support for an infant bone age estimation technique based on fibular shaft length as a more accurate alternative to conventional methods. (orig.)

  4. Comorbidity predicts poor prognosis in nasopharyngeal carcinoma: Development and validation of a predictive score model

    International Nuclear Information System (INIS)

    Guo, Rui; Chen, Xiao-Zhong; Chen, Lei; Jiang, Feng; Tang, Ling-Long; Mao, Yan-Ping; Zhou, Guan-Qun; Li, Wen-Fei; Liu, Li-Zhi; Tian, Li; Lin, Ai-Hua; Ma, Jun

    2015-01-01

    Background and purpose: The impact of comorbidity on prognosis in nasopharyngeal carcinoma (NPC) is poorly characterized. Material and methods: Using the Adult Comorbidity Evaluation-27 (ACE-27) system, we assessed the prognostic value of comorbidity and developed, validated and confirmed a predictive score model in a training set (n = 658), internal validation set (n = 658) and independent set (n = 652) using area under the receiver operating curve analysis. Results: Comorbidity was present in 40.4% of 1968 patients (mild, 30.1%; moderate, 9.1%; severe, 1.2%). Compared to an ACE-27 score ≤1, patients with an ACE-27 score >1 in the training set had shorter overall survival (OS) and disease-free survival (DFS) (both P < 0.001); similar results were obtained in the other sets (P < 0.05). In multivariate analysis, ACE-27 score was a significant independent prognostic factor for OS and DFS. The combined risk score model including ACE-27 had superior prognostic value to TNM stage alone in the internal validation set (0.70 vs. 0.66; P = 0.02), independent set (0.73 vs. 0.67; P = 0.002) and all patients (0.71 vs. 0.67; P < 0.001). Conclusions: Comorbidity significantly affects prognosis, especially in stages II and III, and should be incorporated into the TNM staging system for NPC. Assessment of comorbidity may improve outcome prediction and help tailor individualized treatment.
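
The comparison of predictive value above rests on the area under the receiver operating curve, which can be computed directly from the Mann-Whitney statistic. A sketch with illustrative scores (not the study's data):

```python
# Area under the ROC curve via the Mann-Whitney statistic: the probability
# that a randomly chosen positive case outscores a randomly chosen negative
# one, with ties counting half. Scores and labels below are illustrative.

def auc(scores, labels):
    """AUC = P(score of a positive > score of a negative), ties count 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
stage_only = [0.9, 0.4, 0.6, 0.5, 0.3, 0.2]   # hypothetical TNM-only scores
combined = [0.9, 0.7, 0.6, 0.5, 0.3, 0.2]     # hypothetical combined scores
print(auc(stage_only, labels), auc(combined, labels))  # 0.8888888888888888 1.0
```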

  5. The development and validation of a five-factor model of Sources of Self-Efficacy in clinical nursing education

    NARCIS (Netherlands)

    Gloudemans, H.; Reynaert, W.; Schalk, R.; Braeken, J.

    2013-01-01

    Background: The aim of this study is to validate a newly developed nurses' self-efficacy sources inventory. We test the validity of a five-dimensional model of sources of self-efficacy, which we contrast with the traditional four-dimensional model based on Bandura’s theoretical

  6. The development and validation of a five factor model of sources of self-efficacy in clinical nursing education

    NARCIS (Netherlands)

    Prof. Dr. Rene Schalk; dr. Wouter Reynaert; Dr. Johan Braeken; Drs. Henk Gloudemans

    2012-01-01

    Background: The aim of this study is to validate a newly developed nurses' self-efficacy sources inventory. We test the validity of a five-dimensional model of sources of self-efficacy, which we contrast with the traditional four-dimensional model based on Bandura's theoretical concepts. Methods:

  7. The development and validation of a five-factor model of sources of self-efficacy in clinical nursing education

    NARCIS (Netherlands)

    Gloudemans, H.; Schalk, R.; Reynaert, W.M.; Braeken, J.

    2013-01-01

    Background: The aim of this study is to validate a newly developed nurses' self-efficacy sources inventory. We test the validity of a five-dimensional model of sources of self-efficacy, which we contrast with the traditional four-dimensional model based on Bandura’s theoretical concepts. Methods:

  8. Development and validation of the ENIGMA code for MOX fuel performance modelling

    International Nuclear Information System (INIS)

    Palmer, I.; Rossiter, G.; White, R.J.

    2000-01-01

    The ENIGMA fuel performance code has been under development in the UK since the mid-1980s with contributions made by both the fuel vendor (BNFL) and the utility (British Energy). In recent years it has become the principal code for UO₂ fuel licensing for both PWR and AGR reactor systems in the UK and has also been used by BNFL in support of overseas UO₂ and MOX fuel business. A significant new programme of work has recently been initiated by BNFL to further develop the code specifically for MOX fuel application. Model development is proceeding hand in hand with a major programme of MOX fuel testing and PIE studies, with the objective of producing a fuel modelling code suitable for mechanistic analysis, as well as for licensing applications. This paper gives an overview of the model developments being undertaken and of the experimental data being used to underpin and to validate the code. The paper provides a summary of the code development programme together with specific examples of new models produced. (author)

  9. Development and validation of a weight-bearing finite element model for total knee replacement.

    Science.gov (United States)

    Woiczinski, M; Steinbrück, A; Weber, P; Müller, P E; Jansson, V; Schröder, Ch

    2016-01-01

    Total knee arthroplasty (TKA) is a successful procedure for osteoarthritis. However, some patients (19%) do have pain after surgery. A finite element model was developed based on boundary conditions of a knee rig. A 3D model of an anatomical full leg was generated from magnetic resonance image data and a total knee prosthesis was implanted without patella resurfacing. In the finite element model, a restarting procedure was programmed in order to hold the ground reaction force constant with an adapted quadriceps muscle force during a squat from 20° to 105° of flexion. Knee rig experimental data were used to validate the numerical model in the patellofemoral and femorotibial joint. Furthermore, sensitivity analyses of Young's modulus of the patella cartilage, posterior cruciate ligament (PCL) stiffness, and patella tendon origin were performed. Pearson's correlations for retropatellar contact area, pressure, patella flexion, and femorotibial ap-movement were close to 1. The lowest root mean square errors for retropatellar pressure, patella flexion, and femorotibial ap-movement were found for the baseline model setup with a Young's modulus of 5 MPa for patella cartilage, a PCL stiffness downscaled to 25% of the literature value and an anatomical origin of the patella tendon. The results of the conducted finite element model are comparable with the experimental results. Therefore, the finite element model developed in this study can be used for further clinical investigations and will help to better understand the clinical aspects after TKA with an unresurfaced patella.
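
The validation criteria here, Pearson's correlation close to 1 and a low root-mean-square error between simulation and knee-rig measurement, can be computed as follows; the arrays are illustrative stand-ins, not the study's pressure curves:

```python
import numpy as np

# Validation metrics of the kind used in the study: Pearson's r and RMSE
# between finite element predictions and rig measurements. Values are
# illustrative stand-ins for, e.g., retropatellar pressure over flexion.
sim = np.array([1.0, 2.1, 2.9, 4.2, 5.0])   # simulated
exp = np.array([1.1, 2.0, 3.0, 4.0, 5.2])   # measured

r = np.corrcoef(sim, exp)[0, 1]
rmse = np.sqrt(np.mean((sim - exp) ** 2))
print(r > 0.99, rmse < 0.2)  # True True: near-perfect correlation, small error
```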

  10. Assessment for Complex Learning Resources: Development and Validation of an Integrated Model

    Directory of Open Access Journals (Sweden)

    Gudrun Wesiak

    2013-01-01

    Full Text Available Today’s e-learning systems meet the challenge to provide interactive, personalized environments that support self-regulated learning as well as social collaboration and simulation. At the same time assessment procedures have to be adapted to the new learning environments by moving from isolated summative assessments to integrated assessment forms. Therefore, learning experiences enriched with complex didactic resources - such as virtualized collaborations and serious games - have emerged. In this extension of [1] an integrated model for e-assessment (IMA is outlined, which incorporates complex learning resources and assessment forms as main components for the development of an enriched learning experience. For a validation the IMA was presented to a group of experts from the fields of cognitive science, pedagogy, and e-learning. The findings from the validation lead to several refinements of the model, which mainly concern the component forms of assessment and the integration of social aspects. Both aspects are accounted for in the revised model, the former by providing a detailed sub-model for assessment forms.

  11. Development and Validation of Predictive Models of Cardiac Mortality and Transplantation in Resynchronization Therapy

    Directory of Open Access Journals (Sweden)

    Eduardo Arrais Rocha

    2015-01-01

    Full Text Available Abstract Background: 30-40% of cardiac resynchronization therapy cases do not achieve favorable outcomes. Objective: This study aimed to develop predictive models for the combined endpoint of cardiac death and transplantation (Tx) at different stages of cardiac resynchronization therapy (CRT). Methods: Prospective observational study of 116 patients aged 64.8 ± 11.1 years, 68.1% of whom had functional class (FC) III and 31.9% had ambulatory class IV. Clinical, electrocardiographic and echocardiographic variables were assessed by using Cox regression and Kaplan-Meier curves. Results: The cardiac mortality/Tx rate was 16.3% during the follow-up period of 34.0 ± 17.9 months. Prior to implantation, right ventricular dysfunction (RVD), ejection fraction < 25% and use of high doses of diuretics (HDD) increased the risk of cardiac death and Tx by 3.9-, 4.8-, and 5.9-fold, respectively. In the first year after CRT, RVD, HDD and hospitalization due to congestive heart failure increased the risk of death at hazard ratios of 3.5, 5.3, and 12.5, respectively. In the second year after CRT, RVD and FC III/IV were significant risk factors of mortality in the multivariate Cox model. The accuracy rates of the models were 84.6% at preimplantation, 93% in the first year after CRT, and 90.5% in the second year after CRT. The models were validated by bootstrapping. Conclusion: We developed predictive models of cardiac death and Tx at different stages of CRT based on the analysis of simple and easily obtainable clinical and echocardiographic variables. The models showed good accuracy and adjustment, were validated internally, and are useful in the selection, monitoring and counseling of patients indicated for CRT.
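
Internal validation by bootstrapping, as used in the study, amounts to resampling the cohort with replacement and recomputing the accuracy on each resample. A sketch on synthetic predictions (the outcomes and the ~90% accuracy are illustrative, not the study's results):

```python
import numpy as np

# Bootstrap validation sketch: resample the 116-patient cohort with
# replacement and recompute classification accuracy on each resample.
# Outcomes and predictions are synthetic, for illustration only.
rng = np.random.default_rng(2)
outcome = rng.integers(0, 2, 116)                             # endpoint (0/1)
pred = np.where(rng.random(116) < 0.9, outcome, 1 - outcome)  # ~90% accurate

boot_acc = []
for _ in range(500):
    idx = rng.integers(0, 116, 116)   # one bootstrap resample of indices
    boot_acc.append(np.mean(pred[idx] == outcome[idx]))

lo, hi = np.percentile(boot_acc, [2.5, 97.5])
print(lo, hi)  # bootstrap interval around the apparent accuracy
```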

  12. Development and validation of mechanical model for saturated/unsaturated bentonite buffer

    International Nuclear Information System (INIS)

    Yamamoto, S.; Komine, H.; Kato, S.

    2010-01-01

    Document available in extended abstract form only. Development and validation of mechanical models for bentonite buffer and backfill materials are important subjects in appropriately evaluating the long-term behaviour or condition of the EBS in radioactive waste disposal. The Barcelona Basic Model (BBM), an extension of the modified Cam-Clay model to unsaturated and expansive soil, has been developed and widely applied to several problems using the coupled THM code CODE_BRIGHT. An advantage of the model is that the mechanical characteristics of buffer and backfill materials under both saturated and unsaturated conditions are taken into account, as well as swelling characteristics due to wetting. In this study the BBM is compared with existing experimental data and a previously developed model in terms of the swelling characteristics of the Japanese bentonite Kunigel-V1, and is validated in terms of consolidation characteristics based on newly performed controlled-suction oedometer tests for the Kunigel-V1 bentonite. Komine et al. (2003) proposed a model (set of equations) for predicting swelling characteristics based on the diffuse double layer concept and the van der Waals force concept, and performed many swelling deformation tests of bentonite and sand-bentonite mixtures to confirm the applicability of the model. The BBM agrees well with the model proposed by Komine et al. and with the experimental data in terms of swelling characteristics. Compression and swelling indexes depending on suction are introduced in the BBM. Controlled-suction consolidation tests (oedometer tests) were performed to confirm the applicability of the suction-dependent indexes to unsaturated bentonite. Compacted bentonite with an initial dry density of 1.0 Mg/m³ was tested. A constant suction of 80 kPa, 280 kPa or 480 kPa was applied and maintained during the consolidation tests. Applicability of the BBM to consolidation and swelling behaviour of saturated and

  13. Validation of the HDM models for crack initiation and development, rutting and roughness of the pavement

    Directory of Open Access Journals (Sweden)

    Ognjenović Slobodan

    2017-01-01

    Full Text Available Worldwide practice recommends validation of the HDM models with other software that can be used for comparison of the forecasting results. The program package MATLAB is used in this case, as it enables modelling of all the HDM models. A statistical validation of the forecasts of pavement condition in HDM against field measurement results was also performed. This paper presents the results of the validation of the calibration coefficients of the deterioration models in HDM-4 on the Macedonian highways.

  14. Development and validation of models for bubble coalescence and breakup. Final report

    International Nuclear Information System (INIS)

    Liao, Y.; Lucas, D.

    2013-02-01

    A new generalized model for bubble coalescence and breakup has been developed. It is based on physical considerations and takes into account various mechanisms that can lead to bubble coalescence and breakup. First, in a detailed literature review, the available models were compiled and analyzed. It turned out that many of them show contradictory behaviour. None of these models allows the prediction of the evolution of bubble size distributions along a pipe flow for a wide range of combinations of flow rates of the gas and the liquid phase. The new model has been extensively studied in a simplified Test-Solver. Although this does not cover all details of a developing flow along the pipe, it allows - in contrast to a CFD code - a large number of variational calculations to investigate the influence of individual terms and models. Coalescence and breakup cannot be considered separately from other phenomena and the models that reflect them. There are close interactions with the turbulence of the liquid phase and the momentum exchange between phases. Since the dissipation rate of turbulent kinetic energy is a direct input parameter for the new model, the turbulence modelling has been studied very carefully. To validate the model, a special experimental series for air-water flows was used, conducted at the TOPFLOW facility in an 8-meter long DN200 pipe. The data are characterized by high quality and were produced within the TOPFLOW-II project. The test series aims to provide a basis for the work presented here. Prediction of the evolution of the bubble size distribution along the pipe could be improved significantly in comparison to the previous standard models for bubble coalescence and breakup implemented in CFX. However, some quantitative discrepancies remain. The full model equations as well as an implementation as "User-FORTRAN" in CFX are available and can be used for further work on the simulation of poly-disperse bubbly flows.

  15. Development and validation of corium oxidation model for the VAPEX code

    International Nuclear Information System (INIS)

    Blinkov, V.N.; Melikhov, V.I.; Davydov, M.V.; Melikhov, O.I.; Borovkova, E.M.

    2011-01-01

    In light water reactor core melt accidents, the molten fuel (corium) can be brought into contact with coolant water in the course of the melt relocation in-vessel and ex-vessel, as well as in an accident mitigation action of water addition. Mechanical energy release from such an interaction is of interest in evaluating the structural integrity of the reactor vessel as well as of the containment. Usually, the source for the energy release is considered to be the rapid transfer of heat from the molten fuel to the water ('vapor explosion'). When the fuel contains a chemically reactive metal component, there can be an additional source for the energy release, namely the heat release and hydrogen production due to the metal-water chemical reaction. At the Electrogorsk Research and Engineering Center the computer code VAPEX (VAPor EXplosion) has been developed for analysis of the molten fuel coolant interaction. A multifield approach is used for modeling the dynamics of the following phases: water, steam, melt jet, melt droplets, and debris. The VAPEX code was successfully validated on FARO experimental data. Hydrogen generation was observed in the FARO tests even though the corium did not contain a metal component. As the reason for this hydrogen generation was not clear, a simplified empirical model of hydrogen generation was implemented in the VAPEX code to account for the contribution of hydrogen to the pressure increase. This paper describes a new, more detailed model of hydrogen generation due to the metal-water chemical reaction and the results of its validation on the ZREX experiments. (orig.)
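
The metal-water chemistry underlying such a hydrogen-generation model is, for zirconium, Zr + 2 H2O → ZrO2 + 2 H2. A stoichiometric sketch of the hydrogen source term (stoichiometry only; the kinetics of the VAPEX model are not reproduced here):

```python
# Hydrogen source term from the zirconium-water reaction,
# Zr + 2 H2O -> ZrO2 + 2 H2, the kind of metal-water chemistry that a
# hydrogen-generation model represents. Stoichiometry only, no kinetics.
M_ZR = 91.224    # g/mol, molar mass of zirconium
M_H2 = 2.016     # g/mol, molar mass of hydrogen
VM_STP = 22.414  # L/mol, ideal-gas molar volume at 0 degC, 1 atm

def h2_from_zr(mass_zr_kg):
    """Mass (kg) and STP volume (m^3) of H2 from full oxidation of Zr."""
    mol_zr = mass_zr_kg * 1000.0 / M_ZR
    mol_h2 = 2.0 * mol_zr   # 2 mol H2 per mol Zr
    return mol_h2 * M_H2 / 1000.0, mol_h2 * VM_STP / 1000.0

m_h2, v_h2 = h2_from_zr(1.0)
print(round(m_h2, 3), round(v_h2, 3))  # ~0.044 kg and ~0.491 m^3 per kg Zr
```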

  16. Non-isothermal processes during the drying of bare soil: Model Development and Validation

    Science.gov (United States)

    Sleep, B.; Talebi, A.; O'Carrol, D. M.

    2017-12-01

    Several coupled liquid water, water vapor, and heat transfer models have been developed either to study non-isothermal processes in the subsurface immediately below the ground surface, or to predict the evaporative flux from the ground surface. Equilibrium phase change between the water and gas phases is typically assumed in these models. Recently, a few studies have questioned this assumption and proposed coupled models considering kinetic phase change. However, none of these models were validated against real field data. In this study, a non-isothermal coupled model incorporating kinetic phase change was developed and examined against measured data from a green roof test module. The model also incorporated a new surface boundary condition for water vapor transport at the ground surface. The measured field data included soil moisture content and temperature at different depths down to 15 cm below the ground surface. Lysimeter data were collected to determine the evaporation rates. Short- and long-wave radiation, wind velocity, ambient air temperature and relative humidity were measured and used as model input. Field data were collected for a period of three months during the warm seasons in southeastern Canada. The model was calibrated using one drying period and then several other drying periods were simulated. In general, the model underestimated the evaporation rates in the early stage of the drying period; however, the cumulative evaporation was in good agreement with the field data. The model predicted the trends in temperature and moisture content at the different depths in the green roof module. The simulated temperature was lower than the measured temperature for most of the simulation time, with a maximum difference of 5 °C. The simulated moisture content changes had the same temporal trend as the lysimeter data for the events simulated.

  17. The development and validation of a numerical integration method for non-linear viscoelastic modeling

    Science.gov (United States)

    Ramo, Nicole L.; Puttlitz, Christian M.

    2018-01-01

    Compelling evidence that many biological soft tissues display both strain- and time-dependent behavior has led to the development of fully non-linear viscoelastic modeling techniques to represent the tissue’s mechanical response under dynamic conditions. Since the current stress state of a viscoelastic material is dependent on all previous loading events, numerical analyses are complicated by the requirement of computing and storing the stress at each step throughout the load history. This requirement quickly becomes computationally expensive, and in some cases intractable, for finite element models. Therefore, we have developed a strain-dependent numerical integration approach for capturing non-linear viscoelasticity that enables calculation of the current stress from a strain-dependent history state variable stored from the preceding time step only, which improves both fitting efficiency and computational tractability. This methodology was validated based on its ability to recover non-linear viscoelastic coefficients from simulated stress-relaxation (six strain levels) and dynamic cyclic (three frequencies) experimental stress-strain data. The model successfully fit each data set with average errors in recovered coefficients of 0.3% for stress-relaxation fits and 0.1% for cyclic. The results support the use of the presented methodology to develop linear or non-linear viscoelastic models from stress-relaxation or cyclic experimental data of biological soft tissues. PMID:29293558
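
The key idea, computing the current stress from a single history state variable stored at the preceding time step rather than from the full load history, can be illustrated with the linear one-term Prony-series recurrence that such schemes generalize. The material parameters are illustrative, not the paper's:

```python
import math

# Recursive single-history-variable update for a one-term Prony series:
# the stress at step n+1 needs only the strain increment and one internal
# variable h from step n, not the whole load history. Linear case shown.
E_INF, E1, TAU = 1.0, 0.5, 2.0   # long-term modulus, branch modulus, time (s)

def step(h_prev, d_eps, dt):
    """Advance the internal variable h (branch stress) by one time step."""
    a = math.exp(-dt / TAU)
    # exact integration of dh/dt = -h/TAU + E1 * d_eps/dt over the step
    return a * h_prev + E1 * (1 - a) * (TAU / dt) * d_eps

# Stress relaxation: strain jumps to 0.01 and is then held constant.
dt, eps = 0.01, 0.01
h = step(0.0, eps, dt)            # loading step
for _ in range(100000):           # hold for 1000 s >> TAU
    h = step(h, 0.0, dt)
sigma = E_INF * eps + h
print(abs(sigma - E_INF * eps) < 1e-6)  # True: the branch has fully relaxed
```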

  18. Anode partial flooding modelling of proton exchange membrane fuel cells: Model development and validation

    International Nuclear Information System (INIS)

    Xing, Lei; Du, Shangfeng; Chen, Rui; Mamlouk, Mohamed; Scott, Keith

    2016-01-01

    A two-dimensional along-the-channel CFD (computational fluid dynamic) model, coupled with a two-phase flow model of liquid water and gas transport for a PEM (proton exchange membrane) fuel cell is described. The model considers non-isothermal operation and thus the non-uniform temperature distribution in the cell structure. Water phase-transfer between the vapour, liquid water and dissolved phases is modelled with the combinational transport mechanism through the membrane. Liquid water saturation is simulated inside the electrodes and channels at both the anode and cathode sides. Three types of models are compared for the HOR (hydrogen oxidation reaction) and ORR (oxygen reduction reaction) in the catalyst layers: Butler–Volmer (B–V), liquid-water-saturation-corrected B–V and agglomerate mechanisms. Temperature changes in the MEA (membrane electrode assembly) and channels due to electrochemical reaction, ohmic resistance and water phase-transfer are analysed as a function of current density. Nonlinear relations of liquid water saturations with respect to current densities at both the anode and cathode are regressed. At low and high current densities, liquid water saturation at the anode increases linearly as a consequence of the linear increase of liquid water saturation at the cathode. In contrast, an exponential relation is found to be more accurate at medium current densities. - Highlights: • A fully coupled 2D, along-the-channel, two-phase flow, non-isothermal, CFD model is developed. • Temperature rise due to electrochemical reactions, ohmic resistance and water phase-transfer is analysed. • Mathematical expressions of liquid water saturation against current density at anode and cathode are regressed. • Relationship between the liquid water saturation at anode and cathode is built.
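
Of the three electrode models compared, the plain Butler–Volmer form is the simplest: current density as a function of overpotential. A sketch with illustrative exchange current density and transfer coefficients (not the paper's fitted values):

```python
import math

# Butler-Volmer electrode kinetics, one of the three models the study
# compares for the HOR/ORR. Exchange current density and transfer
# coefficients below are illustrative, not the paper's values.
F, R, T = 96485.0, 8.314, 353.15   # C/mol, J/(mol K), 80 degC cell

def butler_volmer(eta, j0=1e-3, alpha_a=0.5, alpha_c=0.5):
    """Current density (A/cm^2) at overpotential eta (V)."""
    return j0 * (math.exp(alpha_a * F * eta / (R * T))
                 - math.exp(-alpha_c * F * eta / (R * T)))

print(butler_volmer(0.0))        # 0.0 at equilibrium
print(butler_volmer(0.1) > 0.0)  # True: anodic current for positive eta
```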

  19. Development and validation of a model for CANDU-6 SDS2 poison injection analysis

    International Nuclear Information System (INIS)

    Lee, B. W.; Jung, C. J.; Min, B. J.; Yoon, H. J.; Choi, J. H.; Jang, D. S.

    2002-01-01

    In CANDU-6 reactors there are two independent reactor shutdown systems. Shutdown system no. 2 (SDS2) injects liquid poison into the moderator tank at high pressure via small holes on the 6 nozzle pipes and stops the nuclear chain reaction. To ensure the safe shutdown of a reactor loaded with either DUPIC or SEU fuels, the poison curtains generated by the jets must provide quick and sufficient negative reactivity to the reactor during the early stage of the accident. The poison concentration distribution during the transient is needed to produce the neutron cross sections required for this work. The motivation for this work arose from the fact that the computer code package for performing this task has not yet been transferred to Korea. In this study, a set of models for analyzing the transient poison concentration induced by this high-pressure poison injection jet, activated upon reactor trip in a CANDU-6 moderator tank, has been developed and used to generate the poison concentration distribution of the poison curtains induced by the high-pressure jets injected into the vacant region between the pressure tube banks. The poison injection rate through the jet holes drilled on the nozzle pipes is obtained by a 1-D transient hydrodynamic code, ALITRIG, and this injection rate is used to provide the inlet boundary condition to a 3-D CFD model of the moderator tank based on CFX4.3, a commercial CFD code developed by AEA Technology, to simulate the formation of the poison jet curtain inside the moderator tank. For validation, a simulation of a generic CANDU-6 SDS2 poison jet growth experiment was performed to evaluate the model's capability against experiment. As no concentration field was measured and only the growth of the poison jet height was obtained by high-speed camera, the validation was limited as such. The results showed that, if one assumes the jet front corresponds to 200 ppm of poison, the model succeeded to

  20. A diagnostic model for the detection of sensitization to wheat allergens was developed and validated in bakery workers

    NARCIS (Netherlands)

    Suarthana, Eva; Vergouwe, Yvonne; Moons, Karel G.; de Monchy, Jan; Grobbee, Diederick; Heederik, Dick; Meijer, Evert

    Objectives: To develop and validate a prediction model to detect sensitization to wheat allergens in bakery workers. Study Design and Setting: The prediction model was developed in 867 Dutch bakery workers (development set, prevalence of sensitization 13%) and included questionnaire items (candidate

  1. Development and validation of a dynamical atmosphere-vegetation-soil HTO transport and OBT formation model

    Energy Technology Data Exchange (ETDEWEB)

    Ota, Masakazu, E-mail: ohta.masakazu@jaea.go.jp [Research Group for Environmental Science, Division of Environment and Radiation, Nuclear Science and Engineering Directorate, Japan Atomic Energy Agency (Japan); Nagai, Haruyasu [Research Group for Environmental Science, Division of Environment and Radiation, Nuclear Science and Engineering Directorate, Japan Atomic Energy Agency (Japan)

    2011-09-15

    A numerical model simulating the transport of tritiated water (HTO) in the atmosphere-soil-vegetation system and the accumulation of organically bound tritium (OBT) in vegetative leaves was developed. A characteristic of the model is that, for calculating tritium transport, it incorporates a dynamical atmosphere-soil-vegetation model (SOLVEG-II) that calculates the transport of heat and water and the exchange of CO₂. The processes included for calculating tissue free water tritium (TFWT) in leaves are HTO exchange between canopy air and leaf cellular water, root uptake of aqueous HTO in soil, photosynthetic assimilation of TFWT into OBT, and TFWT formation from OBT through respiration. Tritium fluxes from the last two processes are input to a carbohydrate compartment model in leaves that calculates OBT translocation from leaves and allocation within them, using the photosynthesis and respiration rates in leaves. The developed model was then validated through a simulation of an existing experiment on acute exposure of grape plants to atmospheric HTO. The calculated TFWT concentration in leaves increased soon after the start of HTO exposure, reaching equilibrium with the atmospheric HTO within a few hours, and then rapidly decreased after the end of the exposure. The calculated non-exchangeable OBT amount in leaves increased linearly during the exposure and, after the exposure, decreased rapidly in daytime and moderately at night. These variations in the calculated TFWT concentrations and OBT amounts, mainly controlled by HTO exchange between canopy air and leaf cellular water and by carbohydrate translocation from leaves, respectively, agreed fairly well with the observations within average errors of a factor of two. - Highlights: • TFWT retention and OBT formation in leaves were modeled. • The model fairly well calculates the TFWT concentration after an acute HTO exposure. • The model well assesses OBT formation and the attenuation of the OBT amount in leaves.

  2. Developing and validating a model to predict the success of an IHCS implementation: the Readiness for Implementation Model

    Science.gov (United States)

    Gustafson, David H; Hawkins, Robert P; Brennan, Patricia F; Dinauer, Susan; Johnson, Pauley R; Siegler, Tracy

    2010-01-01

    Objective To develop and validate the Readiness for Implementation Model (RIM). This model predicts a healthcare organization's potential for success in implementing an interactive health communication system (IHCS). The model consists of seven weighted factors, with each factor containing five to seven elements. Design Two decision-analytic approaches, self-explicated and conjoint analysis, were used to measure the weights of the RIM with a sample of 410 experts. The RIM model with weights was then validated in a prospective study of 25 IHCS implementation cases. Measurements Orthogonal main effects design was used to develop 700 conjoint-analysis profiles, which varied on seven factors. Each of the 410 experts rated the importance and desirability of the factors and their levels, as well as a set of 10 different profiles. For the prospective 25-case validation, three time-repeated measures of the RIM scores were collected for comparison with the implementation outcomes. Results Two of the seven factors, ‘organizational motivation’ and ‘meeting user needs,’ were found to be most important in predicting implementation readiness. No statistically significant difference was found in the predictive validity of the two approaches (self-explicated and conjoint analysis). The RIM was a better predictor for the 1-year implementation outcome than the half-year outcome. Limitations The expert sample, the order of the survey tasks, the additive model, and basing the RIM cut-off score on experience are possible limitations of the study. Conclusion The RIM needs to be empirically evaluated in institutions adopting IHCS and sustaining the system in the long term. PMID:20962135
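The additive, weighted-factor structure of the RIM described above can be sketched in a few lines. The factor names echo the abstract, but the weights and scores below are invented for illustration; the published RIM uses seven factors with five to seven elements each.

```python
# Additive weighted readiness score in the spirit of the RIM described above.
# Factor names follow the abstract; weights and scores are illustrative only.

def readiness_score(factor_scores, weights):
    """Weighted additive score, normalized by the total weight."""
    assert set(factor_scores) == set(weights)
    total = sum(weights.values())
    return sum(weights[f] * factor_scores[f] for f in factor_scores) / total

weights = {
    "organizational_motivation": 0.25,  # reported as most important
    "meeting_user_needs": 0.20,         # reported as second most important
    "other_factors": 0.55,              # remaining five factors, pooled here
}
scores = {"organizational_motivation": 0.8,
          "meeting_user_needs": 0.6,
          "other_factors": 0.5}
print(round(readiness_score(scores, weights), 3))  # 0.595
```

In the actual study the weights themselves were elicited from 410 experts via self-explicated and conjoint-analysis tasks rather than assigned by hand.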

  3. Developing R&D Portfolio Business Validity Simulation Model and System

    Directory of Open Access Journals (Sweden)

    Hyun Jin Yeo

    2015-01-01

Full Text Available R&D has been recognized as a critical means of gaining competitiveness, by nations as well as companies, through the value it creates, such as patent value and new products. R&D is therefore a burden on decision makers, who must decide how much money to invest, how much time to spend, and what technology to develop, since it consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors have received insufficient attention, because almost all previous studies are technology-oriented evaluations based on a single R&D technology. We earlier proposed an R&D business-aspect evaluation model consisting of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view and clarify the default and control parameters in each evaluation module, facilitating the evaluator's business validity work by integrating everything into one screen.

  4. Developing R&D portfolio business validity simulation model and system.

    Science.gov (United States)

    Yeo, Hyun Jin; Im, Kwang Hyuk

    2015-01-01

R&D has been recognized as a critical means of gaining competitiveness, by nations as well as companies, through the value it creates, such as patent value and new products. R&D is therefore a burden on decision makers, who must decide how much money to invest, how much time to spend, and what technology to develop, since it consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors have received insufficient attention, because almost all previous studies are technology-oriented evaluations based on a single R&D technology. We earlier proposed an R&D business-aspect evaluation model consisting of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view and clarify the default and control parameters in each evaluation module, facilitating the evaluator's business validity work by integrating everything into one screen.

  5. Developing R&D Portfolio Business Validity Simulation Model and System

    Science.gov (United States)

    2015-01-01

R&D has been recognized as a critical means of gaining competitiveness, by nations as well as companies, through the value it creates, such as patent value and new products. R&D is therefore a burden on decision makers, who must decide how much money to invest, how much time to spend, and what technology to develop, since it consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors have received insufficient attention, because almost all previous studies are technology-oriented evaluations based on a single R&D technology. We earlier proposed an R&D business-aspect evaluation model consisting of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view and clarify the default and control parameters in each evaluation module, facilitating the evaluator's business validity work by integrating everything into one screen. PMID:25893209

  6. MATSIM -The Development and Validation of a Numerical Voxel Model based on the MATROSHKA Phantom

    Science.gov (United States)

    Beck, Peter; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Latocha, Marcin; Vana, Norbert; Zechner, Andrea; Reitz, Guenther

The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center. The aim of the project is to develop a voxel-based model of the MATROSHKA anthropomorphic torso used at the International Space Station (ISS) as a foundation for performing Monte Carlo high-energy particle transport simulations for different irradiation conditions. Funded by the Austrian Space Applications Programme (ASAP), MATSIM is a co-investigation with the European Space Agency (ESA) ELIPS project MATROSHKA, an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. The MATROSHKA facility is designed to determine the radiation exposure of an astronaut onboard the ISS, especially during an extravehicular activity. The numerical model developed in the frame of MATSIM is validated by reference measurements. In this report we give an overview of the model development and compare photon and neutron irradiations of the detector-equipped phantom torso with Monte Carlo simulations using FLUKA. Exposure to Co-60 photons was realized in the standard irradiation laboratory at Seibersdorf, while investigations with neutrons were performed at the thermal column of the Vienna TRIGA Mark-II reactor. The phantom was loaded with passive thermoluminescence dosimeters. In addition, first results of the calculated dose distribution within the torso are presented for a simulated exposure in low-Earth orbit.

  7. Multivariable prediction model for suspected giant cell arteritis: development and validation

    Directory of Open Access Journals (Sweden)

    Ing EB

    2017-11-01

Full Text Available Edsel B Ing,1 Gabriela Lahaie Luna,2 Andrew Toren,3 Royce Ing,4 John J Chen,5 Nitika Arora,6 Nurhan Torun,7 Otana A Jakpor,8 J Alexander Fraser,9 Felix J Tyndel,10 Arun NE Sundaram,10 Xinyang Liu,11 Cindy TY Lam,1 Vivek Patel,12 Ezekiel Weis,13 David Jordan,14 Steven Gilberg,14 Christian Pagnoux,15 Martin ten Hove2; 1Department of Ophthalmology and Vision Sciences, University of Toronto Medical School, Toronto, 2Department of Ophthalmology, Queen’s University, Kingston, ON, 3Department of Ophthalmology, University of Laval, Quebec, QC, 4Toronto Eyelid, Strabismus and Orbit Surgery Clinic, Toronto, ON, Canada; 5Mayo Clinic, Department of Ophthalmology and Neurology, 6Mayo Clinic, Department of Ophthalmology, Rochester, MN, 7Department of Surgery, Division of Ophthalmology, Harvard Medical School, Boston, MA, 8Harvard Medical School, Boston, MA, USA; 9Department of Clinical Neurological Sciences and Ophthalmology, Western University, London, 10Department of Medicine, University of Toronto Medical School, Toronto, ON, Canada; 11Department of Medicine, Fudan University Shanghai Medical College, Shanghai, People’s Republic of China; 12Roski Eye Institute, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; 13Departments of Ophthalmology, Universities of Alberta and Calgary, Edmonton and Calgary, AB, 14Department of Ophthalmology, University of Ottawa, Ottawa, ON, 15Vasculitis Clinic, Mount Sinai Hospital, Toronto, ON, Canada. Purpose: To develop and validate a diagnostic prediction model for patients with suspected giant cell arteritis (GCA). Methods: A retrospective review of records of consecutive adult patients undergoing temporal artery biopsy (TABx) for suspected GCA was conducted at seven university centers. The pathologic diagnosis was considered the final diagnosis. The predictor variables were age, gender, new-onset headache, clinical temporal artery abnormality, jaw claudication, ischemic vision loss (VL), diplopia

  8. Development and validation of a CFD model predicting the backfill process of a nuclear waste gallery

    International Nuclear Information System (INIS)

    Gopala, Vinay Ramohalli; Lycklama a Nijeholt, Jan-Aiso; Bakker, Paul; Haverkate, Benno

    2011-01-01

Research highlights: → This work presents a CFD simulation of the backfill process for Supercontainers with nuclear waste emplaced in a disposal gallery. → The cement-based material used for backfill is grout, and the flow of grout is modelled as a Bingham fluid. → The model is verified against an analytical solution and validated against flowability tests for concrete. → Comparison between a backfill plexiglas experiment and the simulation shows a distinct difference in the filling pattern. → The numerical model needs to be further developed to include segregation effects and the thixotropic behavior of grout. - Abstract: Nuclear waste material may be stored in underground tunnels for long-term storage. The example treated in this article is based on the current Belgian disposal concept for High-Level Waste (HLW), in which the nuclear waste material is packed in concrete-shielded packages, called Supercontainers, which are inserted into these tunnels. After placement of the packages in the underground tunnels, the remaining voids between the packages and the tunnel lining are filled with a cement-based material called grout in order to encase the stored containers in the underground spacing. This encasement of the stored containers inside the tunnels is known as the backfill process. A good backfill process is necessary to stabilize the waste gallery against ground settlements. A numerical model simulating the backfill process can help to improve and optimize the process by ensuring a homogeneous filling with no air voids and by optimizing the injection positions. The objective of the present work is to develop such a numerical code that predicts the backfill process well and to validate the model against the available experiments and analytical solutions. In the present work the rheology of grout is modelled as that of a Bingham fluid, which is implemented in OpenFOAM - a finite volume-based open source computational fluid
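The Bingham rheology mentioned in the highlights can be illustrated with a short sketch. The regularization used here (Papanastasiou-style exponential smoothing of the yield stress) and all parameter values are assumptions for illustration, not necessarily the authors' implementation in OpenFOAM:

```python
import math

# Regularized Bingham model for grout: below the yield stress tau_y the
# material behaves as an (almost) rigid plug, above it as a viscous fluid.
# The exponential regularization and all parameter values are hypothetical.

def effective_viscosity(gamma_dot, mu_p=0.05, tau_y=10.0, m=1000.0):
    """Apparent viscosity [Pa s] at shear rate gamma_dot [1/s].

    mu_p: plastic viscosity, tau_y: yield stress, m: regularization parameter.
    """
    if gamma_dot == 0.0:
        return mu_p + tau_y * m  # zero-shear limit of the regularized expression
    return mu_p + tau_y * (1.0 - math.exp(-m * gamma_dot)) / gamma_dot

# At high shear the fluid is nearly Newtonian (viscosity -> mu_p + tau_y/gamma_dot);
# at low shear the viscosity grows very large, mimicking the unyielded plug.
print(round(effective_viscosity(100.0), 3))  # 0.15
```

This kind of apparent-viscosity closure is what a finite-volume solver evaluates cell by cell when the momentum equation is solved for the grout flow.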

  9. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

In the philosophy of science, interest in computational models and simulations has grown considerably over the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety...

  10. Anatomical Cystocele Recurrence: Development and Internal Validation of a Prediction Model.

    Science.gov (United States)

    Vergeldt, Tineke F M; van Kuijk, Sander M J; Notten, Kim J B; Kluivers, Kirsten B; Weemhoff, Mirjam

    2016-02-01

To develop a prediction model that estimates the risk of anatomical cystocele recurrence after surgery. The databases of two multicenter prospective cohort studies were combined, and we performed a retrospective secondary analysis of these data. Women undergoing an anterior colporrhaphy without mesh materials and without previous pelvic organ prolapse (POP) surgery filled in a questionnaire, underwent translabial three-dimensional ultrasonography, and underwent staging of POP preoperatively and postoperatively. We developed a prediction model using multivariable logistic regression and internally validated it using standard bootstrapping techniques. The performance of the prediction model was assessed by computing indices of overall performance, discriminative ability, and calibration, and its clinical utility by computing test characteristics. Of 287 included women, 149 (51.9%) had anatomical cystocele recurrence. Factors included in the prediction model were assisted delivery, preoperative cystocele stage, number of compartments involved, major levator ani muscle defects, and levator hiatal area during Valsalva. Potential predictors that were excluded after backward elimination because of high P values were age, body mass index, number of vaginal deliveries, and family history of POP. The shrinkage factor resulting from the bootstrap procedure was 0.91. After correction for optimism, Nagelkerke's R{sup 2} and the Brier score were 0.15 and 0.22, respectively. This indicates satisfactory model fit. The area under the receiver operating characteristic curve of the prediction model was 71.6% (95% confidence interval 65.7-77.5). After correction for optimism, the area under the receiver operating characteristic curve was 69.7%. This prediction model, including history of assisted delivery, preoperative stage, number of compartments, levator defects, and levator hiatus, estimates the risk of anatomical cystocele recurrence.
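The bootstrap optimism correction used for internal validation above follows a generic recipe: refit the model on each bootstrap resample, measure how much better it performs on the resample than on the original data, and subtract the averaged difference from the apparent performance. A minimal sketch, using a toy univariate scoring rule rather than the authors' multivariable logistic model:

```python
import random

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney formulation."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def fit(xs, ys):
    """Toy univariate 'model': the score is x, with its direction learned from the data."""
    pos_mean = sum(x for x, y in zip(xs, ys) if y == 1) / max(1, sum(ys))
    neg_mean = sum(x for x, y in zip(xs, ys) if y == 0) / max(1, len(ys) - sum(ys))
    sign = 1.0 if pos_mean >= neg_mean else -1.0
    return lambda x: sign * x

def optimism_corrected_auc(xs, ys, n_boot=200, seed=0):
    """Apparent AUC minus the average bootstrap optimism."""
    rng = random.Random(seed)
    apparent = auc([fit(xs, ys)(x) for x in xs], ys)
    optimism = 0.0
    n = len(xs)
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        bx = [xs[i] for i in idx]
        by = [ys[i] for i in idx]
        if len(set(by)) < 2:  # degenerate resample with one class: skip
            continue
        m = fit(bx, by)
        boot = auc([m(x) for x in bx], by)   # performance on the resample
        test = auc([m(x) for x in xs], ys)   # performance on the original data
        optimism += (boot - test) / n_boot
    return apparent - optimism

xs = [1.0, 2.0, 2.5, 3.0, 3.5, 4.0, 5.0, 6.0]
ys = [0, 0, 1, 0, 0, 1, 1, 1]
print(round(optimism_corrected_auc(xs, ys), 3))
```

The same idea, applied to a logistic model's AUC and coefficients, yields the optimism-corrected AUC (69.7%) and the shrinkage factor (0.91) reported in the abstract.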

  11. Development and validation of a mathematical model for growth of pathogens in cut melons.

    Science.gov (United States)

    Li, Di; Friedrich, Loretta M; Danyluk, Michelle D; Harris, Linda J; Schaffner, Donald W

    2013-06-01

Many outbreaks of foodborne illness associated with the consumption of fresh-cut melons have been reported. The objective of our research was to develop a mathematical model that predicts the growth rate of Salmonella on fresh-cut cantaloupe over a range of storage temperatures and to validate that model using Salmonella and Escherichia coli O157:H7 on cantaloupe, honeydew, and watermelon, using both new data and data from published studies. The growth of Salmonella on honeydew and watermelon and of E. coli O157:H7 on cantaloupe, honeydew, and watermelon was monitored at temperatures of 4 to 25°C. The Ratkowsky (or square-root) model was used to describe Salmonella growth on cantaloupe as a function of storage temperature. Our results show that the level of Salmonella on fresh-cut cantaloupe with an initial load of 3 log CFU/g can exceed 7 log CFU/g at 25°C within 24 h. No growth was observed at 4°C. A linear correlation was observed between the square root of the Salmonella growth rate and temperature, such that √growth rate = 0.026 × (T - 5.613), R{sup 2} = 0.9779. The model was generally suitable for predicting the growth of both Salmonella and E. coli O157:H7 on cantaloupe, honeydew, and watermelon, for both new data and data from the published literature. When compared with existing models for the growth of Salmonella, the new model predicts a theoretical minimum growth temperature similar to those of the ComBase Predictive Models and Pathogen Modeling Program models but lower than those of other food-specific models. The ComBase Predictive Models results are very similar to those of the model developed in this study. Our research confirms that Salmonella can grow quickly and reach high concentrations when cut cantaloupe is stored at ambient temperatures, without visual signs of spoilage. Our model provides a fast and cost-effective method to estimate the effects of storage temperature on fresh-cut melon safety and could also be used in subsequent quantitative microbial risk
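The fitted square-root relation quoted above can be applied directly. The sketch below encodes the reported coefficients (slope 0.026, theoretical minimum growth temperature 5.613°C); treating the rate as zero below the minimum temperature is an assumption consistent with the observed lack of growth at 4°C:

```python
# Square-root (Ratkowsky) model with the coefficients reported above:
# sqrt(growth rate) = 0.026 * (T - 5.613).  Returning zero below the
# theoretical minimum temperature is an assumption consistent with the
# reported absence of growth at 4 deg C.

B, T0 = 0.026, 5.613  # fitted slope and theoretical minimum growth temperature (deg C)

def growth_rate(temp_c):
    """Predicted Salmonella growth rate on cut cantaloupe at temp_c (deg C)."""
    if temp_c <= T0:
        return 0.0
    return (B * (temp_c - T0)) ** 2

print(round(growth_rate(25.0), 4))  # 0.2541
print(growth_rate(4.0))             # 0.0
```

Squaring the linear fit recovers the growth rate itself, which is why the predicted rate rises sharply as storage temperature moves further above the minimum.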

  12. Development and validation of outcome prediction models for aneurysmal subarachnoid haemorrhage : The SAHIT multinational cohort study

    NARCIS (Netherlands)

    Jaja, Blessing N R; Saposnik, Gustavo; Lingsma, Hester F.; Macdonald, Erin; Thorpe, Kevin E.; Mamdani, Muhammed; Steyerberg, Ewout W.; Molyneux, Andrew; Manoel, Airton Leonardo De Oliveira; Schatlo, Bawarjan; Hanggi, Daniel; Hasan, David M.; Wong, George K C; Etminan, Nima; Fukuda, Hitoshi; Torner, James C.; Schaller, Karl L.; Suarez, Jose I.; Stienen, Martin N.; Vergouwen, Mervyn D.I.; Rinkel, Gabriel J.E.; Spears, Julian; Cusimano, Michael D.; Todd, Michael; Le Roux, Peter; Kirkpatrick, Peter J.; Pickard, John; Van Den Bergh, Walter M.; Murray, Gordon D; Johnston, S. Claiborne; Yamagata, Sen; Mayer, Stephan A.; Schweizer, Tom A.; Macdonald, R. Loch

    2018-01-01

    Objective To develop and validate a set of practical prediction tools that reliably estimate the outcome of subarachnoid haemorrhage from ruptured intracranial aneurysms (SAH). Design Cohort study with logistic regression analysis to combine predictors and treatment modality. Setting Subarachnoid

  13. Physical validation issue of the NEPTUNE two-phase modelling: validation plan to be adopted, experimental programs to be set up and associated instrumentation techniques developed

    International Nuclear Information System (INIS)

    Pierre Peturaud; Eric Hervieu

    2005-01-01

Full text of publication follows: A long-term joint development program for the next generation of nuclear reactor simulation tools was launched in 2001 by EDF (Electricite de France) and CEA (Commissariat a l'Energie Atomique). The NEPTUNE Project constitutes the thermal-hydraulics part of this comprehensive program. Alongside the ongoing development of this new two-phase flow software platform, the physical validation of the modelling involved is a crucial issue at every modelling scale, and the present paper deals with this issue. After a brief recall of the NEPTUNE platform, the general validation strategy to be adopted is clarified by means of three major features: (i) physical validation in close connection with the concerned industrial applications, (ii) involving (as far as possible) a two-step process that successively focuses on dominant separate models and then assesses the whole modelling capability, (iii) based on the use of data relevant to the validation aims. Building on this general validation process, a four-step generic work approach has been defined; it includes: (i) a thorough analysis of the concerned industrial applications to identify the key physical phenomena involved and the associated dominant basic models, (ii) an assessment of these models against the available validation information, to specify the additional validation needs and define dedicated validation plans, (iii) an inventory and assessment of existing validation data (with respect to the requirements specified in the previous task) to identify the actual needs for new validation data, (iv) the specification of the new experimental programs to be set up to provide the needed new data. This work approach has been applied to the NEPTUNE software, focusing on 8 high-priority industrial applications, and it has resulted in the definition of (i) the validation plan and experimental programs to be set up for the open medium 3D modelling

  14. Performance of a pavement solar energy collector: Model development and validation

    International Nuclear Information System (INIS)

    Guldentops, Gert; Nejad, Alireza Mahdavi; Vuye, Cedric; Van den bergh, Wim; Rahbar, Nima

    2016-01-01

Highlights: • A novel numerical model is developed that predicts the thermal behavior of a pavement solar collector. • A parametric study is conducted on the sensitivity of the system to changes in design parameters. • A new methodology is developed to perform a long-term performance analysis of the system. - Abstract: Current aims regarding environmental protection, such as the reduction of fossil fuel consumption and greenhouse gas emissions, require the development of new technologies. These new technologies enable the production of renewable energy, which is both cleaner and more abundant than energy from fossil fuels. This necessity encourages researchers to develop new ways to capture solar energy and, if possible, store it for later use. In this paper, the Pavement Solar Collector (PSC), and its use to extract low-temperature thermal energy, is studied. Such a system, which harvests energy by flowing water through a heat exchanger embedded in the pavement structure, can have a significant energy output, since pavement materials tend to absorb large amounts of solar radiation. The main objective of this paper is to develop a modeling framework for the PSC system and validate it with a self-instructed experiment. Such a model allows for a detailed parametric study of the system to optimize the design, as well as an investigation of the effect of aging (e.g. decreasing solar absorptivity) on the performance of the system. A long-term energy output of the system, which is currently lacking, is calculated based on the results of the study of weather parameters. This newly acquired data could be the start of a comprehensive data set on the performance of a PSC, leading to a comprehensive feasibility study of the system.

  15. Towards the development of improved tests for negative symptoms of schizophrenia in a validated animal model.

    Science.gov (United States)

    Sahin, Ceren; Doostdar, Nazanin; Neill, Joanna C

    2016-10-01

    Negative symptoms in schizophrenia remain an unmet clinical need. There is no licensed treatment specifically for this debilitating aspect of the disorder and effect sizes of new therapies are too small to make an impact on quality of life and function. Negative symptoms are multifactorial but often considered in terms of two domains, expressive deficit incorporating blunted affect and poverty of speech and avolition incorporating asociality and lack of drive. There is a clear need for improved understanding of the neurobiology of negative symptoms which can be enabled through the use of carefully validated animal models. While there are several tests for assessing sociability in animals, tests for blunted affect in schizophrenia are currently lacking. Two paradigms have recently been developed for assessing negative affect of relevance to depression in rats. Here we assess their utility for studying negative symptoms in schizophrenia using our well validated model for schizophrenia of sub-chronic (sc) treatment with Phencyclidine (PCP) in adult female rats. Results demonstrate that sc PCP treatment produces a significant negative affect bias in response to a high value reward in the optimistic and affective bias tests. Our results are not easily explained by the known cognitive deficits induced by sc PCP and support the hypothesis of a negative affective bias in this model. We suggest that further refinement of these two tests will provide a means to investigate the neurobiological basis of negative affect in schizophrenia, thus supporting the assessment of efficacy of new targets for this currently untreated symptom domain. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Assessment of the Value, Impact, and Validity of the Jobs and Economic Development Impacts (JEDI) Suite of Models

    Energy Technology Data Exchange (ETDEWEB)

    Billman, L.; Keyser, D.

    2013-08-01

    The Jobs and Economic Development Impacts (JEDI) models, developed by the National Renewable Energy Laboratory (NREL) for the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), use input-output methodology to estimate gross (not net) jobs and economic impacts of building and operating selected types of renewable electricity generation and fuel plants. This analysis provides the DOE with an assessment of the value, impact, and validity of the JEDI suite of models. While the models produce estimates of jobs, earnings, and economic output, this analysis focuses only on jobs estimates. This validation report includes an introduction to JEDI models, an analysis of the value and impact of the JEDI models, and an analysis of the validity of job estimates generated by JEDI model through comparison to other modeled estimates and comparison to empirical, observed jobs data as reported or estimated for a commercial project, a state, or a region.
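The input-output methodology behind JEDI-style gross job estimates reduces, at its simplest, to multiplying project expenditures by sector employment multipliers. The spending figures and multipliers below are invented for illustration and are not JEDI values:

```python
# Toy sketch of the input-output logic behind JEDI-style gross job estimates:
# jobs = spending by sector x employment multipliers per sector.
# All figures below are invented for illustration; they are not JEDI values.

spending = {"construction": 10.0, "equipment": 25.0}        # $ millions, hypothetical
jobs_per_million = {"construction": 8.0, "equipment": 3.0}  # job-years per $M, hypothetical

gross_jobs = sum(spending[s] * jobs_per_million[s] for s in spending)
print(gross_jobs)  # 155.0 gross job-years (not net of displaced jobs)
```

As the abstract notes, such estimates are gross rather than net: they do not subtract jobs displaced elsewhere in the economy, which is one reason modeled figures are compared against empirical project data.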

  17. Development and validation of spray models for investigating diesel engine combustion and emissions

    Science.gov (United States)

    Som, Sibendu

Diesel engines intrinsically generate NOx and particulate matter, which need to be reduced significantly in order to comply with increasingly stringent regulations worldwide. This motivates diesel engine manufacturers to gain a fundamental understanding of the spray and combustion processes so as to optimize them and reduce engine emissions. Strategies being investigated to reduce engines' raw emissions include advancements in fuel injection systems, efficient nozzle orifice design, injection and combustion control strategies, exhaust gas recirculation, and the use of alternative fuels such as biodiesel. This thesis explores several of these approaches (nozzle orifice design, injection control strategy, and biodiesel use) by performing computer modeling of diesel engine processes. Fuel atomization characteristics are known to have a significant effect on the combustion and emission processes in diesel engines. Primary fuel atomization is induced by aerodynamics in the near-nozzle region as well as by cavitation and turbulence from the injector nozzle. The breakup models currently used in diesel engine simulations generally consider aerodynamically induced breakup using the Kelvin-Helmholtz (KH) instability model but do not account for inner-nozzle flow effects. An improved primary breakup (KH-ACT) model incorporating cavitation and turbulence effects along with aerodynamically induced breakup is developed and incorporated in the computational fluid dynamics code CONVERGE. The spray simulations using the KH-ACT model are "quasi-dynamically" coupled with inner-nozzle flow computations (using FLUENT). This presents a novel tool to capture the influence of inner-nozzle flow effects such as cavitation and turbulence on spray, combustion, and emission processes. Extensive validation is performed against the non-evaporating spray data from Argonne National Laboratory. The performance of the KH and KH-ACT models is compared against the evaporating and

  18. Development and validation of prediction models for endometrial cancer in postmenopausal bleeding.

    Science.gov (United States)

    Wong, Alyssa Sze-Wai; Cheung, Chun Wai; Fung, Linda Wen-Ying; Lao, Terence Tzu-Hsi; Mol, Ben Willem J; Sahota, Daljit Singh

    2016-08-01

To develop and assess the accuracy of risk prediction models for diagnosing endometrial cancer in women with postmenopausal bleeding (PMB). A retrospective cohort study of 4383 women in a one-stop PMB clinic at a university teaching hospital in Hong Kong. Clinical risk factors, transvaginal ultrasonic measurement of endometrial thickness (ET), and endometrial histology were obtained from consecutive women between 2002 and 2013. Two models to predict the risk of endometrial cancer were developed and assessed: one based on patient characteristics alone, and a second incorporating ET with patient characteristics. Endometrial histology was used as the reference standard. Split-sample internal validation and bootstrapping techniques were adopted. The optimal threshold for prediction of endometrial cancer by the final models was determined using a receiver-operating characteristic (ROC) curve and Youden's index. The diagnostic gain was compared to a reference strategy of measuring ET only by comparing the AUCs using the DeLong test. Of the 4383 women with PMB, 168 (3.8%) were diagnosed with endometrial cancer. ET alone had an area under the curve (AUC) of 0.92 (95% confidence interval [CI] 0.89-0.94). In the patient-characteristics-only model, independent predictors of cancer were age at presentation, age at menopause, body mass index, nulliparity, and recurrent vaginal bleeding. The AUC and Youden's index of this model were, respectively, 0.73 (95% CI 0.67-0.80) and 0.72 (sensitivity=66.5%; specificity=68.9%; +ve LR=2.14; -ve LR=0.49). ET, age at presentation, nulliparity, and recurrent vaginal bleeding were independent predictors in the patient-characteristics-plus-ET model. The AUC and Youden's index of that model were, respectively, 0.92 (95% CI 0.88-0.96) and 0.71 (sensitivity=82.7%; specificity=88.3%; +ve LR=6.38; -ve LR=0.2). Comparison of the AUCs indicated that a history-alone model was inferior to a model using ET alone
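The Youden-index threshold selection described above maximizes J = sensitivity + specificity - 1 over candidate cut-offs on the ROC curve. A minimal sketch with toy scores and labels (not the study's data):

```python
# Choosing a cut-off by Youden's index, J = sensitivity + specificity - 1.
# Scores and labels are toy values for illustration.

def youden_threshold(scores, labels):
    """Return (threshold, J) maximizing Youden's index; score >= threshold is 'positive'."""
    best_j, best_t = -1.0, None
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= t)
        fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < t)
        tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < t)
        fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= t)
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

scores = [0.1, 0.2, 0.35, 0.4, 0.8, 0.9]
labels = [0, 0, 1, 0, 1, 1]
t, j = youden_threshold(scores, labels)
print(t)  # 0.35
```

Applied to a model's predicted probabilities, the threshold that maximizes J yields the sensitivity/specificity pairs reported in the abstract.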

  19. Development and validation of advanced theoretical modeling for churn-turbulent flows and subsequent transitions

    Energy Technology Data Exchange (ETDEWEB)

    Montoya Zabala, Gustavo Adolfo

    2015-07-01

The applicability of CFD codes to two-phase flows has always been limited to special cases due to the very complex nature of the two-phase interface. Owing to their tremendous computational cost, methods based on direct resolution of the interface are not applicable to most problems of practical relevance. Instead, averaging procedures such as the Eulerian-Eulerian approach are commonly used for these applications, which necessarily means losing detailed information on the interfacial structure. To allow widespread application of the two-fluid approach, closure models are required to reintroduce the correct interfacial mass, momentum, and heat transfer into the simulations. Such closure models evidently depend strongly on the specific flow pattern. In vertical pipe flow with low gas volume flow rates, bubbly flow occurs. With increasing gas volume flow rates, larger bubbles are generated by bubble coalescence, leading to transitions to slug, churn-turbulent, and annular flow. Considering, as an example, a heated tube producing steam by evaporation, as in a vertical steam generator, all of these flow patterns, including the transitions, are expected to occur in the system. Despite extensive attempts, robust and accurate simulation approaches for such conditions are still lacking. The purpose of this dissertation is the development, testing, and validation of a multifield model for adiabatic gas-liquid flows at high gas volume fractions, in which a multiple-size bubble approach has been implemented by separating the gas structures into a specified number of groups, each representing a prescribed range of sizes. A fully resolved continuous gas phase is also computed, representing all gas structures large enough to be resolved within the computational mesh. The concept, known as GENeralized TwO Phase flow or GENTOP, is formulated as an extension to the bubble population balance approach known as the

  20. Development and validation of advanced theoretical modeling for churn-turbulent flows and subsequent transitions

    International Nuclear Information System (INIS)

    Montoya Zabala, Gustavo Adolfo

    2015-01-01

    The applicability of CFD codes for two-phase flows has always been limited to special cases due to the very complex nature of the interface. Due to their tremendous computational cost, methods based on direct resolution of the interface are not applicable to most problems of practical relevance. Instead, averaging procedures are commonly used for these applications, such as the Eulerian-Eulerian approach, which necessarily means losing detailed information on the interfacial structure. In order to allow widespread application of the two-fluid approach, closure models are required to reintroduce in the simulations the correct interfacial mass, momentum, and heat transfer. It is evident that such closure models will strongly depend on the specific flow pattern. When considering vertical pipe flow with low gas volume flow rates, bubbly flow occurs. With increasing gas volume flow rates larger bubbles are generated by bubble coalescence, which further leads to transition to slug, churn-turbulent, and annular flow. Considering, as an example, a heated tube producing steam by evaporation, as in the case of a vertical steam generator, all these flow patterns including transitions are expected to occur in the system. Despite extensive attempts, robust and accurate simulation approaches for such conditions are still lacking. The purpose of this dissertation is the development, testing, and validation of a multifield model for adiabatic gas-liquid flows at high gas volume fractions, for which a multiple-size bubble approach has been implemented by separating the gas structures into a specified number of groups, each of which represents a prescribed range of sizes. A fully-resolved continuous gas phase is also computed, and represents all the gas structures which are large enough to be resolved within the computational mesh. The concept, known as GENeralized TwO Phase flow or GENTOP, is formulated as an extension to the bubble population balance approach known as the

  1. Predicting the 6-month risk of severe hypoglycemia among adults with diabetes: Development and external validation of a prediction model.

    Science.gov (United States)

    Schroeder, Emily B; Xu, Stan; Goodrich, Glenn K; Nichols, Gregory A; O'Connor, Patrick J; Steiner, John F

    2017-07-01

    To develop and externally validate a prediction model for the 6-month risk of a severe hypoglycemic event among individuals with pharmacologically treated diabetes. The development cohort consisted of 31,674 Kaiser Permanente Colorado members with pharmacologically treated diabetes (2007-2015). The validation cohorts consisted of 38,764 Kaiser Permanente Northwest members and 12,035 HealthPartners members. Variables were chosen that would be available in electronic health records. We developed 16-variable and 6-variable models, using a Cox counting model process that allows for the inclusion of multiple 6-month observation periods per person. Across the three cohorts, there were 850,992 6-month observation periods, and 10,448 periods with at least one severe hypoglycemic event. The six-variable model contained age, diabetes type, HgbA1c, eGFR, history of a hypoglycemic event in the prior year, and insulin use. Both prediction models performed well, with good calibration and c-statistics of 0.84 and 0.81 for the 16-variable and 6-variable models, respectively. In the external validation cohorts, the c-statistics were 0.80-0.84. We developed and validated two prediction models for predicting the 6-month risk of hypoglycemia. The 16-variable model had slightly better performance than the 6-variable model, but in some practice settings, use of the simpler model may be preferred. Copyright © 2017 Elsevier Inc. All rights reserved.
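The c-statistics reported above (0.84, 0.81, and 0.80-0.84 on external validation) measure concordance between predicted risks and observed events. For a binary outcome this is the standard pairwise definition, which can be sketched in a few lines of plain Python (illustrative only, not the authors' code):

```python
def c_statistic(risks, events):
    """Concordance index for a binary outcome: the fraction of
    (event, non-event) pairs in which the event case received the
    higher predicted risk; ties count as 0.5."""
    cases = [r for r, e in zip(risks, events) if e]
    controls = [r for r, e in zip(risks, events) if not e]
    concordant = 0.0
    for rc in cases:
        for rn in controls:
            if rc > rn:
                concordant += 1.0
            elif rc == rn:
                concordant += 0.5
    return concordant / (len(cases) * len(controls))

# Toy check: three of four (event, non-event) pairs are ranked correctly.
print(c_statistic([0.9, 0.8, 0.3, 0.2], [1, 0, 1, 0]))  # → 0.75
```

A value of 0.5 corresponds to random ranking and 1.0 to perfect discrimination, which is why the 0.80-0.84 range reported here indicates good model performance.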

  2. Development and Validation of a Model to Determine Risk of Progression of Barrett's Esophagus to Neoplasia.

    Science.gov (United States)

    Parasa, Sravanthi; Vennalaganti, Sreekar; Gaddam, Srinivas; Vennalaganti, Prashanth; Young, Patrick; Gupta, Neil; Thota, Prashanthi; Cash, Brooks; Mathur, Sharad; Sampliner, Richard; Moawad, Fouad; Lieberman, David; Bansal, Ajay; Kennedy, Kevin F; Vargo, John; Falk, Gary; Spaander, Manon; Bruno, Marco; Sharma, Prateek

    2018-04-01

    A system is needed to determine the risk of patients with Barrett's esophagus (BE) for progression to high-grade dysplasia (HGD) and esophageal adenocarcinoma (EAC). We developed and validated a model to determine risk of progression to HGD or EAC in patients with BE, based on demographic data and endoscopic and histologic findings at the time of index endoscopy. We performed a longitudinal study of patients with BE at 5 centers in the United States and 1 center in the Netherlands enrolled in the Barrett's Esophagus Study database from 1985 through 2014. Patients were excluded from the analysis if they had less than 1 year of follow-up, were diagnosed with HGD or EAC within the past year, were missing baseline histologic data, or had no intestinal metaplasia. Seventy percent of the patients were used to derive the model and 30% were used for the validation study. The primary outcome was development of HGD or EAC during the follow-up period (median, 5.9 years). Survival analysis was performed using the Kaplan-Meier method. We assigned a specific number of points to each BE risk factor, and point totals (scores) were used to create categories of low, intermediate, and high risk. We used Cox regression to compute hazard ratios and 95% confidence intervals to determine associations between risk of progression and scores. Of 4584 patients in the database, 2697 were included in our analysis (84.1% men; 87.6% Caucasian; mean age, 55.4 ± 20.1 years; mean body mass index, 27.9 ± 5.5 kg/m2; mean length of BE, 3.7 ± 3.2 cm). During the follow-up period, 154 patients (5.7%) developed HGD or EAC, with an annual rate of progression of 0.95%. Male sex, smoking, length of BE, and baseline-confirmed low-grade dysplasia were significantly associated with progression. Scores assigned identified patients with BE that progressed to HGD or EAC with a c-statistic of 0.76 (95% confidence interval, 0.72-0.80; P Esophagus score) based on male sex, smoking, length of BE, and baseline low-grade dysplasia

  3. 3D vs 2D laparoscopic systems: Development of a performance quantitative validation model.

    Science.gov (United States)

    Ghedi, Andrea; Donarini, Erica; Lamera, Roberta; Sgroi, Giovanni; Turati, Luca; Ercole, Cesare

    2015-01-01

    The new technology ensures 3D laparoscopic vision by adding depth to the traditional two dimensions. This realistic vision gives the surgeon the feeling of operating in real space. The Hospital of Treviglio-Caravaggio is not a university or scientific institution; in 2014 a new 3D laparoscopic technology was acquired, which led to an evaluation of its appropriateness in terms of patient outcome and safety. The project aims at developing a quantitative validation model that ensures low cost and a reliable measure of the performance of 3D technology versus 2D mode. In addition, it aims at demonstrating how new technologies, such as open source hardware and software and 3D printing, can help research with no significant cost increase. For these reasons, in order to define criteria of appropriateness in the use of 3D technologies, it was decided to perform a study to technically validate which system, 3D laparoscopic vision or traditional 2D, is best in terms of effectiveness, efficiency and safety. 30 surgeons were enrolled to perform an exercise using laparoscopic forceps inside a trainer. In the exercise, surgeons of different levels of seniority, grouped by specialization (e.g. surgery, urology, gynecology), performed videolaparoscopy with the two technologies (2D and 3D) on an anthropometric phantom. The target assigned to the surgeon was to pass "needle and thread" without touching the metal part in the shortest time possible. The rings selected for the exercise each had a coefficient of difficulty determined by depth, diameter, and angle from the positioning and the point of view. Analysis of the data collected from the above exercise mathematically confirmed that the 3D technique ensures a shorter learning curve in novices and greater accuracy in performing the task compared with 2D.

  4. An inverse radiation model for optical determination of temperature and species concentration: Development and validation

    DEFF Research Database (Denmark)

    Ren, Tao; Modest, Michael F.; Fateev, Alexander

    2015-01-01

    2010 (Rothman et al. (2010) [1]), which contains line-by-line (LBL) information for several combustion gas species, such as CO2 and H2O, was used to predict gas spectral transmissivities. The model was validated by retrieving temperatures and species concentrations from experimental CO2 and H2O...

  5. Two-phase 1D+1D model of a DMFC: development and validation on extensive operating conditions range

    Energy Technology Data Exchange (ETDEWEB)

    Casalegno, A.; Marchesi, R.; Parenti, D. [Dipartimento di Energetica, Politecnico di Milano (Italy)

    2008-02-15

    A two-phase 1D+1D model of a direct methanol fuel cell (DMFC) is developed, considering overall mass balance, methanol transport in gas phase through anode diffusion layer, methanol and water crossover. The model is quantitatively validated on an extensive range of operating conditions, 24 polarisation curves. The model accurately reproduces DMFC performance in the validation range and, outside this, it is able to predict values under feasible operating conditions. Finally, the estimations of methanol crossover flux are qualitatively and quantitatively similar to experimental measures and the main local quantities' trends are coherent with results obtained with more complex models. (Abstract Copyright [2008], Wiley Periodicals, Inc.)

  6. Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery

    Science.gov (United States)

    Kwoh, L. K.; Huang, X.; Tan, W. J.

    2012-07-01

    XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera IRIS with three spectral bands - 0.52~0.60 µm for Green, 0.63~0.69 µm for Red and 0.76~0.89 µm for NIR at 12 m resolution. In the design of the IRIS camera, the three bands were acquired by three lines of CCDs (NIR, Red and Green). These CCDs were physically separated in the focal plane and their first pixels were not absolutely aligned. The micro-satellite platform was also not stable enough to allow co-registration of the 3 bands with a simple linear transformation. In the camera model developed, this platform instability was compensated with 3rd to 4th order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, as well as the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs Red and Green vs Red CCDs respectively). The cross-track alignments were 0.05 pixels and 5.9 pixels for the NIR vs Red and Green vs Red CCDs respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.

  7. Development and Validity of a Silicone Renal Tumor Model for Robotic Partial Nephrectomy Training.

    Science.gov (United States)

    Monda, Steven M; Weese, Jonathan R; Anderson, Barrett G; Vetter, Joel M; Venkatesh, Ramakrishna; Du, Kefu; Andriole, Gerald L; Figenshau, Robert S

    2018-04-01

    To provide a training tool to address the technical challenges of robot-assisted laparoscopic partial nephrectomy, we created silicone renal tumor models using 3-dimensional printed molds of a patient's kidney with a mass. In this study, we assessed the face, content, and construct validity of these models. Surgeons of different training levels completed 4 simulations on silicone renal tumor models. Participants were surveyed on the usefulness and realism of the model as a training tool. Performance was measured using operation-specific metrics, self-reported operative demands (NASA Task Load Index [NASA TLX]), and blinded expert assessment (Global Evaluative Assessment of Robotic Surgeons [GEARS]). Twenty-four participants included attending urologists, endourology fellows, urology residents, and medical students. Post-training surveys of expert participants yielded mean results of 79.2 on the realism of the model's overall feel and 90.2 on the model's overall usefulness for training. Renal artery clamp times and GEARS scores were significantly better in surgeons further in training (P ≤.005 and P ≤.025). Renal artery clamp times, preserved renal parenchyma, positive margins, NASA TLX, and GEARS scores were all found to improve across trials (P <.001, P = .025, P = .024, P ≤.020, and P ≤.006, respectively). Face, content, and construct validity were demonstrated in the use of a silicone renal tumor model in a cohort of surgeons of different training levels. Expert participants deemed the model useful and realistic. Surgeons of higher training levels performed better than less experienced surgeons in various study metrics, and improvements within individuals were observed over sequential trials. Future studies should aim to assess model predictive validity, namely, the association between model performance improvements and improvements in live surgery. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Development and validation of extensive growth and growth boundary models for psychrotolerant pseudomonads in seafood, meat and vegetable products

    DEFF Research Database (Denmark)

    Martinez Rios, Veronica; Dalgaard, Paw

    Extensive growth and growth boundary models were developed and validated for psychrotolerant pseudomonads growing in seafood, meat and vegetable products. The new models were developed by expanding an existing cardinal parameter-type model for growth of pseudomonads in milk (Martinez-Rios et al. ...), when observed and predicted μmax-values were compared. Thus, on average, μmax-values for seafood and meat products were overestimated by 14%. Additionally, the reference growth rate parameter μref25˚C was calibrated by fitting the model to 21 μmax-values in vegetable products. This resulted in a μref25˚C-value of 0.54 1/h. The calibrated vegetable model was successfully validated using 51 μmax-values for psychrotolerant pseudomonads in vegetables. Average bias and accuracy factor values of 1.24 and 1.38 were obtained, respectively. Lag time models were developed by using relative lag times from...
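The bias and accuracy factor values quoted (1.24 and 1.38) are standard validation indices in predictive microbiology: geometric-mean ratios of predicted to observed growth rates, due to Ross (1996). A minimal illustrative implementation (not taken from the paper):

```python
import math

def bias_accuracy_factors(predicted, observed):
    """Bias factor Bf = 10**mean(log10(pred/obs)) measures systematic
    over/under-prediction; accuracy factor Af = 10**mean(|log10(pred/obs)|)
    measures the average magnitude of disagreement (Ross, 1996)."""
    logs = [math.log10(p / o) for p, o in zip(predicted, observed)]
    bf = 10 ** (sum(logs) / len(logs))
    af = 10 ** (sum(abs(v) for v in logs) / len(logs))
    return bf, af

# Toy data: a two-fold over- and a two-fold under-prediction cancel in
# the bias factor but both count toward the accuracy factor.
bf, af = bias_accuracy_factors([2.0, 0.5], [1.0, 1.0])
print(round(bf, 3), round(af, 3))  # → 1.0 2.0
```

On this scale, Bf = 1.24 means predicted growth rates exceed observations by 24% on average, and Af = 1.38 means predictions differ from observations by 38% on average.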

  9. Validation Data and Model Development for Fuel Assembly Response to Seismic Loads

    International Nuclear Information System (INIS)

    Bardet, Philippe; Ricciardi, Guillaume

    2016-01-01

    Vibrations are inherently present in nuclear reactors, especially in cores and steam generators of pressurized water reactors (PWR). They can have significant effects on local heat transfer and wear and tear in the reactor and often set safety margins. The simulation of these multiphysics phenomena from first principles requires the coupling of several codes, which is one of the most challenging tasks in modern computer simulation. Here an ambitious multiphysics multidisciplinary validation campaign is conducted. It relied on an integrated team of experimentalists and code developers to acquire benchmark and validation data for fluid-structure interaction codes. Data are focused on PWR fuel bundle behavior during seismic transients.

  10. Validation Data and Model Development for Fuel Assembly Response to Seismic Loads

    Energy Technology Data Exchange (ETDEWEB)

    Bardet, Philippe [George Washington Univ., Washington, DC (United States); Ricciardi, Guillaume [Atomic Energy Commission (CEA) (France)

    2016-01-31

    Vibrations are inherently present in nuclear reactors, especially in cores and steam generators of pressurized water reactors (PWR). They can have significant effects on local heat transfer and wear and tear in the reactor and often set safety margins. The simulation of these multiphysics phenomena from first principles requires the coupling of several codes, which is one of the most challenging tasks in modern computer simulation. Here an ambitious multiphysics multidisciplinary validation campaign is conducted. It relied on an integrated team of experimentalists and code developers to acquire benchmark and validation data for fluid-structure interaction codes. Data are focused on PWR fuel bundle behavior during seismic transients.

  11. Development and validation of a mortality risk model for pediatric sepsis

    Science.gov (United States)

    Chen, Mengshi; Lu, Xiulan; Hu, Li; Liu, Pingping; Zhao, Wenjiao; Yan, Haipeng; Tang, Liang; Zhu, Yimin; Xiao, Zhenghui; Chen, Lizhang; Tan, Hongzhuan

    2017-01-01

    Abstract Pediatric sepsis is a burdensome public health problem. Assessing the mortality risk of pediatric sepsis patients, offering effective treatment guidance, and improving prognosis to reduce mortality rates are crucial. We extracted data derived from electronic medical records of pediatric sepsis patients that were collected during the first 24 hours after admission to the pediatric intensive care unit (PICU) of the Hunan Children's Hospital from January 2012 to June 2014. A total of 788 children were randomly divided into a training (592, 75%) and a validation group (196, 25%). The risk factors for mortality among these patients were identified by conducting multivariate logistic regression in the training group. Based on the established logistic regression equation, the logit probabilities for all patients (in both groups) were calculated to verify the model's internal and external validities. According to the training group, 6 variables (brain natriuretic peptide, albumin, total bilirubin, D-dimer, lactate levels, and mechanical ventilation within 24 hours) were included in the final logistic regression model. The areas under the curves of the model were 0.854 (0.826, 0.881) and 0.844 (0.816, 0.873) in the training and validation groups, respectively. The Mortality Risk Model for Pediatric Sepsis we established in this study showed acceptable accuracy to predict the mortality risk in pediatric sepsis patients. PMID:28514310
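Once a logistic regression model like this is fitted, the risk for a new patient is obtained by applying the inverse-logit to the linear predictor. A sketch with hypothetical coefficients and only two of the six variables (the actual fitted equation is reported in the paper and is not reproduced here):

```python
import math

# Hypothetical coefficients for illustration only; the fitted equation
# with all six variables is reported in the paper.
INTERCEPT = -3.2
B_LACTATE = 0.45       # per mmol/L of lactate
B_VENTILATION = 1.1    # mechanical ventilation within 24 h (yes = 1)

def mortality_probability(lactate, ventilated):
    """Inverse-logit of the linear predictor: p = 1 / (1 + exp(-logit))."""
    logit = INTERCEPT + B_LACTATE * lactate + B_VENTILATION * (1 if ventilated else 0)
    return 1.0 / (1.0 + math.exp(-logit))

# A ventilated patient with high lactate gets a much higher predicted risk.
print(round(mortality_probability(8.0, True), 3))
```

Computing this probability for every patient in both groups, and then measuring the area under the ROC curve of those probabilities against observed deaths, is exactly the internal/external validation procedure the abstract describes.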

  12. Numerical studies and metric development for validation of magnetohydrodynamic models on the HIT-SI experiment

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, C., E-mail: hansec@uw.edu [PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Columbia University, New York, New York 10027 (United States); Victor, B.; Morgan, K.; Hossack, A.; Sutherland, D. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); Jarboe, T.; Nelson, B. A. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Marklin, G. [PSI-Center, University of Washington, Seattle, Washington 98195 (United States)

    2015-05-15

    We present application of three scalar metrics derived from the Biorthogonal Decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high performance operation with different injector frequencies are presented, illustrating application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.
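The three BD-derived metrics themselves are defined in the paper; their common ingredient is a measure of correlation between a mode structure extracted from the 192-probe experimental data and the corresponding simulated mode. As a stand-in illustration (not one of the paper's metrics), a normalized inner product between two spatial mode vectors sampled on the same probe array:

```python
import math

def mode_correlation(exp_mode, sim_mode):
    """Normalized inner product of two spatial mode vectors sampled on
    the same probe array: 1.0 means identical structure up to amplitude
    (and sign), 0.0 means orthogonal structures."""
    dot = sum(a * b for a, b in zip(exp_mode, sim_mode))
    n1 = math.sqrt(sum(a * a for a in exp_mode))
    n2 = math.sqrt(sum(b * b for b in sim_mode))
    return abs(dot) / (n1 * n2)

# Same spatial shape at twice the amplitude correlates perfectly.
print(mode_correlation([1.0, 0.0, -1.0], [2.0, 0.0, -2.0]))  # → 1.0
```

Because BD separates spatial structure from amplitude and temporal evolution, comparisons of this kind can be made without assuming any particular spatial or temporal form for the dynamics.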

  13. Political Representation and Gender Inequalities Testing the Validity of Model Developed for Pakistan using a Data Set of Malaysia

    OpenAIRE

    Najeebullah Khan; Adnan Hussein; Zahid Awan; Bakhtiar Khan

    2012-01-01

    This study measured the impacts of six independent variables (political rights, election system type, political quota, literacy rate, labor force participation and GDP per capita at current prices in US dollars) on the dependent variable (percentage of women's representation in the national legislature) using multiple linear regression models. As a first step, we developed and tested the model with out-of-sample data of Pakistan. For model construction and validation, ten years' data from the year 1999 a...

  14. Development and external validation of a risk-prediction model to predict 5-year overall survival in advanced larynx cancer.

    Science.gov (United States)

    Petersen, Japke F; Stuiver, Martijn M; Timmermans, Adriana J; Chen, Amy; Zhang, Hongzhen; O'Neill, James P; Deady, Sandra; Vander Poorten, Vincent; Meulemans, Jeroen; Wennerberg, Johan; Skroder, Carl; Day, Andrew T; Koch, Wayne; van den Brekel, Michiel W M

    2018-05-01

    TNM-classification inadequately estimates patient-specific overall survival (OS). We aimed to improve this by developing a risk-prediction model for patients with advanced larynx cancer. Cohort study. We developed a risk prediction model to estimate the 5-year OS rate based on a cohort of 3,442 patients with T3T4N0N+M0 larynx cancer. The model was internally validated using bootstrapping samples and externally validated on patient data from five external centers (n = 770). The main outcome was performance of the model as tested by discrimination, calibration, and the ability to distinguish risk groups based on tertiles from the derivation dataset. The model performance was compared to a model based on T and N classification only. We included age, gender, T and N classification, and subsite as prognostic variables in the standard model. After external validation, the standard model had a significantly better fit than a model based on T and N classification alone (C statistic, 0.59 vs. 0.55, P statistic to 0.68. A risk prediction model for patients with advanced larynx cancer, consisting of readily available clinical variables, gives more accurate estimates of the 5-year survival rate than a model based on T and N classification alone. 2c. Laryngoscope, 128:1140-1145, 2018. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  15. Development and field validation of a regional, management-scale habitat model: A koala Phascolarctos cinereus case study.

    Science.gov (United States)

    Law, Bradley; Caccamo, Gabriele; Roe, Paul; Truskinger, Anthony; Brassil, Traecey; Gonsalves, Leroy; McConville, Anna; Stanton, Matthew

    2017-09-01

    Species distribution models have great potential to efficiently guide management for threatened species, especially for those that are rare or cryptic. We used MaxEnt to develop a regional-scale model for the koala Phascolarctos cinereus at a resolution (250 m) that could be used to guide management. To ensure the model was fit for purpose, we placed emphasis on validating the model using independently-collected field data. We reduced substantial spatial clustering of records in coastal urban areas using a 2-km spatial filter and by modeling separately two subregions separated by the 500-m elevational contour. A bias file was prepared that accounted for variable survey effort. Frequency of wildfire, soil type, floristics and elevation had the highest relative contribution to the model, while a number of other variables made minor contributions. The model was effective in discriminating different habitat suitability classes when compared with koala records not used in modeling. We validated the MaxEnt model at 65 ground-truth sites using independent data on koala occupancy (acoustic sampling) and habitat quality (browse tree availability). Koala bellows (n = 276) were analyzed in an occupancy modeling framework, while site habitat quality was indexed based on browse trees. Field validation demonstrated a linear increase in koala occupancy with higher modeled habitat suitability at ground-truth sites. Similarly, a site habitat quality index at ground-truth sites was correlated positively with modeled habitat suitability. The MaxEnt model provided a better fit to estimated koala occupancy than the site-based habitat quality index, probably because many variables were considered simultaneously by the model rather than just browse species. The positive relationship of the model with both site occupancy and habitat quality indicates that the model is fit for application at relevant management scales. Field-validated models of similar resolution would assist in

  16. Development, description and validation of a Tritium Environmental Release Model (TERM).

    Science.gov (United States)

    Jeffers, Rebecca S; Parker, Geoffrey T

    2014-01-01

    Tritium is a radioisotope of hydrogen that exists naturally in the environment and may also be released through anthropogenic activities. It bonds readily with hydrogen and oxygen atoms to form tritiated water, which then cycles through the hydrosphere. This paper seeks to model the migration of tritiated species throughout the environment - including atmospheric, river and coastal systems - more comprehensively and more consistently across release scenarios than is currently in the literature. A review of the features and underlying conceptual models of some existing tritium release models was conducted, and an underlying aggregated conceptual process model defined, which is presented. The new model, dubbed 'Tritium Environmental Release Model' (TERM), was then tested against multiple validation sets from literature, including experimental data and reference tests for tritium models. TERM has been shown to be capable of providing reasonable results which are broadly comparable with atmospheric HTO release models from the literature, spanning both continuous and discrete release conditions. TERM also performed well when compared with atmospheric data. TERM is believed to be a useful tool for examining discrete and continuous atmospheric releases or combinations thereof. TERM also includes further capabilities (e.g. river and coastal release scenarios) that may be applicable to certain scenarios that atmospheric models alone may not handle well. Copyright © 2013 Elsevier Ltd. All rights reserved.
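TERM's governing equations are not reproduced in the abstract. For orientation, the continuous atmospheric HTO releases it is validated against are classically described by a ground-reflected Gaussian plume; the sketch below is the standard textbook formula, not TERM itself:

```python
import math

def plume_concentration(q, u, y, z, h, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume: concentration (Bq/m^3) at
    crosswind offset y and height z (m), for a continuous point release
    of strength q (Bq/s) from stack height h (m) in wind speed u (m/s).
    Downwind distance enters through the dispersion parameters
    sigma_y and sigma_z (m), which grow with distance from the source."""
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centerline concentration for a ground-level release.
print(plume_concentration(1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0))
```

In a tritium-specific model such as TERM, a module of this kind would be coupled to the HTO-specific processes the abstract mentions: conversion to tritiated water, deposition, and cycling through river and coastal compartments.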

  17. Can preventable adverse events be predicted among hospitalized older patients? The development and validation of a predictive model.

    NARCIS (Netherlands)

    Steeg, L. van de; Langelaan, M.; Wagner, C.

    2014-01-01

    Objective: To develop and validate a predictive model for preventable adverse events (AEs) in hospitalized older patients, using clinically important risk factors that are readily available on admission. Design: Data from two retrospective patient record review studies on AEs were used. Risk factors

  18. Development and validation of a prediction model for long-term sickness absence based on occupational health survey variables

    NARCIS (Netherlands)

    Roelen, Corne; Thorsen, Sannie; Heymans, Martijn; Twisk, Jos; Bultmann, Ute; Bjorner, Jakob

    2018-01-01

    Purpose: The purpose of this study is to develop and validate a prediction model for identifying employees at increased risk of long-term sickness absence (LTSA), by using variables commonly measured in occupational health surveys. Materials and methods: Based on the literature, 15 predictor

  19. Development and validation of a multivariate prediction model for patients with acute pancreatitis in Intensive Care Medicine.

    Science.gov (United States)

    Zubia-Olaskoaga, Felix; Maraví-Poma, Enrique; Urreta-Barallobre, Iratxe; Ramírez-Puerta, María-Rosario; Mourelo-Fariña, Mónica; Marcos-Neira, María-Pilar; García-García, Miguel Ángel

    2018-03-01

    Development and validation of a multivariate prediction model for patients with acute pancreatitis (AP) admitted to Intensive Care Units (ICU). A prospective multicenter observational study over a 1-year period in 46 international ICUs (EPAMI study). Adults admitted to an ICU with AP and at least one organ failure were included. A multivariate prediction model was developed in a development cohort using the worst data of the ICU stay, based on multivariate analysis with simple imputation. The model was validated in another cohort. 374 patients were included (mortality of 28.9%). Variables with statistical significance in multivariate analysis were age, non-alcoholic and non-biliary etiology, development of shock, development of respiratory failure, need for continuous renal replacement therapy, and intra-abdominal pressure. The model created with these variables presented an AUC of the ROC curve of 0.90 (CI 95% 0.81-0.94) in the validation cohort. We developed a multivariate prediction model, and AP cases could be classified as low mortality risk (between 2 and 9.5 points, mortality of 1.35%), moderate mortality risk (between 10 and 12.5 points, 28.92% mortality), and high mortality risk (13 points or more, mortality of 88.37%). Our model presented a better AUC of the ROC curve than APACHE II (0.91 vs 0.80) and SOFA in the first 24 h (0.91 vs 0.79). We developed and validated a multivariate prediction model, which can be applied at any moment of the ICU stay, with better discriminatory power than APACHE II and SOFA in the first 24 h. Copyright © 2018 IAP and EPC. Published by Elsevier B.V. All rights reserved.
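The three mortality-risk bands reported can be restated as a simple classifier over the point total; the thresholds below are taken from the abstract, while the per-variable point assignments themselves are in the paper:

```python
def mortality_risk_band(score):
    """Risk bands from the abstract: low = 2-9.5 points (1.35% mortality),
    moderate = 10-12.5 points (28.92%), high = 13 points or more (88.37%)."""
    if score < 10:
        return "low"
    if score < 13:
        return "moderate"
    return "high"

print(mortality_risk_band(11))  # → moderate
```

Point-based banding of this kind is what makes the model usable at the bedside at any moment of the ICU stay, without recomputing a full regression equation.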

  20. A Multivariate Model for Prediction of Obstructive Coronary Disease in Patients with Acute Chest Pain: Development and Validation

    Directory of Open Access Journals (Sweden)

    Luis Cláudio Lemos Correia

    Full Text Available Abstract Background: Currently, there is no validated multivariate model to predict the probability of obstructive coronary disease in patients with acute chest pain. Objective: To develop and validate a multivariate model to predict coronary artery disease (CAD) based on variables assessed at admission to the coronary care unit (CCU) due to acute chest pain. Methods: A total of 470 patients were studied, 370 utilized as the derivation sample and the subsequent 100 patients as the validation sample. As the reference standard, angiography was required to rule in CAD (stenosis ≥ 70%), while either angiography or a negative noninvasive test could be used to rule it out. As predictors, 13 baseline variables related to medical history, 14 characteristics of chest discomfort, and eight variables from physical examination or laboratory tests were tested. Results: The prevalence of CAD was 48%. By logistic regression, six variables remained independent predictors of CAD: age, male gender, relief with nitrate, signs of heart failure, positive electrocardiogram, and troponin. The area under the curve (AUC) of this final model was 0.80 (95% confidence interval [95%CI] = 0.75-0.84) in the derivation sample and 0.86 (95%CI = 0.79-0.93) in the validation sample. Hosmer-Lemeshow's test indicated good calibration in both samples (p = 0.98 and p = 0.23, respectively). Compared with a basic model containing electrocardiogram and troponin, the full model provided an AUC increment of 0.07 in both derivation (p = 0.0002) and validation (p = 0.039) samples. Integrated discrimination improvement was 0.09 in both derivation (p < 0.001) and validation (p < 0.0015) samples. Conclusion: A multivariate model was derived and validated as an accurate tool for estimating the pretest probability of CAD in patients with acute chest pain.

  1. Development and validation of a combined phased acoustical radiosity and image source model for predicting sound fields in rooms

    DEFF Research Database (Denmark)

    Marbjerg, Gerd Høy; Brunskog, Jonas; Jeong, Cheol-Ho

    2015-01-01

    A model, combining acoustical radiosity and the image source method, including phase shifts on reflection, has been developed. The model is denoted Phased Acoustical Radiosity and Image Source Method (PARISM), and it has been developed in order to be able to model both specular and diffuse...... radiosity by regarding the model as being stochastic. Three methods of implementation are proposed and investigated, and finally, recommendations are made for their use. Validation of the image source method is done by comparison with finite element simulations of a rectangular room with a porous absorber...

  2. The Development and Empirical Validation of an E-based Supply Chain Strategy Optimization Model

    DEFF Research Database (Denmark)

    Kotzab, Herbert; Skjoldager, Niels; Vinum, Thorkil

    2003-01-01

    Examines the formulation of supply chain strategies in complex environments. Argues that current state‐of‐the‐art e‐business and supply chain management, combined into the concept of e‐SCM, as well as the use of transaction cost theory, network theory and resource‐based theory, altogether can...... be used to form a model for analyzing supply chains with the purpose of reducing the uncertainty of formulating supply chain strategies. Presents e‐supply chain strategy optimization model (e‐SOM) as a way to analyze supply chains in a structured manner as regards strategic preferences for supply chain...... design, relations and resources in the chains with the ultimate purpose of enabling the formulation of optimal, executable strategies for specific supply chains. Uses research results for a specific supply chain to validate the usefulness of the model....

  3. Teaching the Basics: Development and Validation of a Distal Radius Reduction and Casting Model.

    Science.gov (United States)

    Seeley, Mark A; Fabricant, Peter D; Lawrence, J Todd R

    2017-09-01

    Approximately one-third of reduced pediatric distal radius fractures redisplace, resulting in further treatment. Two major modifiable risk factors for loss of reduction are reduction adequacy and cast quality. Closed reduction and immobilization of distal radius fractures is an Accreditation Council for Graduate Medical Education residency milestone. Teaching and assessing competency could be improved with a life-like simulation training tool. Our goal was to develop and validate a realistic distal radius fracture reduction and casting simulator as determined by (1) a questionnaire regarding the "realism" of the model and (2) the quantitative assessments of reduction time, residual angulation, and displacement. A distal radius fracture model was created with radiopaque bony segments and articulating elbows and shoulders. Simulated periosteum and internal deforming forces required proper reduction and casting techniques to achieve and maintain reduction. The forces required were estimated through an iterative process through feedback from experienced clinicians. Embedded monofilaments allowed for quantitative assessment of residual displacement and angulation through the use of fluoroscopy. Subjects were asked to perform closed reduction and apply a long arm fiberglass cast. Primary performance variables assessed included reduction time, residual angulation, and displacement. Secondary performance variables consisted of number of fluoroscopic images, casting time, and cast index (defined as the ratio of the internal width of the forearm cast in the sagittal plane to the internal width in the coronal plane at the fracture site). Subject grading was performed by two blinded reviewers. Interrater reliability was nearly perfect across all measurements (intraclass correlation coefficient range, 0.94-0.99), thus disagreements in measurements were handled by averaging the assessed values. 
After completion the participants answered a Likert-based questionnaire regarding the

  4. Development and validation of a physics-based urban fire spread model

    OpenAIRE

    HIMOTO, Keisuke; TANAKA, Takeyoshi

    2008-01-01

    A computational model for fire spread in a densely built urban area is developed. The model is distinct from existing models in that it explicitly describes fire spread phenomena with physics-based knowledge achieved in the field of fire safety engineering. In the model, urban fire is interpreted as an ensemble of multiple building fires; that is, the fire spread is simulated by predicting behaviors of individual building fires under the thermal influence of neighboring building fires. Adopte...

  5. Development and validation of SUCROS-Cotton : A potential crop growth simulation model for cotton

    NARCIS (Netherlands)

    Zhang, L.; Werf, van der W.; Cao, W.; Li, B.; Pan, X.; Spiertz, J.H.J.

    2008-01-01

    A model for the development, growth and potential production of cotton (SUCROS-Cotton) was developed. Particular attention was given to the phenological development of the plant and the plasticity of fruit growth in response to temperature, radiation, daylength, variety traits, and management. The

  6. Acute Kidney Injury in Trauma Patients Admitted to Critical Care: Development and Validation of a Diagnostic Prediction Model.

    Science.gov (United States)

    Haines, Ryan W; Lin, Shih-Pin; Hewson, Russell; Kirwan, Christopher J; Torrance, Hew D; O'Dwyer, Michael J; West, Anita; Brohi, Karim; Pearse, Rupert M; Zolfaghari, Parjam; Prowle, John R

    2018-02-26

Acute Kidney Injury (AKI) complicating major trauma is associated with increased mortality and morbidity. Traumatic AKI has specific risk factors and a predictable time-course, facilitating diagnostic modelling. In a single-centre, retrospective observational study, we developed risk prediction models for AKI after trauma based on data around intensive care admission. Models predicting AKI were developed from data on 830 patients, using data reduction followed by logistic regression, and were independently validated in a further 564 patients. AKI occurred in 163/830 (19.6%), with 42 (5.1%) receiving renal replacement therapy (RRT). First serum creatinine and phosphate, units of blood transfused in the first 24 h, age, and Charlson score discriminated the need for RRT and AKI early after trauma. For RRT, c-statistics were good to excellent: development 0.92 (0.88-0.96), validation 0.91 (0.86-0.97). Modelling AKI stage 2-3, c-statistics were also good: development 0.81 (0.75-0.88), validation 0.83 (0.74-0.92). The model predicting AKI stage 1-3 performed moderately: development c-statistic 0.77 (0.72-0.81), validation 0.70 (0.64-0.77). Despite good discrimination of the need for RRT, the positive predictive value (PPV) at the optimal cut-off was only 23.0% (13.7-42.7) in development. However, the PPV for the alternative endpoint of RRT and/or death improved to 41.2% (34.8-48.1), highlighting death as a clinically relevant endpoint alongside RRT.
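The gap above between excellent discrimination (c-statistic 0.92) and a PPV of only 23% is a base-rate effect: when the outcome is rare (RRT in ~5% of admissions), even a specific cut-off produces many false positives. A minimal sketch using Bayes' rule; the sensitivity and specificity values are illustrative assumptions, not figures from the study:

```python
# Positive predictive value from sensitivity, specificity, and prevalence.
# Shows why a rare outcome depresses PPV even for a well-discriminating
# model. The sensitivity/specificity below are assumed, not the study's.

def ppv(sensitivity, specificity, prevalence):
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

print(round(ppv(0.90, 0.85, 0.051), 3))  # rare outcome (5.1%) -> low PPV
print(round(ppv(0.90, 0.85, 0.300), 3))  # same test, commoner outcome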

  7. Development and validation of a laparoscopic hysterectomy cuff closure simulation model for surgical training.

    Science.gov (United States)

    Tunitsky-Bitton, Elena; Propst, Katie; Muffly, Tyler

    2016-03-01

The number of robotically assisted hysterectomies is increasing, and therefore, the opportunities for trainees to become competent in performing traditional laparoscopic hysterectomy are decreasing. Simulation-based training is ideal for filling this gap in training. The objective of the study was to design a surgical model for training in laparoscopic vaginal cuff closure and to present evidence of its validity and reliability as an assessment and training tool. Participants included gynecology staff and trainees at 2 tertiary care centers. Experienced surgeons were also recruited at the combined International Urogynecologic Association and American Urogynecologic Society scientific meeting. Participants included 19 experts and 21 trainees. All participants were recorded using the laparoscopic hysterectomy cuff closure simulation model. The model was constructed using an advanced uterine manipulation system with a sacrocolpopexy tip/vaginal stent, with a vaginal cuff constructed from neoprene material, lined with a swimsuit material (nylon and spandex), and secured to the vaginal stent with a plastic cable tie. The uterine manipulation system was attached to a Fundamentals of Laparoscopic Surgery box trainer using a metal bracket. Performance was evaluated using the Global Operative Assessment of Laparoscopic Skills scale. In addition, needle handling, knot tying, and incorporation of the epithelial edge were also evaluated. The Student t test was used to compare the scores and the operating times between the groups. Interrater reliability between the scores of the 2 masked experts was measured using the intraclass correlation coefficient. Total and annual experience with laparoscopic suturing, and specifically vaginal cuff closure, varied greatly among the participants.
For the construct validity, the participants in the expert group received significantly higher scores in each of the domains of the Global Operative Assessment of Laparoscopic Skills

  8. Toward the Development and Validation of a Career Coach Competency Model

    Science.gov (United States)

    Hatala, John-Paul; Hisey, Lee

    2011-01-01

    The career coaching profession is a dynamic field that has grown over the last decade. However, there exists a limitation to this field's development, as there is no universally accepted definition or empirically based competencies. There were three phases to the study. In the first phase, a conceptual model was developed that highlights four…

  9. Development and validation of a free-piston engine generator numerical model

    International Nuclear Information System (INIS)

    Jia, Boru; Zuo, Zhengxing; Tian, Guohong; Feng, Huihua; Roskilly, A.P.

    2015-01-01

Highlights: • Detailed numerical model of a free-piston engine generator is presented. • Sub-models for both the starting process and steady operation are derived. • Simulation results show good agreement with prototype test data. • Engine performance with different starting motor forces and varied loads is simulated. • The efficiency of the prototype is estimated to be 31.5% at a power output of 4 kW under full load. - Abstract: This paper focuses on the numerical modelling of a spark-ignited free-piston engine generator and the model validation with test results. Detailed sub-models for both the starting process and steady operation were derived. The compression and expansion processes were not regarded as ideal-gas isentropic processes; both heat transfer and air leakage were taken into consideration. The simulation results show good agreement with the prototype test data for both the starting process and steady operation. During the starting process, the difference in the in-cylinder gas pressure can be controlled within 1 bar for every running cycle. For the steady operation process, the difference was less than 5% and the areas enclosed on the pressure–volume diagram were similar, indicating that the power produced by the engine and the engine efficiency could be predicted by this model. Based on this model, the starting process with different starting motor forces and the combustion process with various throttle openings were simulated. The engine performance during stable operation at 100% engine load was predicted, and the efficiency of the prototype was estimated to be 31.5% at a power output of 4 kW
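The abstract notes that compression and expansion were not treated as ideal isentropic processes because heat transfer and leakage were modelled. A common simplification of that effect is a polytropic process p·V^n = const with an exponent below the isentropic gamma. A minimal sketch of the difference this makes to peak compression pressure; every number here (exponents, compression ratio, initial pressure) is an illustrative assumption, not prototype data:

```python
# Polytropic vs isentropic compression: heat loss to the walls is often
# folded into a polytropic exponent n < gamma. All values are assumed
# for illustration and are not taken from the free-piston prototype.

gamma = 1.4     # isentropic exponent for air
n_poly = 1.32   # polytropic exponent reflecting wall heat loss (assumed)
p1_bar = 1.0    # pressure at start of compression (assumed)
cr = 9.0        # effective compression ratio (assumed)

p_isentropic = p1_bar * cr ** gamma   # no heat transfer
p_polytropic = p1_bar * cr ** n_poly  # with heat loss folded in
print(f"isentropic ~{p_isentropic:.1f} bar, polytropic ~{p_polytropic:.1f} bar")
```

The lower polytropic peak pressure is one reason a model that assumes isentropic compression overpredicts indicated work.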

  10. CFD Model Development and validation for High Temperature Gas Cooled Reactor Cavity Cooling System (RCCS) Applications

    International Nuclear Information System (INIS)

    Hassan, Yassin; Corradini, Michael; Tokuhiro, Akira; Wei, Thomas Y.C.

    2014-01-01

The Reactor Cavity Cooling System (RCCS) is a passive safety system that will be incorporated in the VHTR design. The system is designed to remove heat from the reactor cavity and maintain the temperature of structures and concrete walls under desired limits during normal operation (steady state) and accident scenarios. A small-scale (1:23) water-cooled experimental facility was scaled, designed, and constructed in order to study the complex thermal-hydraulic phenomena taking place in the RCCS during steady-state and transient conditions. The facility represents a portion of the reactor vessel with nine stainless steel coolant risers and utilizes water as coolant. The facility was equipped with instrumentation to measure temperatures and flow rates, and a general verification was completed during shakedown. A model of the experimental facility was prepared using RELAP5-3D and simulations were performed to validate the scaling procedure. The experimental data produced during the steady-state run were compared with the simulation results obtained using RELAP5-3D. The overall behavior of the facility met expectations. The facility's capabilities were confirmed to be very promising for performing additional experimental tests, including flow visualization, and for producing data for code validation.

  11. CFD Model Development and validation for High Temperature Gas Cooled Reactor Cavity Cooling System (RCCS) Applications

    Energy Technology Data Exchange (ETDEWEB)

Hassan, Yassin [Univ. of Wisconsin, Madison, WI (United States); Texas A & M Univ., College Station, TX (United States)]; Corradini, Michael; Tokuhiro, Akira; Wei, Thomas Y.C.

    2014-07-14

The Reactor Cavity Cooling System (RCCS) is a passive safety system that will be incorporated in the VHTR design. The system is designed to remove heat from the reactor cavity and maintain the temperature of structures and concrete walls under desired limits during normal operation (steady state) and accident scenarios. A small-scale (1:23) water-cooled experimental facility was scaled, designed, and constructed in order to study the complex thermal-hydraulic phenomena taking place in the RCCS during steady-state and transient conditions. The facility represents a portion of the reactor vessel with nine stainless steel coolant risers and utilizes water as coolant. The facility was equipped with instrumentation to measure temperatures and flow rates, and a general verification was completed during shakedown. A model of the experimental facility was prepared using RELAP5-3D and simulations were performed to validate the scaling procedure. The experimental data produced during the steady-state run were compared with the simulation results obtained using RELAP5-3D. The overall behavior of the facility met expectations. The facility's capabilities were confirmed to be very promising for performing additional experimental tests, including flow visualization, and for producing data for code validation.

  12. A clinical reasoning model focused on clients' behaviour change with reference to physiotherapists: its multiphase development and validation.

    Science.gov (United States)

    Elvén, Maria; Hochwälder, Jacek; Dean, Elizabeth; Söderlund, Anne

    2015-05-01

    A biopsychosocial approach and behaviour change strategies have long been proposed to serve as a basis for addressing current multifaceted health problems. This emphasis has implications for clinical reasoning of health professionals. This study's aim was to develop and validate a conceptual model to guide physiotherapists' clinical reasoning focused on clients' behaviour change. Phase 1 consisted of the exploration of existing research and the research team's experiences and knowledge. Phases 2a and 2b consisted of validation and refinement of the model based on input from physiotherapy students in two focus groups (n = 5 per group) and from experts in behavioural medicine (n = 9). Phase 1 generated theoretical and evidence bases for the first version of a model. Phases 2a and 2b established the validity and value of the model. The final model described clinical reasoning focused on clients' behaviour change as a cognitive, reflective, collaborative and iterative process with multiple interrelated levels that included input from the client and physiotherapist, a functional behavioural analysis of the activity-related target behaviour and the selection of strategies for behaviour change. This unique model, theory- and evidence-informed, has been developed to help physiotherapists to apply clinical reasoning systematically in the process of behaviour change with their clients.

  13. Developing and Validating Path-Dependent Uncertainty Estimates for use with the Regional Seismic Travel Time (RSTT) Model

    Science.gov (United States)

    Begnaud, M. L.; Anderson, D. N.; Phillips, W. S.; Myers, S. C.; Ballard, S.

    2016-12-01

The Regional Seismic Travel Time (RSTT) tomography model has been developed to improve travel time predictions for regional phases (Pn, Sn, Pg, Lg) in order to increase seismic location accuracy, especially for explosion monitoring. The RSTT model is specifically designed to exploit regional phases for location, especially when combined with teleseismic arrivals. The latest RSTT model (version 201404um) has been released (http://www.sandia.gov/rstt). Travel time uncertainty estimates for RSTT are determined using one-dimensional (1D), distance-dependent error models, which have the benefit of being very fast to use in standard location algorithms but do not account for path-dependent variations in error or for structural inadequacy of the RSTT model (i.e., model error). Although global in extent, the RSTT tomography model is only defined in areas where data exist, and a simple 1D error model does not accurately describe areas where RSTT has not been calibrated. We are developing and validating a new error model for RSTT phase arrivals by mathematically deriving this multivariate model directly from a unified model of RSTT embedded in a statistical random-effects model that captures distance, path, and model error effects. An initial method developed is a two-dimensional, path-distributed method using residuals. The goals for any RSTT uncertainty method are for it to be readily useful for the standard RSTT user and to improve travel time uncertainty estimates for location. We have successfully tested the new error model for Pn phases and will demonstrate the method and validation of the error model for Sn, Pg, and Lg phases.
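A 1D, distance-dependent error model of the kind the abstract contrasts with its path-dependent approach can be sketched as a piecewise-linear lookup of uncertainty versus epicentral distance. The knot values below are invented for illustration; they are not RSTT's:

```python
# 1D distance-dependent travel-time uncertainty: (distance, sigma) knots
# with linear interpolation between them. Knot values are hypothetical.

knots = [(0.0, 1.8), (5.0, 1.2), (10.0, 1.0), (15.0, 1.4)]  # (degrees, s)

def sigma_at(distance_deg):
    """Piecewise-linear travel-time uncertainty at an epicentral distance."""
    if distance_deg <= knots[0][0]:
        return knots[0][1]
    for (d0, s0), (d1, s1) in zip(knots, knots[1:]):
        if distance_deg <= d1:
            w = (distance_deg - d0) / (d1 - d0)
            return s0 + w * (s1 - s0)
    return knots[-1][1]

print(round(sigma_at(7.5), 3))  # halfway between the 5 and 10 deg knots -> 1.1
```

Because such a model depends only on distance, two paths of equal length through well-calibrated and uncalibrated regions receive the same sigma, which is the limitation the path-dependent error model above is designed to remove.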

  14. Development, Implementation and Experimental Validations of Activation Products Models for Water Pool Reactors

    International Nuclear Information System (INIS)

    Petriw, S.N.

    2001-01-01

Some parameters were obtained from both calculations and experiments in order to determine the sources of the main activation products in water pool reactors. In this case, the study was done in the RA-6 reactor (Centro Atomico Bariloche, Argentina). In normal operation, the neutron flux in the core activates the aluminium plates. The activity in the coolant water comes from activation of its impurities and mainly from a quantity of aluminium that, once activated, leaves the cladding and is transported by the water cooling system. This quantity depends on the 'recoil range' of each activation reaction. The 'staying time' in the pool (the time that nuclides are circulating in the reactor pool) is another characteristic parameter of the system. The stationary-state activity of some nuclides depends on this time. Several theoretical models of activation in the coolant water system are also shown, together with their experimental validations

  15. Locating the Seventh Cervical Spinous Process: Development and Validation of a Multivariate Model Using Palpation and Personal Information.

    Science.gov (United States)

    Ferreira, Ana Paula A; Póvoa, Luciana C; Zanier, José F C; Ferreira, Arthur S

    2017-02-01

The aim of this study was to develop and validate a multivariate prediction model, guided by palpation and personal information, for locating the seventh cervical spinous process (C7SP). A single-blinded, cross-sectional study at a primary-to-tertiary health care center was conducted for model development and temporal validation. One hundred sixty participants were prospectively included for the model development (n = 80) and time-split validation (n = 80) stages. The C7SP was located using the thorax-rib static method (TRSM). Participants underwent chest radiography for assessment of the inner body structure located with TRSM, using radio-opaque markers placed over the skin. Age, sex, height, body mass, body mass index, and vertex-marker distance (DV-M) were used to predict the distance from the C7SP to the vertex (DV-C7). Multivariate linear regression modeling, limits-of-agreement plots, histograms of residuals, receiver operating characteristic (ROC) curves, and confusion tables were analyzed. The multivariate linear prediction model for DV-C7 (in centimeters) was DV-C7 = 0.986 DV-M + 0.018(mass) + 0.014(age) - 1.008. ROC curves showed better discrimination with DV-C7 (area under the curve = 0.661; 95% confidence interval = 0.541-0.782; P = .015) than with DV-M (area under the curve = 0.480; 95% confidence interval = 0.345-0.614; P = .761), with respective cutoff points at 23.40 cm (sensitivity = 41%, specificity = 63%) and 24.75 cm (sensitivity = 69%, specificity = 52%). The C7SP was correctly located more often when using predicted DV-C7 in the validation sample than when using the TRSM in the development sample: n = 53 (66%) vs n = 32 (40%), P information. Copyright © 2016. Published by Elsevier Inc.
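The regression equation reported above can be applied directly to predict the vertex-to-C7SP distance. A minimal sketch; the example inputs (vertex-marker distance, body mass, age) are hypothetical:

```python
# Evaluating the published multivariate model for the vertex-to-C7SP
# distance (in cm). Coefficients are from the abstract; inputs are
# hypothetical example values.

def dv_c7_cm(dv_m_cm, mass_kg, age_years):
    """Predicted vertex-to-C7SP distance (cm) from the model above."""
    return 0.986 * dv_m_cm + 0.018 * mass_kg + 0.014 * age_years - 1.008

print(round(dv_c7_cm(24.0, 70.0, 35.0), 2))  # -> 24.41
```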

  16. Development and Validation of an Enhanced Coupled-Field Model for PZT Cantilever Bimorph Energy Harvester

    Directory of Open Access Journals (Sweden)

    Long Zhang

    2013-01-01

Full Text Available The power source with a limited life span has motivated the development of energy harvesters that can scavenge ambient environmental energy and convert it into electrical energy. With the coupled-field characteristics of structure to electricity, piezoelectric energy harvesters are under consideration as a means of converting mechanical energy to electrical energy, with the goal of realizing completely self-powered sensor systems. In this paper, two previous models in the literature for predicting the open-circuit and closed-circuit voltages of a piezoelectric cantilever bimorph (PCB) energy harvester are first described, that is, the mechanical equivalent spring-mass-damper model and the electrical equivalent circuit model. Then, the development of an enhanced coupled-field model for the PCB energy harvester, based on another previous model in the literature and using a conservation-of-energy method, is presented. Further, laboratory experiments are carried out to evaluate the enhanced coupled-field model and the other two previous models. The comparison results show that the enhanced coupled-field model can better predict the open-circuit and closed-circuit voltages of the PCB energy harvester with a proof mass bonded at the free end of the structure in order to increase the energy-harvesting level of the system.

  17. Dynamic model development and validation for a nitrifying moving bed biofilter: Effect of temperature and influent load on the performance

    DEFF Research Database (Denmark)

    Sin, Gürkan; Weijma, Jan; Spanjers, Henri

    2008-01-01

    A mathematical model with adequate complexity integrating hydraulics, biofilm and microbial conversion processes is successfully developed for a continuously moving bed biofilter performing tertiary nitrification. The model was calibrated and validated using data from Nether Stowey pilot plant...... on the ammonium removal efficiency, doubling nitrification capacity every 5 degrees C increase. However, at temperatures higher than 20 degrees C, the biofilm thickness starts to decrease due to increased decay rate. The influent nitrogen load was also found to be influential on the filter performance, while...... the hydraulic loading had relatively negligible impact. Overall, the calibrated model can now reliably be used for design and process optimization purposes....

  18. Development and validation of clinical prediction models for mortality, functional outcome and cognitive impairment after stroke: a study protocol.

    Science.gov (United States)

    Fahey, Marion; Rudd, Anthony; Béjot, Yannick; Wolfe, Charles; Douiri, Abdel

    2017-08-18

Stroke is a leading cause of adult disability and death worldwide. The neurological impairments associated with stroke prevent patients from performing basic daily activities and have an enormous impact on families and caregivers. Practical and accurate tools to assist in predicting outcome after stroke at the patient level can provide significant aid for patient management. Furthermore, prediction models of this kind can be useful for clinical research, health economics, policymaking and clinical decision support. 2869 patients with first-ever stroke from the South London Stroke Register (SLSR) (1995-2004) will be included in the development cohort. We will use information captured after baseline to construct multilevel models and a Cox proportional hazards model to predict cognitive impairment, functional outcome and mortality up to 5 years after stroke. Repeated random subsampling validation (Monte Carlo cross-validation) will be evaluated in model development. Data from participants recruited to the stroke register (2005-2014) will be used for temporal validation of the models. Data from participants recruited to the Dijon Stroke Register (1985-2015) will be used for external validation. Discrimination, calibration and clinical utility of the models will be presented. Patients, or their relatives for patients who cannot consent, gave written informed consent to participate in stroke-related studies within the SLSR. The SLSR design was approved by the ethics committees of Guy's and St Thomas' NHS Foundation Trust, Kings College Hospital, Queen Square and Westminster Hospitals (London). The Dijon Stroke Registry was approved by the Comité National des Registres and the InVS and has the authorisation of the Commission Nationale de l'Informatique et des Libertés. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
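Repeated random subsampling validation (Monte Carlo cross-validation), as planned in the protocol above, repeatedly splits the cohort at random, fits on each training split, scores on the held-out split, and averages the scores. A minimal sketch with a toy model (the training mean) and toy data standing in for a real prediction model:

```python
# Monte Carlo cross-validation: many random train/test splits, average
# held-out score. The "model" here (training mean, scored by mean absolute
# error) is a toy stand-in for a real clinical prediction model.

import random
import statistics

def monte_carlo_cv(data, fit, score, n_splits=100, test_frac=0.3, seed=0):
    rng = random.Random(seed)
    results = []
    for _ in range(n_splits):
        shuffled = data[:]
        rng.shuffle(shuffled)
        cut = int(len(shuffled) * (1 - test_frac))
        train, test = shuffled[:cut], shuffled[cut:]
        model = fit(train)
        results.append(score(model, test))
    return statistics.mean(results)

data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
fit = lambda train: statistics.mean(train)
score = lambda m, test: statistics.mean(abs(x - m) for x in test)
print(round(monte_carlo_cv(data, fit, score), 2))
```

Unlike k-fold cross-validation, the number of splits is independent of the test fraction, which makes the variance of the estimate easy to control.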

  19. Modeling the Relationship between Safety Climate and Safety Performance in a Developing Construction Industry: A Cross-Cultural Validation Study.

    Science.gov (United States)

    Zahoor, Hafiz; Chan, Albert P C; Utama, Wahyudi P; Gao, Ran; Zafar, Irfan

    2017-03-28

This study attempts to validate a safety performance (SP) measurement model in the cross-cultural setting of a developing country. In addition, it highlights the variations in investigating the relationship between safety climate (SC) factors and SP indicators. The data were collected from forty under-construction multi-storey building projects in Pakistan. Based on the results of exploratory factor analysis, a SP measurement model was hypothesized. It was tested and validated by conducting confirmatory factor analysis on calibration and validation sub-samples, respectively. The study confirmed the significant positive impact of SC on safety compliance and safety participation, and its negative impact on the number of self-reported accidents/injuries. However, the number of near-misses could not be retained in the final SP model because it attained a lower standardized path coefficient value. Moreover, instead of safety participation, safety compliance established a stronger impact on SP. The study uncovered safety enforcement and promotion as a novel SC factor, whereas safety rules and work practices was identified as the most neglected factor. The study contributed to the body of knowledge by unveiling the deviations in existing dimensions of SC and SP. The refined model is expected to concisely measure SP in the Pakistani construction industry; however, caution must be exercised when generalizing the study results to other developing countries.

  20. Development and validation of a prediction model for insulin-associated hypoglycemia in non-critically ill hospitalized adults.

    Science.gov (United States)

    Mathioudakis, Nestoras Nicolas; Everett, Estelle; Routh, Shuvodra; Pronovost, Peter J; Yeh, Hsin-Chieh; Golden, Sherita Hill; Saria, Suchi

    2018-01-01

To develop and validate a multivariable prediction model for insulin-associated hypoglycemia in non-critically ill hospitalized adults. We collected pharmacologic, demographic, laboratory, and diagnostic data from 128 657 inpatient days in which at least 1 unit of subcutaneous insulin was administered in the absence of intravenous insulin, total parenteral nutrition, or insulin pump use (index days). These data were used to develop multivariable prediction models for biochemical and clinically significant hypoglycemia (blood glucose (BG) of ≤70 mg/dL and model development and validation, respectively. Using predictors of age, weight, admitting service, insulin doses, mean BG, nadir BG, BG coefficient of variation (CVBG), diet status, type 1 diabetes, type 2 diabetes, acute kidney injury, chronic kidney disease (CKD), liver disease, and digestive disease, our model achieved a c-statistic of 0.77 (95% CI 0.75 to 0.78), a positive likelihood ratio (+LR) of 3.5 (95% CI 3.4 to 3.6), and a negative likelihood ratio (-LR) of 0.32 (95% CI 0.30 to 0.35) for prediction of biochemical hypoglycemia. Using predictors of sex, weight, insulin doses, mean BG, nadir BG, CVBG, diet status, type 1 diabetes, type 2 diabetes, CKD stage, and steroid use, our model achieved a c-statistic of 0.80 (95% CI 0.78 to 0.82), a +LR of 3.8 (95% CI 3.7 to 4.0), and a -LR of 0.2 (95% CI 0.2 to 0.3) for prediction of clinically significant hypoglycemia. Hospitalized patients at risk of insulin-associated hypoglycemia can be identified using validated prediction models, which may support the development of real-time preventive interventions.
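Likelihood ratios like the +LR of 3.8 and -LR of 0.2 reported above follow directly from sensitivity and specificity. A minimal sketch; the inputs are assumed round numbers chosen to land near those values, not the study's own operating point:

```python
# Likelihood ratios from a binary test's sensitivity and specificity:
#   +LR = sens / (1 - spec)   -LR = (1 - sens) / spec
# Inputs below are illustrative assumptions, not the study's values.

def likelihood_ratios(sensitivity, specificity):
    positive_lr = sensitivity / (1 - specificity)
    negative_lr = (1 - sensitivity) / specificity
    return positive_lr, negative_lr

plr, nlr = likelihood_ratios(0.84, 0.78)
print(round(plr, 1), round(nlr, 1))  # -> 3.8 0.2
```

A +LR near 4 roughly quadruples the pre-test odds of hypoglycemia after a positive prediction, while a -LR of 0.2 cuts the odds fivefold after a negative one.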

  1. Development-based Trust: Proposing and Validating a New Trust Measurement Model for Buyer-Seller Relationships

    Directory of Open Access Journals (Sweden)

    José Mauro da Costa Hernandez

    2010-04-01

    Full Text Available This study proposes and validates a trust measurement model for buyer-seller relationships. Baptized as development-based trust, the model encompasses three dimensions of trust: calculus-based, knowledge-based and identification-based. In addition to recognizing that trust is a multidimensional construct, the model also assumes that trust can evolve to take on a different character depending on the stage of the relationship. In order to test the proposed model and compare it to the characteristic-based trust measurement model, the measure most frequently used in the buyer-seller relationship literature, data were collected from 238 clients of an IT product wholesaler. The results show that the scales are valid and reliable and the proposed development-based trust measurement model is superior to the characteristic-based trust measurement model in terms of its ability to explain certain variables of interest in buyer-seller relationships (long-term relationship orientation, information sharing, behavioral loyalty and future intentions. Implications for practice, limitations and suggestions for future studies are discussed.

  2. From control to causation: Validating a 'complex systems model' of running-related injury development and prevention.

    Science.gov (United States)

    Hulme, A; Salmon, P M; Nielsen, R O; Read, G J M; Finch, C F

    2017-11-01

    There is a need for an ecological and complex systems approach for better understanding the development and prevention of running-related injury (RRI). In a previous article, we proposed a prototype model of the Australian recreational distance running system which was based on the Systems-Theoretic Accident Model and Processes (STAMP) method. That model included the influence of political, organisational, managerial, and sociocultural determinants alongside individual-level factors in relation to RRI development. The purpose of this study was to validate that prototype model by drawing on the expertise of both systems thinking and distance running experts. This study used a modified Delphi technique involving a series of online surveys (December 2016–March 2017). The initial survey was divided into four sections containing a total of seven questions pertaining to different features associated with the prototype model. Consensus in opinion about the validity of the prototype model was reached when the number of experts who agreed or disagreed with a survey statement was ≥75% of the total number of respondents. Two Delphi rounds were needed to validate the prototype model. Of the 51 experts who were initially contacted, 50.9% (n = 26) completed the first round of the Delphi, and 92.3% (n = 24) of those in the first round participated in the second. Most of the 24 full participants considered themselves to be a running expert (66.7%), and approximately a third indicated their expertise as a systems thinker (33.3%). After the second round, 91.7% of the experts agreed that the prototype model was a valid description of the Australian distance running system. This is the first study to formally examine the development and prevention of RRI from an ecological and complex systems perspective. The validated model of the Australian distance running system facilitates theoretical advancement in terms of identifying practical system
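
The consensus rule stated above (agreement or disagreement by ≥75% of respondents) is simple to express in code. A minimal sketch, with response counts invented for the example rather than taken from the study:

```python
def consensus_reached(agree, disagree, other, threshold=0.75):
    """Consensus on a Delphi statement: the larger of the agree/disagree
    camps must make up at least `threshold` of all respondents."""
    total = agree + disagree + other
    return max(agree, disagree) / total >= threshold

consensus_reached(22, 1, 1)   # 22/24 ~ 92% agreement -> consensus
consensus_reached(15, 6, 3)   # 15/24 = 62.5% -> no consensus
```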

  3. Development and validation of the Bullying and Cyberbullying Scale for Adolescents: A multi-dimensional measurement model.

    Science.gov (United States)

    Thomas, Hannah J; Scott, James G; Coates, Jason M; Connor, Jason P

    2018-05-03

    Intervention on adolescent bullying is reliant on valid and reliable measurement of victimization and perpetration experiences across different behavioural expressions. This study developed and validated a survey tool that integrates measurement of both traditional and cyber bullying to test a theoretically driven multi-dimensional model. Adolescents from 10 mainstream secondary schools completed a baseline and follow-up survey (N = 1,217; mean age = 14 years; 66.2% male). The Bullying and Cyberbullying Scale for Adolescents (BCS-A) developed for this study comprised parallel victimization and perpetration subscales, each with 20 items. Additional measures of bullying (Olweus Global Bullying and the Forms of Bullying Scale [FBS]), as well as measures of internalizing and externalizing problems, school connectedness, social support, and personality, were used to further assess validity. Factor structure was determined, and then the suitability of items was assessed according to the following criteria: (1) factor interpretability, (2) item correlations, (3) model parsimony, and (4) measurement equivalence across victimization and perpetration experiences. The final models comprised four factors: physical, verbal, relational, and cyber. The final scale was revised to two 13-item subscales. The BCS-A demonstrated acceptable concurrent and convergent validity (internalizing and externalizing problems, school connectedness, social support, and personality), as well as predictive validity over 6 months. The BCS-A has sound psychometric properties. This tool establishes measurement equivalence across types of involvement and behavioural forms common among adolescents. An improved measurement method could add greater rigour to the evaluation of intervention programmes and also enable interventions to be tailored to subscale profiles. © 2018 The British Psychological Society.

  4. Modelling the fate of sulphur-35 in crops. 2. Development and validation of the CROPS-35 model

    International Nuclear Information System (INIS)

    Collins, Chris; Cunningham, Nathan

    2005-01-01

    Gas-cooled nuclear power plants in the UK release sulphur-35 during their routine operation, which can be readily assimilated by vegetation. It is therefore necessary to be able to model the uptake of such releases in order to quantify any potential contamination of the food chain. A model is described which predicts the concentration of 35S in crop components following an aerial gaseous release. Following deposition, the allocation to crop components is determined by an export function from a labile pool, the leaves, to those components growing most actively post exposure. The growth rates are determined by crop growth data, which are also used to determine the concentrations. The loss of activity is controlled by radioactive decay only. The paper describes the calibration and the validation of the model. To improve the model, further experimental work is required, particularly on the export kinetics of 35S. It may be possible to adapt such a modelling approach to the prediction of crop content for gaseous releases of 3H and 14C from nuclear facilities. - The calibration and validation of a model for the prediction of the fate of 35S in vegetation is described
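
Since the abstract states that loss of activity is controlled by radioactive decay only, that part of the model reduces to first-order decay. A minimal sketch, assuming the textbook sulphur-35 half-life of about 87.4 days (a value not given in the abstract):

```python
import math

HALF_LIFE_35S_DAYS = 87.4  # assumed textbook value for sulphur-35
DECAY_CONST = math.log(2) / HALF_LIFE_35S_DAYS  # per day

def remaining_activity(a0_bq, t_days):
    """Activity (Bq) in a crop component t days after allocation,
    with radioactive decay as the only loss process."""
    return a0_bq * math.exp(-DECAY_CONST * t_days)

remaining_activity(100.0, 87.4)  # one half-life: 50 Bq
```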

  5. Development of coupled models and their validation against experiments -DECOVALEX project

    International Nuclear Information System (INIS)

    Stephansson, O.; Jing, L.; Kautsky, F.

    1995-01-01

    DECOVALEX is an international co-operative research project for theoretical and experimental studies of coupled thermal, hydrological and mechanical processes in hard rocks. Different mathematical models and computer codes have been developed by research teams from different countries. These models and codes are used to study the so-called Bench-Mark Test and Test Case problems developed within this project. Bench-Mark Tests are defined as hypothetical initial-boundary value problems of a generic nature, and Test Cases are experimental investigations of part or full aspects of coupled thermo-hydro-mechanical processes in hard rocks. Analytical and semi-analytical solutions related to coupled T-H-M processes are also developed for problems with simpler geometry and initial-boundary conditions. These solutions are developed to verify algorithms and their computer implementations. In this contribution the motivation, organization, approach and current status of the project are presented, together with definitions of the Bench-Mark Test and Test Case problems. The definition and part of the results for a BMT problem (BMT3) for a near-field repository model are described as an example. (authors). 3 refs., 11 figs., 3 tabs

  6. Development, external validation and clinical usefulness of a practical prediction model for radiation-induced dysphagia in lung cancer patients

    International Nuclear Information System (INIS)

    Dehing-Oberije, Cary; De Ruysscher, Dirk; Petit, Steven; Van Meerbeeck, Jan; Vandecasteele, Katrien; De Neve, Wilfried; Dingemans, Anne Marie C.; El Naqa, Issam; Deasy, Joseph; Bradley, Jeff; Huang, Ellen; Lambin, Philippe

    2010-01-01

    Introduction: Acute dysphagia is a distressing dose-limiting toxicity occurring frequently during concurrent chemo-radiation or high-dose radiotherapy for lung cancer. It can lead to treatment interruptions and thus jeopardize survival. Although a number of predictive factors have been identified, it is still not clear how these could offer assistance for treatment decision making in daily clinical practice. Therefore, we have developed and validated a nomogram to predict this side-effect. In addition, clinical usefulness was assessed by comparing model predictions to physicians' predictions. Materials and methods: Clinical data from 469 inoperable lung cancer patients, treated with curative intent, were collected prospectively. A prediction model for acute radiation-induced dysphagia was developed. Model performance was evaluated by the c-statistic and assessed using bootstrapping as well as two external datasets. In addition, a prospective study was conducted comparing model to physicians' predictions in 138 patients. Results: The final multivariate model consisted of age, gender, WHO performance status, mean esophageal dose (MED), maximum esophageal dose (MAXED) and overall treatment time (OTT). The c-statistic, assessed by bootstrapping, was 0.77. External validation yielded an AUC of 0.94 on the Ghent data and 0.77 on the Washington University St. Louis data for dysphagia ≥ grade 3. Comparing model predictions to the physicians' predictions resulted in an AUC of 0.75 versus 0.53, respectively. Conclusions: The proposed model performed well, was successfully validated, and demonstrated the ability to predict acute severe dysphagia remarkably better than the physicians. Therefore, this model could be used in clinical practice to identify patients at high or low risk.

  7. Applying the concept of consumer confusion to healthcare: development and validation of a patient confusion model.

    Science.gov (United States)

    Gebele, Christoph; Tscheulin, Dieter K; Lindenmeier, Jörg; Drevs, Florian; Seemann, Ann-Kathrin

    2014-01-01

    As patient autonomy and consumer sovereignty increase, information provision is considered essential to decrease information asymmetries between healthcare service providers and patients. However, greater availability of third party information sources can have negative side effects. Patients can be confused by the nature, as well as the amount, of quality information when making choices among competing health care providers. Therefore, the present study explores how information may cause patient confusion and affect the behavioral intention to choose a health care provider. Based on a quota sample of German citizens (n = 198), the present study validates a model of patient confusion in the context of hospital choice. The study results reveal that perceived information overload, perceived similarity, and perceived ambiguity of health information impact the affective and cognitive components of patient confusion. Confused patients have a stronger inclination to hastily narrow down their set of possible decision alternatives. Finally, an empirical analysis reveals that the affective and cognitive components of patient confusion mediate perceived information overload, perceived similarity, and perceived ambiguity of information. © The Author(s) 2014.

  8. Development and validation of a heuristic model for evaluation of the team performance of operators in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Byun, Seong Nam; Lee, Dhong Hoon

    2011-01-01

    Highlights: • We develop an estimation model for evaluation of the team performance of the MCR. • To build the model, we extract team performance factors through reviewing the literature and identifying behavior markers. • We validate that the model is adaptable to the advanced MCR of nuclear power plants. • As a result, we find that the model is a systematic and objective way to measure team performance. - Abstract: Global concerns about safety in the digital technology of the main control room (MCR) are growing as domestic and foreign nuclear power plants are developed with computerized control facilities and human-system interfaces. Within a compact space, digital technology contributes to a control room environment that can facilitate the acquisition of all the information needed for operation. Thus, although individual performance in the advanced MCR can be further improved, there is a limit to the improvement that can be expected in team performance. Team performance depends on the organic coherence of the team as a whole rather than on the knowledge and skill of an individual operator. Moreover, good team performance improves communication between and within teams in an efficient manner, and can thus be conducive to addressing unsafe conditions. Accordingly, it is important to develop a methodology for the evaluation of operators' teamwork or collaboration, thus enhancing operational performance at the MCR of nuclear power plants. The objectives of this research are twofold: to develop a systematic methodology for evaluation of the team performance of MCR operators in consideration of advanced MCR characteristics, and to validate that the methodology is adaptable to the advanced MCR of nuclear power plants. In order to achieve these two objectives, first, team performance factors were extracted through literature reviews and methodological study concerning team performance theories. Second, the team performance factors were identified and

  9. Development and validation of deterioration models for concrete bridge decks - phase 2 : mechanics-based degradation models.

    Science.gov (United States)

    2013-06-01

    This report summarizes a research project aimed at developing degradation models for bridge decks in the state of Michigan based on durability mechanics. A probabilistic framework to implement local-level mechanistic-based models for predicting the c...

  10. Design, development and validation of software for modelling dietary exposure to food chemicals and nutrients.

    Science.gov (United States)

    McNamara, C; Naddy, B; Rohan, D; Sexton, J

    2003-10-01

    The Monte Carlo computational system for stochastic modelling of dietary exposure to food chemicals and nutrients is presented. This system was developed through a European Commission-funded research project. It is accessible as a Web-based application service. The system supports very significant complexity in the data sets used as the model input, but provides a simple, general-purpose, linear kernel for model evaluation. Specific features of the system include the ability to enter (arbitrarily) complex mathematical or probabilistic expressions at each and every input data field, automatic bootstrapping on subjects and on subject food intake diaries, and custom kernels to apply brand information such as market share and loyalty to the calculation of food and chemical intake.
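
A minimal sketch of the kind of stochastic exposure kernel described (the real system's expression language, bootstrapping, and brand-share kernels are far richer); the food list and distributions below are hypothetical:

```python
import random

def simulate_exposure(foods, n_iter=10000, seed=1):
    """Monte Carlo dietary exposure: for each iteration, draw an intake and a
    concentration for every food and sum intake * concentration."""
    rng = random.Random(seed)
    exposures = []
    for _ in range(n_iter):
        total = 0.0
        for intake_dist, conc_dist in foods:
            total += intake_dist(rng) * conc_dist(rng)  # g/day * mg/g = mg/day
        exposures.append(total)
    return exposures

# One hypothetical food: lognormal daily intake, uniform residue concentration.
foods = [
    (lambda r: r.lognormvariate(4.0, 0.5),  # intake, g/day
     lambda r: r.uniform(0.001, 0.003)),    # concentration, mg/g
]
exposures = simulate_exposure(foods)
p95 = sorted(exposures)[int(0.95 * len(exposures))]  # upper-tail exposure
```

The high-percentile exposure (`p95`) is typically the quantity of regulatory interest in such assessments.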

  11. Development and validation of the shutdown cooling system CATHENA model for Gentilly-2

    International Nuclear Information System (INIS)

    Lecuyer, H.; Hasnaoui, C.; Sabourin, G.; Chapados, S.

    2008-01-01

    A CATHENA representation of the Gentilly-2 Shutdown Cooling system has been developed for Hydro-Quebec. The model includes the SDCS circuit piping, valves, pumps and heat exchangers. The model is integrated in the G2 CATHENA overall plant model and coupled with the plant control software simulator TROLG2 to allow the simulation of various plant operational modes using the SDCS. Results have been obtained for normal cooling of the primary heat transport system following a planned shut down (transition from full power to shutdown) and for two special SDCS configurations that were used on September 14 and 15, 2006 at Gentilly-2. The results show close match with values measured at Gentilly-2 during either steady or transient states. (author)

  12. Development and validation of the shutdown cooling system CATHENA model for Gentilly-2

    Energy Technology Data Exchange (ETDEWEB)

    Lecuyer, H.; Hasnaoui, C. [Nucleonex Inc., Westmount, Quebec (Canada); Sabourin, G. [Atomic Energy of Canada Limited, Montreal, Quebec (Canada); Chapados, S. [Hydro-Quebec, Unite Analyse et Fiabilite, Montreal, Quebec (Canada)

    2008-07-01

    A CATHENA representation of the Gentilly-2 Shutdown Cooling system has been developed for Hydro-Quebec. The model includes the SDCS circuit piping, valves, pumps and heat exchangers. The model is integrated in the G2 CATHENA overall plant model and coupled with the plant control software simulator TROLG2 to allow the simulation of various plant operational modes using the SDCS. Results have been obtained for normal cooling of the primary heat transport system following a planned shut down (transition from full power to shutdown) and for two special SDCS configurations that were used on September 14 and 15, 2006 at Gentilly-2. The results show close match with values measured at Gentilly-2 during either steady or transient states. (author)

  13. Potassium titanyl phosphate laser tissue ablation: development and experimental validation of a new numerical model.

    Science.gov (United States)

    Elkhalil, Hossam; Akkin, Taner; Pearce, John; Bischof, John

    2012-10-01

    The photoselective vaporization of the prostate (PVP) green-light (532 nm) laser is increasingly being used as an alternative to the transurethral resection of the prostate (TURP) for treatment of benign prostatic hyperplasia (BPH) in older patients and those who are poor surgical candidates. In order to achieve the goals of increased tissue removal volume (i.e., "ablation" in the engineering sense) and reduced collateral thermal damage during PVP green-light treatment, a two-dimensional computational model for laser tissue ablation based on available parameters in the literature has been developed and compared to experiments. The model is based on the control volume finite difference and the enthalpy method with a mechanistically defined energy necessary to ablate (i.e., physically remove) a volume of tissue (i.e., energy of ablation, E_ab). The model was able to capture the general trends experimentally observed in terms of ablation and coagulation areas, their ratio (therapeutic index (TI)), and the ablation rate (AR) (mm³/s). The model and experiment were in good agreement at a smaller working distance (WD) (distance from the tissue in mm) and a larger scanning speed (SS) (laser scan speed in mm/s). However, the model and experiment deviated somewhat at a larger WD and a smaller SS; this is most likely due to optical shielding and heat diffusion in the laser scanning direction, which are neglected in the model. This model is a useful first step in the mechanistic prediction of PVP-based BPH laser tissue ablation. Future modeling efforts should focus on optical shielding, heat diffusion in the laser scanning direction (i.e., including 3D effects), convective heat losses at the tissue boundary, and the dynamic optical, thermal, and coagulation properties of BPH tissue.

  14. Development and validation of a prediction model for tube feeding dependence after curative (chemo)radiation in head and neck cancer.

    Directory of Open Access Journals (Sweden)

    Kim Wopken

    Full Text Available BACKGROUND: Curative radiotherapy or chemoradiation for head and neck cancer (HNC) may result in severe acute and late side effects, including tube feeding dependence. The purpose of this prospective cohort study was to develop a prediction model for tube feeding dependence 6 months (TUBEM6) after curative (chemo)radiotherapy in HNC patients. PATIENTS AND METHODS: Tube feeding dependence was scored prospectively. To develop the multivariable model, a group LASSO analysis was carried out, with TUBEM6 as the primary endpoint (n = 427). The model was then validated in a test cohort (n = 183). The training cohort was divided into three groups based on the risk of TUBEM6 to test whether the model could be extrapolated to later time points (12, 18 and 24 months). RESULTS: Most important predictors for TUBEM6 were weight loss prior to treatment, advanced T-stage, positive N-stage, bilateral neck irradiation, accelerated radiotherapy and chemoradiation. Model performance was good, with an Area under the Curve of 0.86 in the training cohort and 0.82 in the test cohort. The TUBEM6-based risk groups were significantly associated with tube feeding dependence at later time points (p<0.001). CONCLUSION: We established an externally validated predictive model for tube feeding dependence after curative radiotherapy or chemoradiation, which can be used to predict TUBEM6.

  15. Development and validation of an extensive growth and growth boundary model for psychrotolerant Lactobacillus spp. in seafood and meat products

    DEFF Research Database (Denmark)

    Mejlholm, Ole; Dalgaard, Paw

    2013-01-01

    A new and extensive growth and growth boundary model for psychrotolerant Lactobacillus spp. was developed and validated for processed and unprocessed products of seafood and meat. The new model was developed by refitting and expanding an existing cardinal parameter model for the growth and growth boundary of lactic acid bacteria (LAB) in processed seafood (O. Mejlholm and P. Dalgaard, J. Food Prot. 70:2485-2497, 2007). Initially, to estimate values for the maximum specific growth rate at the reference temperature of 25°C (μref) and the theoretical minimum temperature that prevents growth ... of psychrotolerant Lactobacillus spp. was clearly demonstrated. The new model can be used to predict growth of psychrotolerant Lactobacillus spp. in seafood and meat products, e.g. prediction of the time to a critical cell concentration of bacteria is considered useful for establishing the shelf life. In addition ...

  16. Development, validation and application of multi-point kinetics model in RELAP5 for analysis of asymmetric nuclear transients

    Energy Technology Data Exchange (ETDEWEB)

    Pradhan, Santosh K., E-mail: santosh@aerb.gov.in [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India); Obaidurrahman, K. [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India); Iyer, Kannan N. [Department of Mechanical Engineering, IIT Bombay, Mumbai 400076 (India); Gaikwad, Avinash J. [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India)

    2016-04-15

    Highlights: • A multi-point kinetics model is developed for the RELAP5 system thermal hydraulics code. • The model is validated against an extensive 3D kinetics code. • The RELAP5 multi-point kinetics formulation is used to investigate the critical break for LOCA in a PHWR. - Abstract: The point kinetics approach in the system code RELAP5 limits its use for many reactivity-induced transients, which involve asymmetric core behaviour. Development of a fully coupled 3D core kinetics code with system thermal-hydraulics is the ultimate requirement in this regard; however, coupling and validation of a 3D kinetics module with a system code is cumbersome and also requires access to the source code. An intermediate approach with multi-point kinetics is appropriate and relatively easy to implement for analysis of several asymmetric transients for large cores. The multi-point kinetics formulation is based on dividing the entire core into several regions and solving ODEs describing the kinetics in each region. These regions are interconnected by spatial coupling coefficients which are estimated from a diffusion theory approximation. This model offers the advantage that the associated ordinary differential equations (ODEs) governing the multi-point kinetics formulation can be solved using numerical methods to the desired level of accuracy, and thus allows a formulation based on user-defined control variables, i.e., without disturbing the source code and hence also avoiding the associated coupling issues. Euler's method has been used in the present formulation to solve the several coupled ODEs internally at each time step. The results have been verified against the inbuilt point-kinetics model of RELAP5 and validated against the 3D kinetics code TRIKIN. The model was used to identify the critical break in the RIH of a typical large PHWR core. The neutronic asymmetry produced in the core due to the system-induced transient was effectively handled by the multi-point kinetics model, overcoming the limitation of the in-built point kinetics model.
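
The formulation described (regional kinetics ODEs joined by spatial coupling coefficients and advanced with Euler's method) can be sketched for two regions with a single delayed-neutron group; every parameter below is a generic illustrative value, not one from the paper or from RELAP5:

```python
def euler_step(state, dt, rho, beta=0.0065, lam=0.08, gen_time=1e-4, k12=5.0):
    """One Euler step of two-region point kinetics with a symmetric spatial
    coupling coefficient k12 (1/s) exchanging neutrons between the regions."""
    n1, c1, n2, c2 = state
    dn1 = (rho[0] - beta) / gen_time * n1 + lam * c1 + k12 * (n2 - n1)
    dc1 = beta / gen_time * n1 - lam * c1
    dn2 = (rho[1] - beta) / gen_time * n2 + lam * c2 + k12 * (n1 - n2)
    dc2 = beta / gen_time * n2 - lam * c2
    return (n1 + dt * dn1, c1 + dt * dc1, n2 + dt * dn2, c2 + dt * dc2)

# Start at equilibrium (C = beta*N / (gen_time*lam)), then insert +1 mk of
# reactivity in region 1 only -- an asymmetric transient.
c_eq = 0.0065 / (1e-4 * 0.08)
state = (1.0, c_eq, 1.0, c_eq)
for _ in range(20000):  # 0.2 s of transient at dt = 10 microseconds
    state = euler_step(state, dt=1e-5, rho=(0.001, 0.0))
n1, _, n2, _ = state
# Region 1 power rises most; the coupling term drags region 2 up slightly too.
```

This captures the asymmetry the abstract highlights: a single-point model would show one uniform power rise, whereas here the perturbed region responds much more strongly than its neighbour.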

  17. A human life-stage physiologically based pharmacokinetic and pharmacodynamic model for chlorpyrifos: development and validation.

    Science.gov (United States)

    Smith, Jordan Ned; Hinderliter, Paul M; Timchalk, Charles; Bartels, Michael J; Poet, Torka S

    2014-08-01

    Sensitivity to some chemicals in animals and humans is known to vary with age. Age-related changes in sensitivity to chlorpyrifos have been reported in animal models. A life-stage physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) model was developed to predict disposition of chlorpyrifos and its metabolites, chlorpyrifos-oxon (the ultimate toxicant) and 3,5,6-trichloro-2-pyridinol (TCPy), as well as B-esterase inhibition by chlorpyrifos-oxon in humans. In this model, previously measured age-dependent metabolism of chlorpyrifos and chlorpyrifos-oxon were integrated into age-related descriptions of human anatomy and physiology. The life-stage PBPK/PD model was calibrated and tested against controlled adult human exposure studies. Simulations suggest age-dependent pharmacokinetics and response may exist. At oral doses ≥0.6 mg/kg of chlorpyrifos (100- to 1000-fold higher than environmental exposure levels), 6-month-old children are predicted to have higher levels of chlorpyrifos-oxon in blood and higher levels of red blood cell cholinesterase inhibition compared to adults at equivalent doses. At lower doses more relevant to environmental exposures, simulations predict that adults will have slightly higher levels of chlorpyrifos-oxon in blood and greater cholinesterase inhibition. This model provides a computational framework for age-comparative simulations that can be utilized to predict chlorpyrifos disposition and biological response over various postnatal life stages. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. A Human Life-Stage Physiologically Based Pharmacokinetic and Pharmacodynamic Model for Chlorpyrifos: Development and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Jordan N.; Hinderliter, Paul M.; Timchalk, Charles; Bartels, M. J.; Poet, Torka S.

    2014-08-01

    Sensitivity to chemicals in animals and humans is known to vary with age. Age-related changes in sensitivity to chlorpyrifos have been reported in animal models. A life-stage physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) model was developed to computationally predict disposition of CPF and its metabolites, chlorpyrifos-oxon (the ultimate toxicant) and 3,5,6-trichloro-2-pyridinol (TCPy), as well as B-esterase inhibition by chlorpyrifos-oxon in humans. In this model, age-dependent body weight was calculated from a generalized Gompertz function, and compartments (liver, brain, fat, blood, diaphragm, rapid, and slow) were scaled based on body weight from polynomial functions on a fractional body weight basis. Blood flows among compartments were calculated as a constant flow per compartment volume. The life-stage PBPK/PD model was calibrated and tested against controlled adult human exposure studies. Model simulations suggest age-dependent pharmacokinetics and response may exist. At oral doses ≥0.55 mg/kg of chlorpyrifos (significantly higher than environmental exposure levels), 6-month-old children are predicted to have higher levels of chlorpyrifos-oxon in blood and higher levels of red blood cell cholinesterase inhibition compared to adults from equivalent oral doses of chlorpyrifos. At lower doses that are more relevant to environmental exposures, the model predicts that adults will have slightly higher levels of chlorpyrifos-oxon in blood and greater cholinesterase inhibition. This model provides a computational framework for age-comparative simulations that can be utilized to predict CPF disposition and biological response over various postnatal life-stages.
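
The body-weight scaling described above can be sketched as follows; the Gompertz constants and the liver fraction here are illustrative assumptions, not the model's calibrated values:

```python
import math

def body_weight_kg(age_years, w_max=75.0, b=2.2, c=0.12):
    """Gompertz-type growth curve: weight approaches the asymptote w_max
    with displacement b and rate constant c (illustrative constants)."""
    return w_max * math.exp(-b * math.exp(-c * age_years))

def liver_volume_l(age_years, fraction=0.026):
    """A compartment scaled on a fractional-body-weight basis
    (2.6% of body weight and ~1 kg/L density are assumed here)."""
    return fraction * body_weight_kg(age_years)

body_weight_kg(0.5)   # infant weight, kg
body_weight_kg(30.0)  # adult weight, approaching w_max
```

The real model uses age-specific polynomial fractions per compartment; a single constant fraction is used here only to show the scaling idea.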

  19. Biotrickling filter modeling for styrene abatement. Part 1: Model development, calibration and validation on an industrial scale.

    Science.gov (United States)

    San-Valero, Pau; Dorado, Antonio D; Martínez-Soria, Vicente; Gabaldón, Carmen

    2018-01-01

    A three-phase dynamic mathematical model of biotrickling filtration was calibrated and validated for the simulation of an industrial styrene-degrading biotrickling filter. Model equations were based on mass balances describing the main processes in biotrickling filtration: convection, mass transfer, diffusion, and biodegradation. The model considered the key features of industrial operation of biotrickling filters, namely variable loading conditions and intermittent irrigation, by switching between mathematical descriptions of periods with and without irrigation. The model was calibrated with steady-state data from a laboratory biotrickling filter treating inlet loads of 13-74 g C m⁻³ h⁻¹ at empty-bed residence times of 30-15 s. The model predicted the dynamic emission at the outlet of the biotrickling filter, simulating the small concentration peaks occurring during irrigation. The model was validated using data from a pilot on-site biotrickling filter treating styrene installed in a fiber-reinforced facility. The model predicted the performance of the biotrickling filter working under highly oscillating emissions at inlet loads of 5-23 g C m⁻³ h⁻¹ and an empty-bed residence time of 31 s for more than 50 days, with a goodness of fit of 0.84. Copyright © 2017 Elsevier Ltd. All rights reserved.
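
The operating quantities quoted above are linked by two small relations: the empty-bed residence time is EBRT = V/Q, so the inlet load is IL = C_in/EBRT. A sketch with illustrative numbers (not the paper's data):

```python
def inlet_load(c_in_g_per_m3, ebrt_s):
    """Inlet load (g C m^-3 h^-1) from inlet concentration and EBRT."""
    return c_in_g_per_m3 / (ebrt_s / 3600.0)

def removal_efficiency_pct(c_in, c_out):
    """Removal efficiency (%) across the biotrickling filter."""
    return 100.0 * (c_in - c_out) / c_in

inlet_load(0.2, 30)                # 0.2 g C/m3 at 30 s EBRT -> 24 g C m^-3 h^-1
removal_efficiency_pct(0.2, 0.03)  # -> 85 %
```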

  20. Development and validation of a predictive risk model for all-cause mortality in type 2 diabetes.

    Science.gov (United States)

    Robinson, Tom E; Elley, C Raina; Kenealy, Tim; Drury, Paul L

    2015-06-01

    Type 2 diabetes is common and is associated with an approximate 80% increase in the rate of mortality. Management decisions may be assisted by an estimate of the patient's absolute risk of adverse outcomes, including death. This study aimed to derive a predictive risk model for all-cause mortality in type 2 diabetes. We used primary care data from a large national multi-ethnic cohort of patients with type 2 diabetes in New Zealand and linked mortality records to develop a predictive risk model for 5-year risk of mortality. We then validated this model using information from a separate cohort of patients with type 2 diabetes. 26,864 people were included in the development cohort with a median follow up time of 9.1 years. We developed three models initially using demographic information and then progressively more clinical detail. The final model, which also included markers of renal disease, proved to give best prediction of all-cause mortality with a C-statistic of 0.80 in the development cohort and 0.79 in the validation cohort (7610 people) and was well calibrated. Ethnicity was a major factor with hazard ratios of 1.37 for indigenous Maori, 0.41 for East Asian and 0.55 for Indo Asian compared with European (P<0.001). We have developed a model using information usually available in primary care that provides good assessment of patient's risk of death. Results are similar to models previously published from smaller cohorts in other countries and apply to a wider range of patient ethnic groups. Copyright © 2015. Published by Elsevier Ireland Ltd.

  1. Development and validation of a new turbocharger simulation methodology for marine two stroke diesel engine modelling and diagnostic applications

    International Nuclear Information System (INIS)

    Sakellaridis, Nikolaos F.; Raptotasios, Spyridon I.; Antonopoulos, Antonis K.; Mavropoulos, Georgios C.; Hountalas, Dimitrios T.

    2015-01-01

    Engine cycle simulation models are increasingly used in diesel engine simulation and diagnostic applications, reducing experimental effort. Turbocharger simulation plays an important role in a model's ability to accurately predict engine performance and emissions. The present work describes the development of a complete engine simulation model for marine diesel engines based on a new methodology for turbocharger modelling utilizing physically based meanline models for the compressor and turbine. Simulation accuracy is evaluated against engine bench measurements. The methodology was developed to overcome the problem of limited availability of experimental maps for the compressor and turbine, often encountered in large marine diesel engine simulation and diagnostic studies. Data from the engine bench are used to calibrate the models, as well as to estimate turbocharger shaft mechanical efficiency. The closed cycle and gas exchange are modelled using an existing multizone thermodynamic model. The proposed methodology is applied to a 2-stroke marine diesel engine and its evaluation is based on the comparison of predictions against measured engine data. The model's ability to predict engine response to load variation is demonstrated for both turbocharger performance and closed cycle parameters, as well as NOx emission trends, making it an effective tool for both engine diagnostic and optimization studies. - Highlights: • Marine two stroke diesel engine simulation model. • Turbine and compressor simulation using physical meanline models. • Methodology to derive T/C component efficiency and T/C shaft mechanical efficiency. • Extensive validation of predictions against experimental data.

  2. Development and validation of a computational model of the knee joint for the evaluation of surgical treatments for osteoarthritis.

    Science.gov (United States)

    Mootanah, R; Imhauser, C W; Reisse, F; Carpanen, D; Walker, R W; Koff, M F; Lenhoff, M W; Rozbruch, S R; Fragomen, A T; Dewan, Z; Kirane, Y M; Cheah, K; Dowell, J K; Hillstrom, H J

    2014-01-01

    A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full-extension to 65°-flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated against measured intra-articular forces and pressures. The percent full-scale error between FE-predicted and in vitro-measured values in the medial and lateral compartments was 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressure and forces for different loading conditions and could be further developed for subject-specific surgical planning.

  3. Development and validation of a prediction model for measurement variability of lung nodule volumetry in patients with pulmonary metastases.

    Science.gov (United States)

    Hwang, Eui Jin; Goo, Jin Mo; Kim, Jihye; Park, Sang Joon; Ahn, Soyeon; Park, Chang Min; Shin, Yeong-Gil

    2017-08-01

    To develop a prediction model for the variability range of lung nodule volumetry and validate the model in detecting nodule growth. For model development, 50 patients with metastatic nodules were prospectively included. Two consecutive CT scans were performed to assess volumetry for 1,586 nodules. Nodule volume, surface voxel proportion (SVP), attachment proportion (AP) and absolute percentage error (APE) were calculated for each nodule, and quantile regression analyses were performed to model the 95th percentile of APE. For validation, 41 patients who underwent metastasectomy were included. After volumetry of resected nodules, sensitivity and specificity for the diagnosis of metastatic nodules were compared between two different thresholds for nodule growth determination: a uniform 25% volume change threshold and an individualized threshold calculated from the model (estimated 95th percentile APE). SVP and AP were included in the final model: estimated 95th percentile APE = 37.82 · SVP + 48.60 · AP − 10.87. In the validation session, the individualized threshold showed significantly higher sensitivity for the diagnosis of metastatic nodules than the uniform 25% threshold (75.0% vs. 66.0%, P = 0.004). CONCLUSION: Estimated 95th percentile APE as an individualized threshold of nodule growth showed greater sensitivity in diagnosing metastatic nodules than a global 25% threshold. • The 95th percentile APE of a particular nodule can be predicted. • Estimated 95th percentile APE can be utilized as an individualized threshold. • A more sensitive diagnosis of metastasis can be made with an individualized threshold. • Tailored nodule management can be provided during nodule growth follow-up.
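The reported regression equation lends itself to a direct implementation of the individualized growth threshold. A sketch using the published coefficients; the function names, the example nodule, and the assumption that SVP and AP are entered as fractions are illustrative, not from the paper:

```python
def estimated_95th_percentile_ape(svp, ap):
    """Individualized volumetry-variability threshold (%) from the
    reported quantile-regression model: 37.82*SVP + 48.60*AP - 10.87."""
    return 37.82 * svp + 48.60 * ap - 10.87

def nodule_grew(v1, v2, svp, ap):
    """Flag growth when the volume change exceeds the nodule's own
    estimated measurement variability rather than a uniform 25%."""
    change_pct = abs(v2 - v1) / v1 * 100.0
    return change_pct > estimated_95th_percentile_ape(svp, ap)

# Hypothetical nodule: 120 -> 150 mm^3 (25% change), SVP and AP assumed
print(nodule_grew(v1=120.0, v2=150.0, svp=0.55, ap=0.10))
```

For this hypothetical nodule the individualized threshold comes out below 25%, so a change that the uniform rule would call borderline is flagged as growth.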

  4. Development and validation of a thermodynamic model for the performance analysis of a gamma Stirling engine prototype

    International Nuclear Information System (INIS)

    Araoz, Joseph A.; Cardozo, Evelyn; Salomon, Marianne; Alejo, Lucio; Fransson, Torsten H.

    2015-01-01

    This work presents the development and validation of a numerical model that represents the performance of a gamma Stirling engine prototype. The model follows a modular approach considering ideal adiabatic working spaces; limited internal and external heat transfer through the heat exchangers; and mechanical and thermal losses during the cycle. In addition, it includes the calculation of the mechanical efficiency, taking into account the crank mechanism effectiveness and the forced work during the cycle. Consequently, the model aims to predict the work that can be effectively taken from the shaft. The model was compared with experimental data obtained in an experimental rig built for the engine prototype. The results showed an acceptable degree of accuracy when compared with the experimental data, with errors ranging from ±1% to ±8% for the temperature on the heater side, less than ±1% error for the cooler temperatures, and ±1% to ±8% for the brake power calculations. The model therefore proved adequate for studying the prototype's performance. In addition, the results of the simulation reflected the limited performance obtained during the prototype experiments, and a first analysis attributed this to the forced work during the cycle. The implemented model is the basis for a subsequent parametric analysis that will complement the results presented. - Highlights: • A numerical model for a Stirling engine was developed. • A mechanical efficiency analysis was included in the model. • The model was validated with experimental data of a novel prototype. • The model results permit a deeper insight into the engine operation.

  5. Development and validation of a dynamic outcome prediction model for paracetamol-induced acute liver failure

    DEFF Research Database (Denmark)

    Bernal, William; Wang, Yanzhong; Maggs, James

    2016-01-01

    BACKGROUND: Early, accurate prediction of survival is central to management of patients with paracetamol-induced acute liver failure to identify those needing emergency liver transplantation. Current prognostic tools are confounded by recent improvements in outcome independent of emergency liver...... transplantation, and constrained by static binary outcome prediction. We aimed to develop a simple prognostic tool to reflect current outcomes and generate a dynamic updated estimation of risk of death. METHODS: Patients with paracetamol-induced acute liver failure managed at intensive care units in the UK...... INTERPRETATION: The models developed here show very good discrimination and calibration, confirmed in independent datasets, and suggest that many patients undergoing transplantation based on existing criteria might have survived with medical management alone. The role and indications for emergency liver transplantation......

  6. Development and validation of a prediction model for long-term sickness absence based on occupational health survey variables.

    Science.gov (United States)

    Roelen, Corné; Thorsen, Sannie; Heymans, Martijn; Twisk, Jos; Bültmann, Ute; Bjørner, Jakob

    2018-01-01

    The purpose of this study is to develop and validate a prediction model for identifying employees at increased risk of long-term sickness absence (LTSA), using variables commonly measured in occupational health surveys. Based on the literature, 15 predictor variables were retrieved from the Danish National Working Environment Survey (DANES) and included in a model predicting incident LTSA (≥4 consecutive weeks) during 1-year follow-up in a sample of 4000 DANES participants. The 15-predictor model was reduced by backward stepwise statistical techniques and then validated in a sample of 2524 DANES participants not included in the development sample. Identification of employees at increased LTSA risk was investigated by receiver operating characteristic (ROC) analysis; the area under the ROC curve (AUC) reflected discrimination between employees with and without LTSA during follow-up. The 15-predictor model was reduced to a 9-predictor model including age, gender, education, self-rated health, mental health, prior LTSA, work ability, emotional job demands, and recognition by the management. Discrimination by the 9-predictor model was significant (AUC = 0.68; 95% CI 0.61-0.76), but not practically useful. A prediction model based on occupational health survey variables identified employees with an increased LTSA risk, but should be further developed into a practically useful tool to predict the risk of LTSA in the general working population. Implications for rehabilitation: Long-term sickness absence risk predictions would enable healthcare providers to refer high-risk employees to rehabilitation programs aimed at preventing or reducing work disability. A prediction model based on health survey variables discriminates between employees at high and low risk of long-term sickness absence, but discrimination was not practically useful. Health survey variables provide insufficient information to determine long-term sickness absence risk profiles. There is a need for
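The workflow described (fit a multi-predictor logistic model on a development sample, then judge discrimination on a held-out sample by the area under the ROC curve) can be sketched as follows. The data are synthetic and the nine predictors are anonymous stand-ins, not the DANES variables:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def make_sample(n):
    """Synthetic cohort: 9 predictors, outcome driven by two of them."""
    X = rng.normal(size=(n, 9))
    logit = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 1.8   # low base rate, as for LTSA
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))
    return X, y

X_dev, y_dev = make_sample(4000)      # development-sample size, as in the study
model = LogisticRegression().fit(X_dev, y_dev)

X_val, y_val = make_sample(2524)      # validation-sample size, as in the study
auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"validation AUC = {auc:.2f}")  # the study itself reported AUC = 0.68
```

The backward stepwise reduction step is omitted here; on real data it would drop predictors whose removal does not worsen model fit.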

  7. Development of a FBR fuel pin bundle deformation analysis code 'BAMBOO' . Development of a dispersion model and its validation

    International Nuclear Information System (INIS)

    Uwaba, Tomoyuki; Ukai, Shigeharu; Asaga, Takeo

    2002-03-01

    Bundle Duct Interaction (BDI) is one of the life-limiting factors of an FBR fuel subassembly. Under BDI conditions, fuel pin dispersion occurs mainly through deviation of the wire position due to irradiation. In this study the effect of dispersion on bundle deformation was evaluated using the BAMBOO code, and the following results were obtained. (1) A new contact analysis model was introduced in the BAMBOO code. This model considers the contact condition at axial positions other than the nodal points of the beam elements that compose the fuel pin. This improvement made it possible for the bundle deformation analysis to represent fuel pin dispersion caused by deviations of the wire position. (2) The model was validated against the results of an out-of-pile compression test with wire deviation. The pin-to-duct and pin-to-pin clearances calculated with the dispersion model largely agreed with the test results, confirming that the BAMBOO code reasonably predicts bundle deformation with dispersion. (3) In the dispersed bundle the pin-to-pin clearances were widely scattered, and the minimum pin-to-duct clearance increased or decreased, depending on the dispersion condition, compared to the no-dispersion bundle. This result suggests that considerable dispersion could affect the thermal integrity of the bundle. (author)

  8. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  9. Development and validation of a model TRIGA Mark III reactor with code MCNP5

    International Nuclear Information System (INIS)

    Galicia A, J.; Francois L, J. L.; Aguilar H, F.

    2015-09-01

    The main purpose of this paper is to obtain a model of the TRIGA Mark III reactor core that accurately represents the real operating conditions at 1 MWth, using the Monte Carlo code MCNP5. To provide a more detailed analysis, different models of the reactor core were realized by simulating the control rods both extracted and inserted in cold conditions (293 K), including an analysis of the shutdown margin, so that the Operation Technical Specifications were satisfied. The positions the control rods must have to reach a power of 1 MWth were obtained from the practice entitled Operation in Manual Mode performed at the Instituto Nacional de Investigaciones Nucleares (ININ). Later, the behaviour of the k-eff was analyzed considering different temperatures in the fuel elements, subsequently obtaining the values that best represent actual reactor operation. Finally, calculations with the developed model to obtain the distribution of the average flux of thermal, epithermal and fast neutrons in the six new experimental facilities are presented. (Author)
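As a side note on the quantities involved: reactivity and shutdown margin are simple functions of the effective multiplication factor k-eff that a code like MCNP5 reports. A minimal sketch of the conversion; the k-eff values below are illustrative, not results from this model:

```python
def reactivity(k_eff):
    """Reactivity rho = (k_eff - 1) / k_eff (dimensionless; x 1e5 for pcm)."""
    return (k_eff - 1.0) / k_eff

# Illustrative k_eff values for two rod configurations (not from the paper)
k_all_rods_out = 1.045      # supercritical: excess reactivity
k_all_rods_in = 0.951       # subcritical: shutdown configuration

rho_excess = reactivity(k_all_rods_out) * 1e5     # pcm, positive
rho_shutdown = reactivity(k_all_rods_in) * 1e5    # pcm, negative
print(f"excess reactivity   = {rho_excess:+.0f} pcm")
print(f"shutdown reactivity = {rho_shutdown:+.0f} pcm")
```

A shutdown-margin check of the kind required by Operation Technical Specifications compares the magnitude of the negative shutdown reactivity against the specified minimum margin.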

  10. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields.

    Science.gov (United States)

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin; Strawn, Laura K

    2016-02-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
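The odds ratios quoted (OR 3.0 for proximity to water, OR 2.9 for proximity to pasture) come from logistic regression; with a single binary exposure the same quantity can be read directly off a 2×2 contingency table. A sketch with made-up swab counts, not the study's data:

```python
def odds_ratio(exposed_pos, exposed_neg, unexposed_pos, unexposed_neg):
    """Cross-product odds ratio for a 2x2 exposure/outcome table."""
    return (exposed_pos * unexposed_neg) / (exposed_neg * unexposed_pos)

# Hypothetical counts: swabs collected near vs. far from water,
# with L. monocytogenes isolated (pos) or not (neg)
or_water = odds_ratio(exposed_pos=60, exposed_neg=200,
                      unexposed_pos=40, unexposed_neg=400)
print(f"OR (near water) = {or_water:.1f}")  # → 3.0
```

The counts were chosen so the toy table reproduces the study's reported OR of 3.0; the real estimate additionally came with a confidence interval from the regression model.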

  11. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields

    Science.gov (United States)

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin

    2015-01-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. PMID:26590280

  12. Development and validation of rear impact computer simulation model of an adult manual transit wheelchair with a seated occupant.

    Science.gov (United States)

    Salipur, Zdravko; Bertocci, Gina

    2010-01-01

    It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus protection for the wheelchair occupant. Thus far, only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economical and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions. (c) 2009 IPEM. Published by Elsevier Ltd. All rights reserved.

  13. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model; furthermore, it can be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated. Sensitivity analysis: this can be made with DSA, differential sensitivity analysis, and with MCSA, Monte-Carlo sensitivity analysis. Finding the optimal domains of the input parameters: a procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis: this analysis is made in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model for buildings, Esp., is presented, studying the behaviour of building components in a Test Cell of LECE of CIEMAT. (Author) 17 refs
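The Monte-Carlo sensitivity analysis (MCSA) step can be sketched in a few lines: sample the input parameters, run the model, and rank inputs by how strongly they correlate with the output. The toy model and parameter ranges below are assumptions for illustration, not the thermal building model of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(p0, p1, p2):
    """Stand-in simulation model: output driven mostly by p0."""
    return 3.0 * p0 + 0.5 * p1 ** 2 + 0.1 * p2

# MCSA: sample the input space, evaluate the model at every sample
samples = rng.uniform(-1.0, 1.0, size=(5000, 3))
outputs = model(samples[:, 0], samples[:, 1], samples[:, 2])

# Rank inputs by (linear) correlation with the output
corrs = [np.corrcoef(samples[:, i], outputs)[0, 1] for i in range(3)]
for i, r in enumerate(corrs):
    print(f"input p{i}: correlation with output = {r:+.2f}")
```

Note the limitation this toy exposes: p1 enters the model quadratically, so its linear correlation is near zero even though it does influence the output; differential sensitivity analysis (DSA) or variance-based measures would catch it.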

  14. Development and validation of a strategic repositioning model for defense and aerospace contractors

    Science.gov (United States)

    Bers, John A.

    Strategic repositioning refers to the organized efforts of defense contractors to "reposition" a technology that they have developed for a defense sector customer into a civilian or commercial market. The strategic repositioning model developed here is a structural model: it seeks to isolate the factors that influence choice of strategy, which in turn influences the organization's performance. The model draws from the prior experience of contractors (through interviews and surveys) and companies in other sectors (through a review of the relevant published research). (1) Overall, the model accounted for 55% of the variance in financial performance of the sample and 35% for the underlying population. (2) Key success factors include a rigorous planning process, a target market in the growth (vs. incubation) stage, a priority on market leadership as well as financial return, the ability to operate in an ambiguous business environment, and a relatively short time horizon but strong corporate support. (3) The greatest challenges that a contractor is likely to encounter are understanding his new customers' buying practices, strong competition, and adapting his technology to their needs and price expectations. (4) To address these issues, contractors often involve partners in their entry strategy, but partnerships of equals tend to be more difficult to bring off than direct entry strategies. (5) The two major target market categories--government and commercial--present different challenges. Commercial customers are more likely to resist doing business with the contractor, while contractors entering government and other noncommercial markets are more likely to encounter price resistance, low technical sophistication among customers, and difficulties reaching their customer base. (6) Despite these differences across markets, performance is not influenced by the target market category, nor by the type of product or service or the contractor's functional orientation (marketing

  15. Using Rasch models to develop and validate an environmental thinking learning progression

    Science.gov (United States)

    Hashimoto-Martell, Erin A.

    Environmental understanding is highly relevant in today's global society. Social, economic, and political structures are connected to the state of environmental degradation and exploitation, and disproportionately affect those in poor or urban communities (Brulle & Pellow, 2006; Executive Order No. 12898, 1994). Environmental education must challenge the way we live, and our social and ecological quality of life, with the goal of responsible action. The development of a learning progression in environmental thinking, along with a corresponding assessment, could provide a tool that could be used across environmental education programs to help evaluate and guide programmatic decisions. This study sought to determine if a scale could be constructed that allowed individuals to be ordered along a continuum of environmental thinking. First, I developed the Environmental Thinking Learning Progression, a scale of environmental thinking from novice to advanced, based on the current available research and literature. The scale consisted of four subscales, each measuring a different aspect of environmental thinking: place consciousness, human connection, agency, and science concepts. Second, a measurement instrument was developed, so that the data appropriately fit the model using Rasch analysis. A Rasch analysis of the data placed respondents along a continuum, given the range of item difficulty for each subscale. Across three iterations of instrument revision and data collection, findings indicated that the items were ordered in a hierarchical way that corresponded to the construct of environmental thinking. Comparisons between groups showed that the average score of respondents who had participated in environmental education programs was significantly higher than those who had not. A comparison between males and females showed no significant difference in average measure, however, there were varied significant differences between how racial/ethnic groups performed. Overall
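For reference, the dichotomous Rasch model underlying this kind of analysis gives the probability of endorsing an item as a logistic function of the gap between person ability (theta) and item difficulty (b). A minimal sketch; the theta and b values are illustrative, not estimates from this study:

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: P(X=1) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A respondent whose measure exactly matches the item difficulty
print(rasch_probability(theta=0.0, b=0.0))            # → 0.5
# An advanced respondent facing an easy item
print(round(rasch_probability(theta=2.0, b=-1.0), 3))
```

Because the probability depends only on theta - b, persons and items share one logit scale, which is what lets the analysis order respondents along a continuum.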

  16. Development and Validation of a Rule-Based Strength Scaling Method for Musculoskeletal Modelling

    DEFF Research Database (Denmark)

    Oomen, Pieter; Annegarn, Janneke; Rasmussen, John

    2015-01-01

    performed maximal isometric knee extensions. A multiple linear regression analysis (MLR) resulted in an empirical strength scaling equation, accounting for age, mass, height, gender, segment masses and segment lengths. For validation purposes, 20 newly included healthy subjects performed a maximal isometric...
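The regression step can be sketched with an ordinary least-squares fit. The predictor set below mirrors part of the abstract (age, mass, height, gender), but the data and the recovered coefficients are synthetic assumptions, not the published scaling equation:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60                                   # synthetic "subjects"
age = rng.uniform(20, 70, n)             # years
mass = rng.uniform(55, 100, n)           # kg
height = rng.uniform(1.55, 1.95, n)      # m
gender = rng.integers(0, 2, n)           # 0 = female, 1 = male

# Synthetic maximal isometric strength generated with known coefficients
strength = (250 - 1.2 * age + 2.0 * mass + 60 * height + 80 * gender
            + rng.normal(0, 10, n))      # Nm, plus measurement noise

# Multiple linear regression via least squares (intercept in first column)
X = np.column_stack([np.ones(n), age, mass, height, gender])
coef, *_ = np.linalg.lstsq(X, strength, rcond=None)
print("fitted coefficients:", np.round(coef, 1))
```

The fitted coefficients recover the generating values to within the noise, which is the property such an empirical scaling equation relies on when applied to new subjects.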

  17. Novel intrinsic-based submodel for char particle gasification in entrained-flow gasifiers: Model development, validation and illustration

    International Nuclear Information System (INIS)

    Schulze, S.; Richter, A.; Vascellari, M.; Gupta, A.; Meyer, B.; Nikrityuk, P.A.

    2016-01-01

    Highlights: • Model resolving intra-particle species transport for char conversion was formulated. • TGA experiments of char particle conversion in gas flow were conducted. • The experimental results for char conversion validated the model. • CFD simulations of an endothermic reactor with the developed model were carried out. - Abstract: The final carbon conversion rate is of critical importance for the efficiency of gasifiers. Comprehensive modeling of char particle conversion is therefore of primary interest when designing new gasifiers. This work presents a novel intrinsic-based submodel for the gasification of a char particle moving in a hot flue gas environment, considering CO2 and H2O as inlet species. The first part of the manuscript describes the model and its derivation. Validations against experiments carried out in this work for German lignite char are reported in the second part. The comparison between submodel predictions and experimental data shows good agreement. The importance of char porosity change during gasification is demonstrated. The third part presents the results of CFD simulations using the new submodel and a surface-based submodel for a generic endothermic gasifier. The focus of the CFD simulations is to demonstrate the crucial role of intrinsic-based heterogeneous reactions in adequately predicting carbon conversion rates.
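The distinction between intrinsic and surface-based kinetics hinges on intra-particle diffusion limitation, classically summarized by the Thiele modulus and effectiveness factor. A sketch for a first-order reaction in slab geometry; this is textbook reaction engineering, not the paper's submodel, and the parameter values are illustrative:

```python
import math

def effectiveness_factor(k, d_eff, half_thickness):
    """First-order reaction in a slab: phi = L * sqrt(k / D_eff),
    eta = tanh(phi) / phi (fraction of the intrinsic rate achieved)."""
    phi = half_thickness * math.sqrt(k / d_eff)
    return math.tanh(phi) / phi

# Illustrative values: rate constant [1/s], effective diffusivity [m^2/s]
eta = effectiveness_factor(k=50.0, d_eff=1e-6, half_thickness=1e-3)
print(f"effectiveness factor = {eta:.2f}")
```

When the effectiveness factor is well below 1, a surface-based rate expression over- or under-states the conversion unless recalibrated, which is why resolving intra-particle transport matters for predicting carbon conversion.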

  18. Development and validation of a numerical model of the swine head subjected to open-field blasts

    Science.gov (United States)

    Kalra, A.; Zhu, F.; Feng, K.; Saif, T.; Kallakuri, S.; Jin, X.; Yang, K.; King, A.

    2017-11-01

    A finite element model of the head of a 55-kg Yucatan pig was developed to calculate the incident pressure and corresponding intracranial pressure due to the explosion of 8 lb (3.63 kg) of C4 at three different distances. The results from the model were validated by comparing findings with experimentally obtained data from five pigs at three different blast overpressure levels: low (150 kPa), medium (275 kPa), and high (400 kPa). The peak values of intracranial pressure from the numerical model at different locations of the brain, such as the frontal, central, left temporal, right temporal, parietal, and occipital regions, were compared with experimental values. The model was able to predict the peak pressures with reasonable percentage differences. The differences in peak incident and intracranial pressure values between the simulation results and the experimental values were found to be less than 2.2% and 29.3%, respectively, at all locations other than the frontal region. Additionally, a series of parametric studies showed that the intracranial pressure was very sensitive to sensor locations, the presence of air bubbles, and reflections experienced during the experiments. Further efforts will be undertaken to correlate the different biomechanical response parameters, such as the intracranial pressure gradient, stress, and strain results obtained from the validated model, with injured brain locations once the histology data become available.

  19. Development and validation of a combined phased acoustical radiosity and image source model for predicting sound fields in rooms.

    Science.gov (United States)

    Marbjerg, Gerd; Brunskog, Jonas; Jeong, Cheol-Ho; Nilsson, Erling

    2015-09-01

    A model, combining acoustical radiosity and the image source method, including phase shifts on reflection, has been developed. The model is denoted Phased Acoustical Radiosity and Image Source Method (PARISM), and it has been developed in order to be able to model both specular and diffuse reflections with complex-valued and angle-dependent boundary conditions. This paper mainly describes the combination of the two models and the implementation of the angle-dependent boundary conditions. It furthermore describes how a pressure impulse response is obtained from the energy-based acoustical radiosity by regarding the model as being stochastic. Three methods of implementation are proposed and investigated, and finally, recommendations are made for their use. Validation of the image source method is done by comparison with finite element simulations of a rectangular room with a porous absorber ceiling. Results from the full model are compared with results from other simulation tools and with measurements. The comparisons of the full model are done for real-valued and angle-independent surface properties. The proposed model agrees well with both the measured results and the alternative theories, and furthermore shows a more realistic spatial variation than energy-based methods due to the fact that interference is considered.
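The image source half of such a model can be sketched compactly: mirror the source in each wall, then sum delayed, distance-attenuated arrivals at the receiver. The sketch below is a heavily simplified first-order version with a real, angle-independent reflection coefficient, precisely the simplification PARISM removes by using complex, angle-dependent boundary conditions; room geometry and coefficient are assumed values:

```python
import numpy as np

def ism_impulse_response(room, src, rec, beta, fs=8000, order_len=0.05, c=343.0):
    """First-order image source method in a shoebox room.
    beta: real, angle-independent pressure reflection coefficient."""
    n = int(fs * order_len)                  # impulse-response length in samples
    h = np.zeros(n)
    images = [(np.array(src, float), 1.0)]   # direct sound, unit gain
    for axis in range(3):                    # one mirror image per wall
        for wall in (0.0, room[axis]):
            img = np.array(src, float)
            img[axis] = 2.0 * wall - img[axis]
            images.append((img, beta))
    for pos, gain in images:
        r = np.linalg.norm(pos - np.array(rec, float))
        k = int(round(fs * r / c))           # propagation delay in samples
        if k < n:
            h[k] += gain / r                 # 1/r spherical spreading
    return h

h = ism_impulse_response(room=(5.0, 4.0, 3.0), src=(1.0, 1.0, 1.5),
                         rec=(3.5, 2.5, 1.5), beta=0.7)
print("nonzero arrivals:", np.count_nonzero(h))
```

Higher reflection orders, phase shifts on reflection, and the diffuse (radiosity) part are what the combined model adds on top of this skeleton.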

  20. Development and validation of deterioration models for concrete bridge decks - phase 1 : artificial intelligence models and bridge management system.

    Science.gov (United States)

    2013-06-01

    This research documents the development and evaluation of artificial neural network (ANN) models to predict the condition ratings of concrete highway bridge decks in Michigan. Historical condition assessments chronicled in the national bridge invento...

  1. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta

    2014-12-01

    Full Text Available The primary objective of the study was to quantitatively test the DART model, which, despite being one of the most popular representations of the co-creation concept, has so far been studied almost solely with qualitative methods. To this end, the researchers developed a multiple-measurement scale and employed it in interviewing managers. Statistical evidence for the adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much conceptual overlap to be an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model in which co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  2. Development and validation of a local time stepping-based PaSR solver for combustion and radiation modeling

    DEFF Research Database (Denmark)

    Pang, Kar Mun; Ivarsson, Anders; Haider, Sajjad

    2013-01-01

    In the current work, a local time stepping (LTS) solver for the modeling of combustion, radiative heat transfer and soot formation is developed and validated. This is achieved using an open source computational fluid dynamics code, OpenFOAM. Akin to the solver provided in the default assembly i...... library in the edcSimpleFoam solver, which was introduced during the 6th OpenFOAM workshop, is modified and coupled with the current solver. One of the main amendments made is the integration of a soot radiation submodel, since this is significant in rich flames where soot particles are formed. The new solver......

  3. Development and validation of a human biomechanical model for rib fracture and thorax injuries in blunt impact.

    Science.gov (United States)

    Cai, Zhihua; Lan, Fengchong; Chen, Jiqing

    2015-07-01

    Since 1990, approximately 50,000-120,000 people have died annually of road traffic accidents in China. Traffic accidents are the main cause of death of Chinese adults aged 15-45 years. This study aimed to determine the biomechanical response and injury tolerance of the human body in traffic accidents. The subject was a 35-year-old male with a height of 170 cm, a weight of 70 kg and Chinese characteristics at the 50th percentile. Geometry was generated from computed tomography and magnetic resonance imaging. A human-body biomechanical model was then developed. The model represents in great detail the main anatomical characteristics of skeletal tissues, soft tissues and internal organs, including the head, neck, shoulder, thoracic cage, abdomen, spine, pelvis, pleurae and lungs, heart, aorta, arms, legs, and other muscle tissues and skeletons. The material properties of all tissues in the human body model were obtained from the literature and implemented in the LS-DYNA code to simulate the mechanical behaviour of the biological tissues. The model was validated against cadaver responses to frontal and side impact. The predicted model response agreed reasonably with the experimental data, and the model can further be used to evaluate thoracic injury in real-world crashes. We believe that the transportation industry can use numerical models in the future to simultaneously reduce physical testing and improve automotive safety.

  4. Development and validation of a stochastic model for potential growth of Listeria monocytogenes in naturally contaminated lightly preserved seafood.

    Science.gov (United States)

    Mejlholm, Ole; Bøknæs, Niels; Dalgaard, Paw

    2015-02-01

    A new stochastic model for the simultaneous growth of Listeria monocytogenes and lactic acid bacteria (LAB) was developed and validated on data from naturally contaminated samples of cold-smoked Greenland halibut (CSGH) and cold-smoked salmon (CSS). Acetic and/or lactic acids were added to these samples during industrial processing. The stochastic model was developed from an existing deterministic model including the effect of 12 environmental parameters and microbial interaction (O. Mejlholm and P. Dalgaard, Food Microbiology, submitted for publication). Observed maximum population density (MPD) values of L. monocytogenes in naturally contaminated samples of CSGH and CSS were accurately predicted by the stochastic model based on measured variability in product characteristics and storage conditions. Results comparable to those from the stochastic model were obtained when product characteristics of the least and most preserved samples of CSGH and CSS were used as input for the existing deterministic model. For both modelling approaches, it was shown that lag time and the effect of microbial interaction need to be included to accurately predict MPD values of L. monocytogenes. Addition of organic acids to CSGH and CSS was confirmed as a suitable mitigation strategy against the risk of growth of L. monocytogenes, as both types of products were in compliance with the EU regulation on ready-to-eat foods. Copyright © 2014 Elsevier Ltd. All rights reserved.
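A stochastic growth model of the kind described propagates measured variability in product characteristics through a primary growth model by Monte Carlo sampling. The sketch below is a minimal illustration of that idea; the logistic-with-lag growth function, the parameter distributions and all numerical values are illustrative assumptions, not the authors' validated model:

```python
import math
import random

def log10_count(n0, n_max, mu, lag, t):
    """Log10 cell count at time t (h): logistic growth after a lag phase.
    A common simplification of predictive-microbiology primary models."""
    if t <= lag:
        return n0
    return n_max - math.log10(1 + (10 ** (n_max - n0) - 1) * math.exp(-mu * (t - lag)))

random.seed(1)
finals = []
for _ in range(1000):
    # Variability in product characteristics (pH, NaCl, organic acids, storage
    # temperature) is collapsed here into variability of growth rate and lag time.
    mu = max(0.0, random.gauss(0.05, 0.02))    # h^-1, hypothetical
    lag = max(0.0, random.gauss(100.0, 30.0))  # h, hypothetical
    finals.append(log10_count(2.0, 7.5, mu, lag, t=21 * 24))

finals.sort()
p95 = finals[int(0.95 * len(finals))]
print(f"95th percentile after 3 weeks: {p95:.2f} log10 CFU/g")
```

The stochastic output is a distribution of maximum population densities rather than a single point estimate, which is what allows comparison against regulatory criteria such as the EU ready-to-eat limit.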

  5. Development and validation of a catalytic recombiner model for the containment code RALOC MOD4.0

    International Nuclear Information System (INIS)

    Rohde, J.; Klein-Hebling, W.; Chakraborty, A.K.

    1997-01-01

    This paper reports on the development of a catalytic recombiner model for the containment code RALOC MOD4.0 /KLH 95, KLH 96/ and the detailed validation work carried out at GRS. The model was qualified using the results of medium- and large-scale experiments performed in Germany /KAN 91/. The comparison of measured data with the calculations demonstrates that this new model is suitable for real plant applications to investigate the overall effectiveness of a catalytic recombiner system under severe accident conditions for large dry containments of German PWR design. The results of such investigations will serve as the basis for guidance on determining the required system capacity and on optimal positioning of such devices in containments. (author)

  6. Crack Detection in Fibre Reinforced Plastic Structures Using Embedded Fibre Bragg Grating Sensors: Theory, Model Development and Experimental Validation

    DEFF Research Database (Denmark)

    Pereira, Gilmar Ferreira; Mikkelsen, Lars Pilgaard; McGugan, Malcolm

    2015-01-01

    properties. When applying this concept to different structures, sensor systems and damage types, a combination of damage mechanics, monitoring technology, and modelling is required. The primary objective of this article is to demonstrate such a combination. This article is divided in three main topics......: the damage mechanism (delamination of FRP), the structural health monitoring technology (fibre Bragg gratings to detect delamination), and the finite element method model of the structure that incorporates these concepts into a final and integrated damage-monitoring concept. A novel method for assessing...... by the end-user. Conjointly, a novel model for sensor output prediction (virtual sensor) was developed using this FBG sensor crack monitoring concept and implemented in a finite element method code. The monitoring method was demonstrated and validated using glass fibre double cantilever beam specimens...

  7. Perpetual Model Validation

    Science.gov (United States)

    2017-03-01

    25]. This inference process is carried out by a tool referred to as Hynger (Hybrid iNvariant GEneratoR), overviewed in Figure 4, which is a MATLAB ...initially on memory access patterns. A monitoring module will check, at runtime, that the observed memory access pattern matches the pattern the software is...necessary. By using the developed approach, a model may be derived from initial tests or simulations, which will then be formally checked at runtime

  8. Development and validation of the ASTEC-Na thermal-hydraulic models

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L. E.; Perez, S.; Bandini, G.; Jacq, F.; Parisi, C.; Berna, C.

    2014-07-01

    In recent years, interest in sodium-cooled fast reactors (SFR) has been fostered worldwide by the search for higher nuclear energy sustainability. This is reflected in various international initiatives such as the GEN-IV International Forum, INPRO and the ESNII platforms. At the same time, innovative nuclear reactor designs, particularly SFRs, are aiming at even higher safety standards than current LWRs; a proof of this is the consideration of severe accidents from the earliest stages of reactor design. Commonalities between LWR and SFR severe accident scenarios suggest that some of the knowledge achieved in the LWR arena might be applicable, to some extent, to SFRs. This is the spirit behind the EU-JASMIN project, whose generic goal is to develop the ASTEC-Na code from the LWR ASTEC platform. This entails extending and adapting some existing models as well as implementing new ones in all the areas covered, from neutronics and pin thermo-mechanics, through the indispensable Na thermal-hydraulics, to the in-containment source term behaviour. (Author)

  9. Development and validation of effective models for simulation of stratification and mixing phenomena in a pool of water

    International Nuclear Information System (INIS)

    Li, H.; Kudinov, P.; Villanueva, W.

    2011-06-01

    and numerical schemes, (b) propose necessary improvements in GOTHIC sub-grid scale modeling, and (c) validate the proposed models. Results obtained with the EHS model show that GOTHIC can predict the development of thermal stratification in the pool if adequate grid resolution is provided. An equation for the effective momentum is proposed based on feasibility studies of the EMS model and analysis of the measured data in the test with the chugging regime of steam injection. An experiment with higher resolution in space and time of the oscillatory flow inside the blowdown pipe is highly desirable to uniquely determine model coefficients. Implementation of the EHS/EMS models in GOTHIC and their validation against a new PPOOLEX experiment is underway. (Author)

  10. Development and validation of effective models for simulation of stratification and mixing phenomena in a pool of water

    Energy Technology Data Exchange (ETDEWEB)

    Li, H.; Kudinov, P.; Villanueva, W. (Royal Institute of Technology (KTH). Div. of Nuclear Power Safety (Sweden))

    2011-06-15

    's physical models and numerical schemes, (b) propose necessary improvements in GOTHIC sub-grid scale modeling, and (c) validate the proposed models. Results obtained with the EHS model show that GOTHIC can predict the development of thermal stratification in the pool if adequate grid resolution is provided. An equation for the effective momentum is proposed based on feasibility studies of the EMS model and analysis of the measured data in the test with the chugging regime of steam injection. An experiment with higher resolution in space and time of the oscillatory flow inside the blowdown pipe is highly desirable to uniquely determine model coefficients. Implementation of the EHS/EMS models in GOTHIC and their validation against a new PPOOLEX experiment is underway. (Author)

  11. Development of the Galaxy Chronic Obstructive Pulmonary Disease (COPD) Model Using Data from ECLIPSE: Internal Validation of a Linked-Equations Cohort Model.

    Science.gov (United States)

    Briggs, Andrew H; Baker, Timothy; Risebrough, Nancy A; Chambers, Mike; Gonzalez-McQuire, Sebastian; Ismaila, Afisi S; Exuzides, Alex; Colby, Chris; Tabberer, Maggie; Muellerova, Hana; Locantore, Nicholas; Rutten van Mölken, Maureen P M H; Lomas, David A

    2017-05-01

    The recent joint International Society for Pharmacoeconomics and Outcomes Research / Society for Medical Decision Making Modeling Good Research Practices Task Force emphasized the importance of conceptualizing and validating models. We report a new model of chronic obstructive pulmonary disease (COPD) (part of the Galaxy project) founded on a conceptual model, implemented using a novel linked-equation approach, and internally validated. An expert panel developed a conceptual model including causal relationships between disease attributes, progression, and final outcomes. Risk equations describing these relationships were estimated using data from the Evaluation of COPD Longitudinally to Identify Predictive Surrogate Endpoints (ECLIPSE) study, with costs estimated from the TOwards a Revolution in COPD Health (TORCH) study. Implementation as a linked-equation model enabled direct estimation of health service costs and quality-adjusted life years (QALYs) for COPD patients over their lifetimes. Internal validation compared 3 years of predicted cohort experience with ECLIPSE results. At 3 years, the Galaxy COPD model predictions of annual exacerbation rate and annual decline in forced expiratory volume in 1 second fell within the ECLIPSE data confidence limits, although 3-year overall survival was outside the observed confidence limits. Projections of the risk equations over time permitted extrapolation to patient lifetimes. Averaging the predicted cost/QALY outcomes for the different patients within the ECLIPSE cohort gives an estimated lifetime cost of £25,214 (undiscounted)/£20,318 (discounted) and lifetime QALYs of 6.45 (undiscounted)/5.24 (discounted) per ECLIPSE patient. A new form of model for COPD was conceptualized, implemented, and internally validated, based on a series of linked equations using epidemiological data (ECLIPSE) and cost data (TORCH). This Galaxy model predicts COPD outcomes from treatment effects on disease attributes such as lung function

  12. Myocardial segmentation based on coronary anatomy using coronary computed tomography angiography: Development and validation in a pig model

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Mi Sun [Chung-Ang University College of Medicine, Department of Radiology, Chung-Ang University Hospital, Seoul (Korea, Republic of); Yang, Dong Hyun; Seo, Joon Beom; Kang, Joon-Won; Lim, Tae-Hwan [Asan Medical Center, University of Ulsan College of Medicine, Department of Radiology and Research Institute of Radiology, Seoul (Korea, Republic of); Kim, Young-Hak; Kang, Soo-Jin; Jung, Joonho [Asan Medical Center, University of Ulsan College of Medicine, Heart Institute, Seoul (Korea, Republic of); Kim, Namkug [Asan Medical Center, University of Ulsan College of Medicine, Department of Convergence Medicine, Seoul (Korea, Republic of); Heo, Seung-Ho [Asan Medical Center, University of Ulsan College of Medicine, Asan institute for Life Science, Seoul (Korea, Republic of); Baek, Seunghee [Asan Medical Center, University of Ulsan College of Medicine, Department of Clinical Epidemiology and Biostatistics, Seoul (Korea, Republic of); Choi, Byoung Wook [Yonsei University, Department of Diagnostic Radiology, College of Medicine, Seoul (Korea, Republic of)

    2017-10-15

    To validate a method for performing myocardial segmentation based on coronary anatomy using coronary CT angiography (CCTA). Coronary artery-based myocardial segmentation (CAMS) was developed for use with CCTA. To validate and compare this method with the conventional American Heart Association (AHA) classification, a single coronary occlusion model was prepared and validated using six pigs. The unstained occluded coronary territories of the specimens and the corresponding arterial territories from CAMS and AHA segmentations were compared using slice-by-slice matching and 100 virtual myocardial columns. CAMS predicted the ischaemic area more precisely than the AHA method, with matched-column percentages of 95% versus 76% (p < 0.001), where the percentage of matched columns is defined as the number of matched columns of a segmentation method divided by the number of unstained columns in the specimen. In the subgroup analyses, CAMS demonstrated a higher percentage of matched columns than the AHA method in the left anterior descending artery (100% vs. 77%; p < 0.001) and the mid- (99% vs. 83%; p = 0.046) and apical-level territories of the left ventricle (90% vs. 52%; p = 0.011). CAMS is a feasible method for identifying the corresponding myocardial territories of the coronary arteries using CCTA. (orig.)
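The percentage-of-matched-columns metric used in this validation is simple to compute once each myocardial column is labelled. The column indices below are hypothetical, purely to show the arithmetic:

```python
# Columns unstained in the specimen (true ischaemic territory) and the
# territories each segmentation method assigns to the occluded artery.
unstained = set(range(10, 40))          # 30 hypothetical ischaemic columns
cams_territory = set(range(9, 40))      # hypothetical CAMS-predicted territory
aha_territory = set(range(15, 45))      # hypothetical AHA-predicted territory

def pct_matched(territory, unstained):
    """Matched columns as a percentage of the unstained columns."""
    return 100.0 * len(territory & unstained) / len(unstained)

cams_pct = pct_matched(cams_territory, unstained)
aha_pct = pct_matched(aha_territory, unstained)
print(f"CAMS: {cams_pct:.0f}%, AHA: {aha_pct:.0f}%")
```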

  13. Development and Validation of Perioperative Risk-Adjustment Models for Hip Fracture Repair, Total Hip Arthroplasty, and Total Knee Arthroplasty.

    Science.gov (United States)

    Schilling, Peter L; Bozic, Kevin J

    2016-01-06

    Comparing outcomes across providers requires risk-adjustment models that account for differences in case mix. The burden of data collection from the clinical record can make risk-adjusted outcomes difficult to measure. The purpose of this study was to develop risk-adjustment models for hip fracture repair (HFR), total hip arthroplasty (THA), and total knee arthroplasty (TKA) that weigh adequacy of risk adjustment against data-collection burden. We used data from the American College of Surgeons National Surgical Quality Improvement Program to create derivation cohorts for HFR (n = 7000), THA (n = 17,336), and TKA (n = 28,661). We developed logistic regression models for each procedure using age, sex, American Society of Anesthesiologists (ASA) physical status classification, comorbidities, laboratory values, and vital signs-based comorbidities as covariates, and validated the models with use of data from 2012. The derivation models' C-statistics for mortality were 80%, 81%, 75%, and 92% and for adverse events were 68%, 68%, 60%, and 70% for HFR, THA, TKA, and combined procedure cohorts. Age, sex, and ASA classification accounted for a large share of the explained variation in mortality (50%, 58%, 70%, and 67%) and adverse events (43%, 45%, 46%, and 68%). For THA and TKA, these three variables were nearly as predictive as models utilizing all covariates. HFR model discrimination improved with the addition of comorbidities and laboratory values; among the important covariates were functional status, low albumin, high creatinine, disseminated cancer, dyspnea, and body mass index. Model performance was similar in validation cohorts. Risk-adjustment models using data from health records demonstrated good discrimination and calibration for HFR, THA, and TKA. It is possible to provide adequate risk adjustment using only the most predictive variables commonly available within the clinical record. 
This finding helps to inform the trade-off between model performance and data
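The C-statistic reported for these models is the probability that a randomly chosen patient who had the outcome ranks above a randomly chosen patient who did not, on the model's risk score. A minimal sketch on synthetic data follows; the cohort, coefficients and fitting loop are assumptions for illustration, not the NSQIP-derived models:

```python
import math
import random

random.seed(0)

# Synthetic cohort: standardized age and ASA class drive mortality risk.
X, y = [], []
for _ in range(400):
    z_age = random.gauss(0.0, 1.0)
    asa = random.choice([1, 2, 3, 4])
    p_true = 1 / (1 + math.exp(-(-3.5 + 0.7 * z_age + 0.6 * asa)))
    X.append((1.0, z_age, float(asa)))  # leading 1.0 is the intercept term
    y.append(1 if random.random() < p_true else 0)

# Fit logistic regression by plain (averaged) gradient ascent.
w = [0.0, 0.0, 0.0]
for _ in range(500):
    grad = [0.0, 0.0, 0.0]
    for xi, yi in zip(X, y):
        p = 1 / (1 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
        for j in range(3):
            grad[j] += (yi - p) * xi[j]
    for j in range(3):
        w[j] += 0.3 * grad[j] / len(X)

# C-statistic: fraction of (event, non-event) pairs ranked correctly.
scores = [sum(wj * xj for wj, xj in zip(w, xi)) for xi in X]
pos = [s for s, yi in zip(scores, y) if yi == 1]
neg = [s for s, yi in zip(scores, y) if yi == 0]
concordant = sum((sp > sn) + 0.5 * (sp == sn) for sp in pos for sn in neg)
c_stat = concordant / (len(pos) * len(neg))
print(f"C-statistic: {c_stat:.3f}")
```

The pairwise-ranking definition used here is equivalent to the area under the ROC curve, which is how discrimination is usually summarized for such risk-adjustment models.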

  14. Development and Validation of Computational Fluid Dynamics Models for Prediction of Heat Transfer and Thermal Microenvironments of Corals

    Science.gov (United States)

    Ong, Robert H.; King, Andrew J. C.; Mullins, Benjamin J.; Cooper, Timothy F.; Caley, M. Julian

    2012-01-01

    We present Computational Fluid Dynamics (CFD) models of the coupled dynamics of water flow, heat transfer and irradiance in and around corals to predict temperatures experienced by corals. These models were validated against controlled laboratory experiments, under constant and transient irradiance, for hemispherical and branching corals. Our CFD models agree very well with the experimental studies. A linear relationship between irradiance and coral surface warming was evident in both the simulations and the experimental results, in agreement with heat transfer theory. However, the CFD models for the steady-state simulation produced a better fit to the linear relationship than the experimental data, likely due to experimental error in the empirical measurements. The consistency of our modelling results with experimental observations demonstrates the applicability of CFD simulations, such as the models developed here, to coral bleaching studies. A study of the influence of coral skeletal porosity and skeletal bulk density on surface warming was also undertaken, demonstrating boundary layer behaviour, interstitial flow magnitudes and temperature profiles in coral cross sections. Our models complement recent studies showing systematic changes in these parameters in some coral colonies and have utility in the prediction of coral bleaching. PMID:22701582
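The linear irradiance-warming relationship noted above can be recovered from either simulation or laboratory data with an ordinary least-squares fit. The data pairs below are hypothetical, purely to show the procedure:

```python
# Hypothetical (irradiance W/m^2, surface warming K) pairs.
data = [(0, 0.02), (200, 0.11), (400, 0.22), (600, 0.30), (800, 0.41), (1000, 0.52)]

# Closed-form ordinary least squares for a single predictor.
n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
print(f"warming ~= {slope:.2e} * irradiance + {intercept:.3f}")
```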

  15. Development of a Duplex Ultrasound Simulator and Preliminary Validation of Velocity Measurements in Carotid Artery Models.

    Science.gov (United States)

    Zierler, R Eugene; Leotta, Daniel F; Sansom, Kurt; Aliseda, Alberto; Anderson, Mark D; Sheehan, Florence H

    2016-07-01

    Duplex ultrasound scanning with B-mode imaging and both color Doppler and Doppler spectral waveforms is relied upon for diagnosis of vascular pathology and selection of patients for further evaluation and treatment. In most duplex ultrasound applications, classification of disease severity is based primarily on alterations in blood flow velocities, particularly the peak systolic velocity (PSV) obtained from Doppler spectral waveforms. We developed a duplex ultrasound simulator for training and assessment of scanning skills. Duplex ultrasound cases were prepared from 2-dimensional (2D) images of normal and stenotic carotid arteries by reconstructing the common carotid, internal carotid, and external carotid arteries in 3 dimensions and computationally simulating blood flow velocity fields within the lumen. The simulator displays a 2D B-mode image corresponding to transducer position on a mannequin, overlaid by color coding of velocity data. A spectral waveform is generated according to examiner-defined settings (depth and size of the Doppler sample volume, beam steering, Doppler beam angle, and pulse repetition frequency or scale). The accuracy of the simulator was assessed by comparing the PSV measured from the spectral waveforms with the true PSV which was derived from the computational flow model based on the size and location of the sample volume within the artery. Three expert examiners made a total of 36 carotid artery PSV measurements based on the simulated cases. The PSV measured by the examiners deviated from true PSV by 8% ± 5% (N = 36). The deviation in PSV did not differ significantly between artery segments, normal and stenotic arteries, or examiners. To our knowledge, this is the first simulation of duplex ultrasound that can create and display real-time color Doppler images and Doppler spectral waveforms. 
The results demonstrate that an examiner can measure PSV from the spectral waveforms using the settings on the simulator with a mean absolute error
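The reported accuracy figure is a mean absolute percent deviation between examiner-measured PSV and the simulator's ground-truth PSV. With hypothetical paired measurements, the computation looks like this:

```python
# Hypothetical (measured, true) peak systolic velocities in cm/s; the true
# value comes from the computational flow model at the sample-volume location.
pairs = [(125, 118), (240, 252), (88, 85), (310, 290), (150, 160), (95, 99)]

deviations = [100.0 * abs(measured - true) / true for measured, true in pairs]
mean_dev = sum(deviations) / len(deviations)
sd = (sum((d - mean_dev) ** 2 for d in deviations) / (len(deviations) - 1)) ** 0.5
print(f"PSV deviation: {mean_dev:.1f}% +/- {sd:.1f}%")
```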

  16. LMDzT-INCA dust forecast model developments and associated validation efforts

    International Nuclear Information System (INIS)

    Schulz, M; Cozic, A; Szopa, S

    2009-01-01

    The nudged atmosphere global climate model LMDzT-INCA is used to forecast global dust fields. Evaluation is undertaken retrospectively for the forecast results of the year 2006. For this purpose, AERONET/Photons sites in Northern Africa and on the Arabian Peninsula are chosen, where aerosol optical depth is dominated by dust. Despite its coarse resolution, the model captures 48% of the day-to-day dust variability near Dakar on the initial day of the forecast. On weekly and monthly scales the model captures 62% and 68% of the variability, respectively. Correlation coefficients between daily AOD values observed and modelled at Dakar decrease from 0.69 for the initial forecast day to 0.59 and 0.41 for two days ahead and five days ahead, respectively. If the model is required to issue a warning for an exceedance of aerosol optical depth of 0.5 and no warning otherwise, then it was wrong in 29% of the cases for day 0, 32% for day 2 and 35% for day 5. A reanalysis run with archived ECMWF winds is only slightly better (r=0.71) but was in error in 25% of the cases. Both the improved simulation of the monthly versus daily variability and the deterioration of the forecast with time can be explained by the model's failure to simulate the exact timing of a dust event.
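The two skill measures used here, a Pearson correlation against observed AOD and an error rate for threshold-exceedance warnings, can be sketched in a few lines. The daily AOD series below are invented for illustration, not AERONET or model output:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily AOD series (observed vs. forecast) at a dust-dominated site.
obs      = [0.2, 0.8, 0.4, 1.1, 0.3, 0.6, 0.9, 0.1, 0.5, 0.7]
forecast = [0.3, 0.6, 0.6, 0.9, 0.2, 0.4, 0.6, 0.2, 0.4, 0.8]

r = pearson_r(obs, forecast)

# Warning skill: warn when forecast AOD exceeds 0.5; a day counts as "wrong"
# when the warning state disagrees with the observed exceedance.
wrong = sum((f > 0.5) != (o > 0.5) for o, f in zip(obs, forecast))
print(f"r = {r:.2f}, wrong warning days: {100 * wrong // len(obs)}%")
```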

  17. Development and validation of a CFD-based steam reformer model

    DEFF Research Database (Denmark)

    Kær, Søren Knudsen; Dahlqvist, Mathis; Saksager, Anders

    2006-01-01

    Steam reforming of liquid biofuels (ethanol, bio-diesel etc.) represents a sustainable source of hydrogen for micro Combined Heat and Power (CHP) production as well as Auxiliary Power Units (APUs). In relation to the design of the steam reforming reactor, several parameters are important, including...... for expensive prototypes. This paper presents an advanced Computational Fluid Dynamics based model of a steam reformer. The model was implemented in the commercial CFD code Fluent through the User Defined Functions interface. The model accounts for the flue gas flow as well as the reformate flow, including...... a detailed mechanism for the reforming reactions. Heat exchange between the flue gas and reformate streams through the reformer reactor walls was also included as a conjugate heat transfer process. From a review of published models for the catalytic steam reforming of ethanol and preliminary predictions...

  18. Validation of battery-alternator model against experimental data - a first step towards developing a future power supply system

    Energy Technology Data Exchange (ETDEWEB)

    Boulos, A.M.; Burnham, K.J.; Mahtani, J.L. [Coventry University (United Kingdom). Control Theory and Applications Centre; Pacaud, C. [Jaguar Cars Ltd., Coventry (United Kingdom). Engineering Centre

    2004-01-01

    The electric power system of a modern vehicle has to supply enough electrical energy to drive numerous electrical and electronic systems and components. The electric power system of a vehicle consists of two major components: an alternator and a battery. A detailed understanding of the characteristics of the electric power system, the electrical load demands and the operating environment, such as road conditions and vehicle laden weight, is required when the capacities of the generator and the battery are to be determined for a vehicle. In this study, a battery-alternator model has been developed and simulated in MATLAB/Simulink, and data obtained from vehicle tests have been used as a basis for validating the model. This is considered to be a necessary first step in the design and development of a new 42 V power supply system. (author)

  19. The VATO project: Development and validation of a dynamic transfer model of tritium in grassland ecosystem.

    Science.gov (United States)

    Le Dizès, S; Aulagnier, C; Maro, D; Rozet, M; Vermorel, F; Hébert, D; Voiseux, C; Solier, L; Godinot, C; Fievet, B; Laguionie, P; Connan, O; Cazimajou, O; Morillon, M

    2017-05-01

    In this paper, a dynamic compartment model with a high temporal resolution has been investigated to describe tritium transfer in grassland ecosystems exposed to atmospheric 3H releases from nuclear facilities under normal operating or accidental conditions. The TOCATTA-χ model belongs to the larger framework of the SYMBIOSE modelling and simulation platform, which aims to assess the fate and transport of a wide range of radionuclides in various environmental systems. In this context, the conceptual and mathematical models of TOCATTA-χ have been designed to be relatively simple, minimizing the number of compartments and input parameters required. At the same time, the model achieves a good compromise between ease of use (as it is to be used in an operational mode), explicative power and predictive accuracy in various experimental conditions. In the framework of the VATO project, the model has been tested against two-year-long in situ measurements of 3H activity concentration monitored by IRSN in air, groundwater and grass, together with meteorological parameters, on a grass field plot located 2 km downwind of the AREVA NC La Hague nuclear reprocessing plant, as was done in the past for the evaluation of the transfer of 14C in grass. By considering fast exchanges at the vegetation-air canopy interface, the model correctly reproduces the observed variability in TFWT activity concentration in grass, which evolves in accordance with spikes in atmospheric HTO activity concentration over the previous 24 h. The average OBT activity concentration in grass is also correctly reproduced. However, the model has to be improved in order to reproduce occasional high OBT activity concentrations, as observed in December 2013. The introduction of another compartment with a fast kinetic (like TFWT), although outside the model scope, improves the predictions by increasing the correlation coefficient from 0.29 to 0.56 when this particular point is included. Further experimental
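The fast TFWT exchange with atmospheric HTO that the model relies on behaves like first-order relaxation of the grass water toward the air concentration. The sketch below is purely illustrative: the rate constant, spike timing and concentrations are invented, not TOCATTA-χ parameters:

```python
def c_air(t):
    """Atmospheric HTO activity concentration (Bq/L): a 6-hour spike (assumed)."""
    return 10.0 if 24.0 <= t < 30.0 else 1.0

k = 0.5      # h^-1, fast vegetation-air exchange rate (assumed)
dt = 0.1     # h, Euler time step
c_tfwt = 1.0
history = []
t = 0.0
while t < 72.0:
    c_tfwt += k * (c_air(t) - c_tfwt) * dt  # relax toward the air concentration
    history.append(c_tfwt)
    t += dt

peak, final = max(history), history[-1]
print(f"peak TFWT: {peak:.2f} Bq/L, final: {final:.2f} Bq/L")
```

The grass concentration rises within hours of the atmospheric pulse and relaxes back afterwards, which matches the behaviour the abstract describes for TFWT following spikes over the previous 24 h.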

  20. Development of a coupled physical-biological ecosystem model ECOSMO - Part I: Model description and validation for the North Sea

    DEFF Research Database (Denmark)

    Schrum, Corinna; Alekseeva, I.; St. John, Michael

    2006-01-01

    A 3-D coupled biophysical model ECOSMO (ECOSystem MOdel) has been developed. The biological module of ECOSMO is based on lower trophic level interactions between two phyto- and two zooplankton components. The dynamics of the different phytoplankton components are governed by the availability...... of the macronutrients nitrogen, phosphate and silicate as well as light. Zooplankton production is simulated based on the consumption of the different phytoplankton groups and detritus. The biological module is coupled to a nonlinear 3-D baroclinic model. The physical and biological modules are driven by surface...... showed that the model, based on consideration of limiting processes, is able to reproduce the observed spatial and seasonal variability of the North Sea ecosystem e.g. the spring bloom, summer sub-surface production and the fall bloom. Distinct differences in regional characteristics of diatoms...

  1. Community pharmacist attitudes towards collaboration with general practitioners: development and validation of a measure and a model

    Directory of Open Access Journals (Sweden)

    Van Connie

    2012-09-01

    Full Text Available Abstract Background Community pharmacists and General Practitioners (GPs) are increasingly being encouraged to adopt more collaborative approaches to health care delivery, as collaboration in primary care has been shown to be effective in improving patient outcomes. However, little is known about pharmacist attitudes towards collaborating with their GP counterparts and the variables that influence this interprofessional collaboration. This study aims to develop and validate (1) an instrument to measure pharmacist attitudes towards collaboration with GPs and (2) a model that illustrates how pharmacist attitudes (and other variables) influence collaborative behaviour with GPs. Methods A questionnaire containing the newly developed "Attitudes Towards Collaboration Instrument for Pharmacists" (ATCI-P) and a previously validated behavioural measure, the "Frequency of Interprofessional Collaboration Instrument for Pharmacists" (FICI-P), was administered to a sample of 1215 Australian pharmacists. The ATCI-P was developed based on existing literature and qualitative interviews with GPs and community pharmacists. Principal Component Analysis was used to assess the structure of the ATCI-P, and the Cronbach's alpha coefficient was used to assess the internal consistency of the instrument. Structural equation modelling was used to determine how pharmacist attitudes (as measured by the ATCI-P) and other variables influence collaborative behaviour (as measured by the FICI-P). Results Four hundred and ninety-two surveys were completed and returned, for a response rate of 40%. Principal Component Analysis revealed that the ATCI-P consisted of two factors: 'interactional determinants' and 'practitioner determinants', both with good internal consistency (Cronbach's alpha = .90 and .93, respectively). The model demonstrated adequate fit (χ2/df = 1.89, CFI = .955, RMSEA = .062, 90% CI [.049-.074]) and illustrated that 'interactional determinants' was
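Cronbach's alpha, used above to assess internal consistency, compares the sum of the item variances with the variance of the total score. A toy computation with hypothetical 5-point Likert responses:

```python
def cronbach_alpha(items):
    """items: one inner list of respondent scores per questionnaire item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(it) for it in items)
    totals = [sum(it[i] for it in items) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# Hypothetical responses: 4 items x 6 respondents on a 5-point scale.
items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 4, 2, 4, 3, 5],
    [3, 5, 3, 4, 2, 4],
]
alpha = cronbach_alpha(items)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Values around .9, as reported for both ATCI-P factors, indicate that the items move together closely enough to be summed into a single scale score.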

  2. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    Full Text Available In this paper, I respond to a challenge raised against contemporary experimental neurobiology, according to which the field is in a state of crisis because the multiple experimental protocols employed in different laboratories presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis of experimental neurobiology.

  3. Developing and validating of predictive model for radiofrequency radiation emission within the vicinity of fm stations in Ghana

    International Nuclear Information System (INIS)

    Ahenkora-Duodu, Kingsley

    2016-07-01

    The rapidly growing number of FM stations, with their corresponding antennas, has led to increased concern about the potential health risks that may arise from exposure to RF radiation. The main objective of this research was to develop and validate a predictive model against real-time measured data for FM antennas in Ghana. Theoretical and experimental assessments of radiofrequency emission due to FM antennas were carried out. The maximum and minimum electric field spatial averages recorded were 7.17E-01 ± 6.97E-01 V/m at Kasapa FM and 6.39E-02 ± 5.39E-02 V/m at Asempa FM, respectively. At a transmission frequency range of 88-108 MHz, the average power density of the real-time measured data ranged between 3.92E-05 W/m² and 1.37E-03 W/m², while that of the FM model varied from 9.72E-03 W/m² to 5.35E-01 W/m². The results showed a variation between measured power density levels and the FM model: the FM model overestimates the power density levels compared with the measured data. The impact predictions were based on the maximum values estimated by the FM model; hence these results validate the credibility of the impact analysis for the FM stations. The general public exposure quotient ranged between 9.00E-03 and 2.68E-01, whilst the occupational exposure quotient varied from 9.72E-04 to 5.35E-02. The results obtained were found to be in compliance with the International Commission on Non-Ionizing Radiation Protection (ICNIRP) RF exposure limits. (au)
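The far-field relationship behind these figures is S = E²/Z₀ with Z₀ ≈ 377 Ω, and an exposure quotient is the ratio of S to the applicable ICNIRP (1998) reference level, which for 10-400 MHz (covering the FM band) is 2 W/m² for the general public and 10 W/m² occupationally. A quick check against the central value of the reported maximum field:

```python
Z0 = 377.0  # ohms, impedance of free space

def power_density(e_field_v_per_m):
    """Far-field plane-wave power density in W/m^2 from an RMS E-field."""
    return e_field_v_per_m ** 2 / Z0

# Reported maximum spatial-average E-field (Kasapa FM), central value only.
e_max = 7.17e-01                      # V/m
s_max = power_density(e_max)

# ICNIRP (1998) reference levels for 10-400 MHz.
quotient_public = s_max / 2.0
quotient_occupational = s_max / 10.0
print(f"S = {s_max:.2e} W/m^2")
print(f"public quotient = {quotient_public:.1e}, "
      f"occupational quotient = {quotient_occupational:.1e}")
```

The computed power density reproduces the 1.37E-03 W/m² maximum measured value quoted above, and the quotient being far below 1 is what compliance with the ICNIRP limit means.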

  4. Development and validation of a facial expression database based on the dimensional and categorical model of emotions.

    Science.gov (United States)

    Fujimura, Tomomi; Umemura, Hiroyuki

    2018-01-15

    The present study describes the development and validation of a facial expression database comprising five different horizontal face angles in dynamic and static presentations. The database includes twelve expression types portrayed by eight Japanese models. This database was inspired by the dimensional and categorical model of emotions: surprise, fear, sadness, anger with open mouth, anger with closed mouth, disgust with open mouth, disgust with closed mouth, excitement, happiness, relaxation, sleepiness, and neutral (static only). The expressions were validated using emotion classification and Affect Grid rating tasks [Russell, Weiss, & Mendelsohn, 1989. Affect Grid: A single-item scale of pleasure and arousal. Journal of Personality and Social Psychology, 57(3), 493-502]. The results indicate that most of the expressions were recognised as the intended emotions and could systematically represent affective valence and arousal. Furthermore, face angle and facial motion information influenced emotion classification and valence and arousal ratings. Our database will be available online at the following URL: https://www.dh.aist.go.jp/database/face2017/

  5. Development and validation of a prognostic model for recurrent glioblastoma patients treated with bevacizumab and irinotecan

    DEFF Research Database (Denmark)

    Urup, Thomas; Dahlrot, Rikke Hedegaard; Grunnet, Kirsten

    2016-01-01

    Background Predictive markers and prognostic models are required in order to individualize treatment of recurrent glioblastoma (GBM) patients. Here, we sought to identify clinical factors able to predict response and survival in recurrent GBM patients treated with bevacizumab (BEV) and irinotecan....... Material and methods A total of 219 recurrent GBM patients treated with BEV plus irinotecan according to a previously published treatment protocol were included in the initial population. Prognostic models were generated by means of multivariate logistic and Cox regression analysis. Results In multivariate...

  6. Certified reference materials for food packaging specific migration tests: development, validation and modelling

    NARCIS (Netherlands)

    Stoffers, N.H.

    2005-01-01

    Keywords: certified reference materials; diffusion; food contact materials; food packaging; laurolactam; migration modelling; nylon; specific migration. This thesis compiles several research topics

  7. pH Effects in Foods: Development, Validation and Calibration of a Fundamental Model

    NARCIS (Netherlands)

    Tijskens, L.M.M.; Biekman, E.S.A.; Greiner, R.; Seyhan, F.; Barringer, S.A.

    2001-01-01

    The effects of pH on the activity of a number of enzymes from different origins, and on the degradation of green colour in blanched vegetables, were modelled based on fundamental kinetic principles, considering hydrogen ions as an integral part of the reaction mechanism. Parameters were

  8. Development and Validation of Discriminant Analysis Models for Student Loan Defaultees and Non-Defaultees.

    Science.gov (United States)

    Myers, Greeley; Siera, Steven

    1980-01-01

    Default on guaranteed student loans has been increasing. The use of discriminant analysis as a technique to identify "good" vs. "bad" student loans based on information available from the loan application is discussed. Research to test the ability of models to make such predictions is reported. (Author/MLW)

  9. Development and validation of prognostic models in metastatic breast cancer: a GOCS study.

    Science.gov (United States)

    Rabinovich, M; Vallejo, C; Bianco, A; Perez, J; Machiavelli, M; Leone, B; Romero, A; Rodriguez, R; Cuevas, M; Dansky, C

    1992-01-01

    The significance of several prognostic factors and the magnitude of their influence on response rate and survival were assessed by means of uni- and multivariate analyses in 362 patients with stage IV (UICC) breast carcinoma receiving combination chemotherapy as first systemic treatment over an 8-year period. Univariate analyses identified performance status and prior adjuvant radiotherapy as predictors of objective regression (OR), whereas performance status, prior adjuvant chemotherapy and radiotherapy, white blood cell count, SGOT and SGPT levels, and metastatic pattern were significantly correlated with survival. In multivariate analyses, favorable characteristics associated with OR were prior adjuvant radiotherapy, no prior chemotherapy and postmenopausal status. Regarding survival, performance status and visceral involvement were selected by the Cox model. The predictive accuracy of the logistic and proportional hazards models was tested retrospectively in the training sample, and prospectively in a new population of 126 patients also receiving combination chemotherapy as first treatment for metastatic breast cancer. A certain overfitting to the data in the training sample was observed with the regression model for response. However, the discriminative ability of the Cox model for survival was clearly confirmed.

  10. Identification of patients at high risk for Clostridium difficile infection : Development and validation of a risk prediction model in hospitalized patients treated with antibiotics

    NARCIS (Netherlands)

    van Werkhoven, C. H.; van der Tempel, J.; Jajou, R.; Thijsen, S. F T; Diepersloot, R. J A; Bonten, M. J M; Postma, D. F.; Oosterheert, J. J.

    2015-01-01

    To develop and validate a prediction model for Clostridium difficile infection (CDI) in hospitalized patients treated with systemic antibiotics, we performed a case-cohort study in a tertiary (derivation) and secondary care hospital (validation). Cases had a positive Clostridium test and were

  11. Development and validation of a quasi-dimensional model for (m)ethanol-fuelled SI engines

    OpenAIRE

    Vancoillie, Jeroen; Verhelst, Sebastian; Sileghem, Louis; Demuynck, Joachim; Galle, Jonas

    2012-01-01

    RESEARCH OBJECTIVE - The use of methanol and ethanol in spark-ignition engines forms an interesting approach to decarbonizing transport and securing domestic energy supply. Experimental work has produced promising results, however, the full potential of light alcohols in modern engine technology remains to be explored. Today, this can be addressed at low cost using system simulations of the whole engine, provided that the employed models account for the effect of the fuel on engine operation....

  12. Barriers to developing a valid rodent model of Alzheimer's disease: from behavioural analysis to etiological mechanisms

    Directory of Open Access Journals (Sweden)

    Darryl Christopher Gidyk

    2015-07-01

    Full Text Available Sporadic Alzheimer's disease (AD) is the most prevalent form of age-related dementia. As such, great effort has been put forth to investigate the etiology, progression, and underlying mechanisms of the disease. Countless studies have been conducted; however, the details of this disease remain largely unknown. Rodent models provide opportunities to investigate certain aspects of AD that cannot be ethically studied in humans. These animal models vary from study to study and have provided some insight, but no real advancements in the prevention or treatment of the disease. In this Hypothesis and Theory paper, we discuss what we perceive as barriers to impactful discovery in rodent AD research and we offer solutions for moving forward. Although no single model of AD is capable of providing the solution to the growing epidemic of the disease, we encourage a comprehensive approach that acknowledges the complex etiology of AD, with the goal of enhancing bidirectional translatability from bench to bedside and vice versa.

  13. Development and validation of a low-frequency modeling code for high-moment transmitter rod antennas

    Science.gov (United States)

    Jordan, Jared Williams; Sternberg, Ben K.; Dvorak, Steven L.

    2009-12-01

    The goal of this research is to develop and validate a low-frequency modeling code for high-moment transmitter rod antennas to aid in the design of future low-frequency TX antennas with high magnetic moments. To accomplish this goal, a quasi-static modeling algorithm was developed to simulate finite-length, permeable-core, rod antennas. This quasi-static analysis is applicable for low frequencies where eddy currents are negligible, and it can handle solid or hollow cores with winding insulation thickness between the antenna's windings and its core. The theory was programmed in Matlab, and the modeling code has the ability to predict the TX antenna's gain, maximum magnetic moment, saturation current, series inductance, and core series loss resistance, provided the user enters the corresponding complex permeability for the desired core magnetic flux density. In order to utilize the linear modeling code to model the effects of nonlinear core materials, it is necessary to use the correct complex permeability for a specific core magnetic flux density. In order to test the modeling code, we demonstrated that it can accurately predict changes in the electrical parameters associated with variations in the rod length and the core thickness for antennas made out of low carbon steel wire. These tests demonstrate that the modeling code was successful in predicting the changes in the rod antenna characteristics under high-current nonlinear conditions due to changes in the physical dimensions of the rod provided that the flux density in the core was held constant in order to keep the complex permeability from changing.

  14. How Teachers Become Leaders: An Internationally Validated Theoretical Model of Teacher Leadership Development

    Science.gov (United States)

    Poekert, Philip; Alexandrou, Alex; Shannon, Darbianne

    2016-01-01

    Teacher leadership is increasingly being touted as a practical response to guide teacher learning in school improvement and policy reform efforts. However, the field of research on teacher leadership in relation to post-compulsory educational development has been and remains largely atheoretical to date. This empirical study proposes a grounded…

  15. Development and validation of a model for high pressure liquid poison injection for CANDU-6 shutdown system no.2

    International Nuclear Information System (INIS)

    Rhee, B.-W.; Jeong, C.J.; Choi, J.H.; Yoo, S.-Y.

    2002-01-01

    In the CANDU reactor, one of the two reactor shutdown systems is the liquid poison injection system, which injects highly pressurized liquid neutron poison into the moderator tank via small holes in the nozzle pipes. To ensure safe shutdown of the reactor, the poison curtains generated by the jets must provide quick and sufficient negative reactivity during the early stage of the accident. To produce the neutron cross sections needed for this work, the transient poison concentration distribution is required. In this study, a set of models for analyzing the transient poison concentration induced by the high pressure poison injection jets activated upon reactor trip in a CANDU-6 moderator tank has been developed and used to generate the concentration distribution of the poison curtains induced by the high pressure jets injected into the vacant region between the calandria tube banks. The poison injection rate through the jet holes drilled in the nozzle pipes is obtained by a 1-D transient hydrodynamic code called ALITRIG, and this injection rate provides the inlet boundary condition to a 3-D CFD model of the moderator tank based on CFX4.3, an AEA Technology CFD code, to simulate the formation and growth of the poison jet curtain inside the moderator tank. The model is validated against a poison injection experiment performed at BARC, India, and a poison jet experiment for the generic CANDU-6 performed at AECL, Canada. In conclusion, this set of models is considered to predict the experimental results in a physically reasonable and consistent manner. (author)

  16. Development and validation of a Kalman filter-based model for vehicle slip angle estimation

    Science.gov (United States)

    Gadola, M.; Chindamo, D.; Romano, M.; Padula, F.

    2014-01-01

    It is well known that vehicle slip angle is one of the most difficult parameters to measure on a vehicle during testing or racing activities. Moreover, the appropriate sensor is very expensive and it is often difficult to fit to a car, especially on race cars. We propose here a strategy to eliminate the need for this sensor by using a mathematical tool which gives a good estimation of the vehicle slip angle. A single-track car model, coupled with an extended Kalman filter, was used in order to achieve the result. Moreover, a tuning procedure is proposed that takes into consideration both nonlinear and saturation characteristics typical of vehicle lateral dynamics. The effectiveness of the proposed algorithm has been proven by both simulation results and real-world data.
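
    As a rough illustration of the estimation idea (a much-reduced sketch, not the paper's single-track extended Kalman filter), a scalar Kalman filter on lateral velocity can be written down; the process input, the pseudo-measurement, and the noise covariances are all assumed values:

```python
import math

def kalman_step(x, P, u, z, dt, q=0.05, r_noise=0.4):
    """One predict/update cycle of a scalar Kalman filter on the
    lateral velocity v_y.

    u : rate of change of v_y from sensors, a_y - yaw_rate * v_x
    z : pseudo-measurement of v_y (e.g. from a steady-state tyre
        model; hypothetical here)
    """
    # Predict: integrate the kinematic relation v_y' = a_y - r * v_x
    x_pred = x + u * dt
    P_pred = P + q
    # Update: blend prediction and measurement via the Kalman gain
    K = P_pred / (P_pred + r_noise)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

def slip_angle(v_y, v_x):
    """Vehicle slip angle beta = atan(v_y / v_x), in radians."""
    return math.atan2(v_y, v_x)

# Toy run with constant inputs: the filter converges to the measurement
x, P = 0.0, 1.0
for _ in range(50):
    x, P = kalman_step(x, P, u=0.0, z=0.5, dt=0.01)
beta = slip_angle(x, v_x=20.0)
```

    The real estimator replaces the scalar state with the full nonlinear single-track model, which is where the extended (linearizing) form of the filter and the saturation-aware tuning come in.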

  17. Using Clinical Factors and Mammographic Breast Density to Estimate Breast Cancer Risk: Development and Validation of a New Predictive Model

    Science.gov (United States)

    Tice, Jeffrey A.; Cummings, Steven R.; Smith-Bindman, Rebecca; Ichikawa, Laura; Barlow, William E.; Kerlikowske, Karla

    2009-01-01

    Background Current models for assessing breast cancer risk are complex and do not include breast density, a strong risk factor for breast cancer that is routinely reported with mammography. Objective To develop and validate an easy-to-use breast cancer risk prediction model that includes breast density. Design Empirical model based on Surveillance, Epidemiology, and End Results incidence, and relative hazards from a prospective cohort. Setting Screening mammography sites participating in the Breast Cancer Surveillance Consortium. Patients 1 095 484 women undergoing mammography who had no previous diagnosis of breast cancer. Measurements Self-reported age, race or ethnicity, family history of breast cancer, and history of breast biopsy. Community radiologists rated breast density by using 4 Breast Imaging Reporting and Data System categories. Results During 5.3 years of follow-up, invasive breast cancer was diagnosed in 14 766 women. The breast density model was well calibrated overall (expected–observed ratio, 1.03 [95% CI, 0.99 to 1.06]) and in racial and ethnic subgroups. It had modest discriminatory accuracy (concordance index, 0.66 [CI, 0.65 to 0.67]). Women with low-density mammograms had 5-year risks less than 1.67% unless they had a family history of breast cancer and were older than age 65 years. Limitation The model has only modest ability to discriminate between women who will develop breast cancer and those who will not. Conclusion A breast cancer prediction model that incorporates routinely reported measures of breast density can estimate 5-year risk for invasive breast cancer. Its accuracy needs to be further evaluated in independent populations before it can be recommended for clinical use. PMID:18316752

  18. Five-Factor Model personality disorder prototypes: a review of their development, validity, and comparison to alternative approaches.

    Science.gov (United States)

    Miller, Joshua D

    2012-12-01

    In this article, the development of Five-Factor Model (FFM) personality disorder (PD) prototypes for the assessment of DSM-IV PDs is reviewed, as well as subsequent procedures for scoring individuals' FFM data with regard to these PD prototypes, including similarity scores and simple additive counts that are based on a quantitative prototype matching methodology. Both techniques, which result in very strongly correlated scores, demonstrate convergent and discriminant validity, and provide clinically useful information with regard to various forms of functioning. The techniques described here for use with FFM data are quite different from the prototype matching methods used elsewhere. © 2012 The Author. Journal of Personality © 2012, Wiley Periodicals, Inc.

  19. Transient simulation of an endothermic chemical process facility coupled to a high temperature reactor: Model development and validation

    International Nuclear Information System (INIS)

    Brown, Nicholas R.; Seker, Volkan; Revankar, Shripad T.; Downar, Thomas J.

    2012-01-01

    Highlights: ► Models for PBMR and thermochemical sulfur cycle based hydrogen plant are developed. ► Models are validated against available data in the literature. ► A transient in the coupled reactor and hydrogen plant system is studied. ► For a loss-of-heat-sink accident, temperature feedback within the reactor core enables shutdown of the reactor. - Abstract: A high temperature reactor (HTR) is a candidate to drive high temperature water-splitting using process heat. While both high temperature nuclear reactors and hydrogen generation plants have individually reached high degrees of development, study of the coupled plant is lacking. Particularly absent are considerations of the transient behavior of the coupled plant, as well as studies of the safety of the overall plant. The aim of this document is to contribute knowledge to the effort of nuclear hydrogen generation. In particular, this study concerns the identification of safety issues in the coupled plant and the transient modeling of some leading candidates for implementation in the Nuclear Hydrogen Initiative (NHI). The Sulfur Iodine (SI) and Hybrid Sulfur (HyS) cycles are considered as candidate hydrogen generation schemes. Three thermodynamically derived chemical reaction chamber models are coupled to a well-known reference design of a high temperature nuclear reactor. These chemical reaction chamber models have several dimensions of validation, including detailed steady state flowsheets, integrated loop test data, and bench scale chemical kinetics. The models and coupling scheme are presented here, as well as a transient test case initiated within the chemical plant. The 50% feed flow failure within the chemical plant results in a slow loss-of-heat-sink (LOHS) accident in the nuclear reactor. Due to the temperature feedback within the reactor core, the nuclear reactor partially shuts down over 1500 s. Two distinct regions are identified within the coupled plant response: (1) immediate LOHS due to the loss of the sulfuric

  20. Toward a Psychological Study of Class Consciousness: Development and Validation of a Social Psychological Model

    Directory of Open Access Journals (Sweden)

    Lucas A. Keefer

    2015-12-01

    Full Text Available While social class has recently become a prominent topic in social psychological research, much of this effort has focused on the psychological consequences of objective and subjective indices of class (e.g., income, perceived status). This approach sheds light on the consequences of social class itself, but overlooks a construct of central importance in earlier theorizing on class: class consciousness, or the extent to which individuals acknowledge and situate themselves within class relations. The current paper offers a psychological model of class consciousness comprising five elements: awareness of social class, perceptions of class conflict, beliefs about the permeability of class groups, identification with a class group, and personal experience of being treated as a member of one's class. We offer a measure assessing these central dimensions and assess differences in them by age, gender, indices of social class, political ideology, and among different class groups. Finally, we offer suggestions for how an awareness of class consciousness may enrich social psychology and ultimately foster political change.

  1. Heat and mass transport during microwave heating of mashed potato in domestic oven--model development, validation, and sensitivity analysis.

    Science.gov (United States)

    Chen, Jiajia; Pitchai, Krishnamoorthy; Birla, Sohan; Negahban, Mehrdad; Jones, David; Subbiah, Jeyamkondan

    2014-10-01

    A 3-dimensional finite-element model coupling electromagnetics and heat and mass transfer was developed to understand the interactions between microwaves and fresh mashed potato in a 500 mL tray. The model was validated by heating mashed potato from 25 °C on a rotating turntable in a microwave oven, rated at 1200 W, for 3 min. The simulated spatial temperature profiles on the top and bottom layers of the mashed potato showed hot and cold spots similar to those in thermal images acquired by an infrared camera. Transient temperature profiles at 6 locations collected by fiber-optic sensors showed good agreement with predicted results, with root mean square errors ranging from 1.6 to 11.7 °C. The predicted total moisture loss matched the observed result well. Several input parameters, such as the evaporation rate constant, the intrinsic permeability of water and gas, and the diffusion coefficients of water and gas, are not readily available for mashed potato and cannot be easily measured experimentally. Reported values for raw potato were used as baseline values. The sensitivity of the temperature profiles and total moisture loss to these input parameters was evaluated by varying each parameter to 10% and 1000% of its baseline value. The sensitivity analysis showed that the gas diffusion coefficient, intrinsic water permeability, and evaporation rate constant greatly influenced the predicted temperature and total moisture loss, while the intrinsic gas permeability and water diffusion coefficient had little influence. This model can be used by food product developers to understand microwave heating of food products spatially and temporally. This tool will allow food product developers to design food package systems that heat more uniformly in various microwave ovens. The sensitivity analysis of this study will help determine the most significant parameters that need to be measured accurately for reliable

  2. Validation of models with multivariate output

    International Nuclear Information System (INIS)

    Rebba, Ramesh; Mahadevan, Sankaran

    2006-01-01

    This paper develops metrics for validating computational models with experimental data, considering uncertainties in both. A computational model may generate multiple response quantities, and the validation experiment might yield corresponding measured values. Alternatively, a single response quantity may be predicted and observed at different spatial and temporal points. Model validation in such cases involves comparison of multiple correlated quantities. Multiple univariate comparisons may give conflicting inferences. Therefore, aggregate validation metrics are developed in this paper. Both classical and Bayesian hypothesis testing are investigated for this purpose, using multivariate analysis. Since commonly used statistical significance tests are based on normality assumptions, appropriate transformations are investigated in the case of non-normal data. The methodology is implemented to validate an empirical model for energy dissipation in lap joints under dynamic loading
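
    A common aggregate statistic for such multivariate comparisons is the Mahalanobis distance between the model's prediction vector and the experimental mean, which accounts for correlation between response quantities. A minimal 2-D sketch follows, with invented numbers (the paper's own metrics may differ):

```python
def mahalanobis_sq(pred, obs_mean, obs_cov):
    """Squared Mahalanobis distance between a model's 2-D prediction
    vector and the experimental mean, given the observation covariance.
    """
    d = [p - m for p, m in zip(pred, obs_mean)]
    # Invert the 2x2 covariance matrix analytically
    (a, b), (c, e) = obs_cov
    det = a * e - b * c
    inv = [[e / det, -b / det], [-c / det, a / det]]
    # d^T * Sigma^{-1} * d
    return sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))

d2 = mahalanobis_sq(pred=[1.2, 0.8], obs_mean=[1.0, 1.0],
                    obs_cov=[[0.04, 0.0], [0.0, 0.04]])
# Compare d2 against a chi-square threshold with 2 degrees of freedom
# to accept or reject the model at a chosen significance level.
```

    This is the classical-hypothesis-testing side of the comparison; the Bayesian route instead computes a ratio of evidence for the model against the same multivariate data.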

  3. Development and internal validation of a prognostic model to predict recurrence free survival in patients with adult granulosa cell tumors of the ovary

    NARCIS (Netherlands)

    van Meurs, Hannah S.; Schuit, Ewoud; Horlings, Hugo M.; van der Velden, Jacobus; van Driel, Willemien J.; Mol, Ben Willem J.; Kenter, Gemma G.; Buist, Marrije R.

    2014-01-01

    Models to predict the probability of recurrence free survival exist for various types of malignancies, but a model for recurrence free survival in individuals with an adult granulosa cell tumor (GCT) of the ovary is lacking. We aimed to develop and internally validate such a prognostic model. We

  4. Computerized prediction of intensive care unit discharge after cardiac surgery: development and validation of a Gaussian processes model

    Directory of Open Access Journals (Sweden)

    Meyfroidt Geert

    2011-10-01

    Full Text Available Abstract Background The intensive care unit (ICU) length of stay (LOS) of patients undergoing cardiac surgery may vary considerably, and is often difficult to predict within the first hours after admission. The early clinical evolution of a cardiac surgery patient might be predictive for his LOS. The purpose of the present study was to develop a predictive model for ICU discharge after non-emergency cardiac surgery, by analyzing the first 4 hours of data in the computerized medical record of these patients with Gaussian processes (GP), a machine learning technique. Methods Non-interventional study. Predictive modeling, separate development (n = 461) and validation (n = 499) cohorts. GP models were developed to predict the probability of ICU discharge the day after surgery (classification task), and to predict the day of ICU discharge as a discrete variable (regression task). GP predictions were compared with predictions by EuroSCORE, nurses and physicians. The classification task was evaluated using aROC for discrimination, and Brier Score, Brier Score Scaled, and the Hosmer-Lemeshow test for calibration. The regression task was evaluated by comparing median actual and predicted discharge, the loss penalty function (LPF) ((actual - predicted)/actual), and root mean squared relative errors (RMSRE). Results Median (P25-P75) ICU length of stay was 3 (2-5) days. For classification, the GP model showed an aROC of 0.758, which was significantly higher than the predictions by nurses, but not better than EuroSCORE and physicians. The GP had the best calibration, with a Brier Score of 0.179 and Hosmer-Lemeshow p-value of 0.382. For regression, GP had the highest proportion of patients with a correctly predicted day of discharge (40%), which was significantly better than the EuroSCORE (p Conclusions A GP model that uses PDMS data of the first 4 hours after admission in the ICU of scheduled adult cardiac surgery patients was able to predict discharge from the ICU as a

  5. Developing and validating a new precise risk-prediction model for new-onset hypertension: The Jichi Genki hypertension prediction model (JG model).

    Science.gov (United States)

    Kanegae, Hiroshi; Oikawa, Takamitsu; Suzuki, Kenji; Okawara, Yukie; Kario, Kazuomi

    2018-03-31

    No integrated risk assessment tools that include lifestyle factors and uric acid have been developed. In accordance with the Industrial Safety and Health Law in Japan, a follow-up examination of 63 495 normotensive individuals (mean age 42.8 years) who underwent a health checkup in 2010 was conducted every year for 5 years. The primary endpoint was new-onset hypertension (systolic blood pressure [SBP]/diastolic blood pressure [DBP] ≥ 140/90 mm Hg and/or the initiation of antihypertensive medications with self-reported hypertension). During the mean 3.4 years of follow-up, 7402 participants (11.7%) developed hypertension. The prediction model included age, sex, body mass index (BMI), SBP, DBP, low-density lipoprotein cholesterol, uric acid, proteinuria, current smoking, alcohol intake, eating rate, DBP by age, and BMI by age at baseline, and was created by using Cox proportional hazards models to calculate 3-year absolute risks. The derivation analysis confirmed that the model performed well with respect to both discrimination and calibration (n = 63 495; C-statistic = 0.885, 95% CI 0.865-0.903; χ² statistic = 13.6, degrees of freedom [df] = 7). In the external validation analysis, moreover, the model performed well in both its discrimination and calibration characteristics (n = 14 168; C-statistic = 0.846, 95% CI 0.775-0.905; χ² statistic = 8.7, df = 7). Adding LDL cholesterol, uric acid, proteinuria, alcohol intake, eating rate, and BMI by age to the base model yielded a significantly higher C-statistic, net reclassification improvement (NRI), and integrated discrimination improvement, especially the non-event NRI (NRI = 0.127, 95% CI 0.100-0.152; non-event NRI = 0.108, 95% CI 0.102-0.117). In conclusion, a highly precise model with good performance was developed for predicting incident hypertension using the new parameters of eating rate, uric acid, proteinuria, and BMI by age. ©2018 Wiley Periodicals, Inc.
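
    For a binary outcome, the C-statistic reported above is the fraction of event/non-event pairs in which the case who developed the event received the higher predicted risk (ties counting one half). A minimal sketch with toy scores, not the study cohort:

```python
from itertools import combinations

def c_statistic(scores, events):
    """Concordance (C-) statistic for a binary outcome.

    scores : predicted risks
    events : 1 if the outcome occurred, 0 otherwise
    """
    pairs = concordant = 0.0
    for i, j in combinations(range(len(scores)), 2):
        if events[i] == events[j]:
            continue  # only event vs non-event pairs are comparable
        pairs += 1
        higher = i if scores[i] > scores[j] else j
        if scores[i] == scores[j]:
            concordant += 0.5  # tied scores count half
        elif events[higher] == 1:
            concordant += 1
    return concordant / pairs

c = c_statistic(scores=[0.9, 0.8, 0.3, 0.2], events=[1, 0, 1, 0])
```

    A value of 0.5 is random discrimination and 1.0 is perfect; for censored survival data, as in a Cox model, the comparable-pair definition is extended to account for follow-up times.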

  6. Validation through model testing

    International Nuclear Information System (INIS)

    1995-01-01

    Geoval-94 is the third Geoval symposium arranged jointly by the OECD/NEA and the Swedish Nuclear Power Inspectorate. Earlier symposia in this series took place in 1987 and 1990. In many countries, the ongoing programmes to site and construct deep geological repositories for high and intermediate level nuclear waste are close to realization. A number of studies demonstrate the potential barrier function of the geosphere, but also that there are many unresolved issues. A key to these problems is the possibility of gaining knowledge through model testing with experiments, and of increasing confidence in the models used for prediction. The sessions cover conclusions from the INTRAVAL project, experiences from integrated experimental programmes and underground research laboratories, as well as the integration between performance assessment and site characterisation. Technical issues ranging from waste and buffer interactions with the rock to radionuclide migration in different geological media are addressed. (J.S.)

  7. The German cervical cancer screening model: development and validation of a decision-analytic model for cervical cancer screening in Germany.

    Science.gov (United States)

    Siebert, Uwe; Sroczynski, Gaby; Hillemanns, Peter; Engel, Jutta; Stabenow, Roland; Stegmaier, Christa; Voigt, Kerstin; Gibis, Bernhard; Hölzel, Dieter; Goldie, Sue J

    2006-04-01

    We sought to develop and validate a decision-analytic model for the natural history of cervical cancer for the German health care context and to apply it to cervical cancer screening. We developed a Markov model for the natural history of cervical cancer and cervical cancer screening in the German health care context. The model reflects current German practice standards for screening, diagnostic follow-up and treatment regarding cervical cancer and its precursors. Data for disease progression and cervical cancer survival were obtained from the literature and German cancer registries. Accuracy of Papanicolaou (Pap) testing was based on meta-analyses. We performed internal and external model validation using observed epidemiological data for unscreened women from different German cancer registries. The model predicts life expectancy, incidence of detected cervical cancer cases, lifetime cervical cancer risks and mortality. The model predicted a lifetime cervical cancer risk of 3.0% and a lifetime cervical cancer mortality of 1.0%, with a peak cancer incidence of 84/100,000 at age 51 years. These results were similar to observed data from German cancer registries, German literature data and results from other international models. Based on our model, annual Pap screening could prevent 98.7% of diagnosed cancer cases and 99.6% of deaths due to cervical cancer in women completely adherent to screening and compliant to treatment. Extending the screening interval from 1 year to 2, 3 or 5 years resulted in reduced screening effectiveness. This model provides a tool for evaluating the long-term effectiveness of different cervical cancer screening tests and strategies.
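
    The natural-history core of such a decision-analytic model is a Markov cohort simulation: a state distribution advanced cycle by cycle through a transition matrix. A toy three-state sketch follows; the states and transition probabilities are invented for illustration and are not the German model's parameters:

```python
def run_markov(trans, start, cycles):
    """Advance a Markov cohort model through `cycles` one-year steps.

    trans : row-stochastic transition matrix between health states
    start : initial distribution of the cohort over the states
    """
    dist = list(start)
    for _ in range(cycles):
        # New share of state j = sum over i of (share in i) * P(i -> j)
        dist = [sum(dist[i] * trans[i][j] for i in range(len(dist)))
                for j in range(len(dist))]
    return dist

trans = [[0.98, 0.02, 0.00],   # well -> well / precursor / cancer
         [0.10, 0.85, 0.05],   # precursor lesions can regress or progress
         [0.00, 0.00, 1.00]]   # cancer treated as absorbing here
dist = run_markov(trans, start=[1.0, 0.0, 0.0], cycles=10)
```

    Screening strategies are then modeled by adding transitions for detection and treatment, and varying the cycle at which a Pap test (with its meta-analytic sensitivity and specificity) intervenes.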

  8. Polytomous diagnosis of ovarian tumors as benign, borderline, primary invasive or metastatic: development and validation of standard and kernel-based risk prediction models

    Directory of Open Access Journals (Sweden)

    Testa Antonia C

    2010-10-01

    Background: Hitherto, risk prediction models for preoperative ultrasound-based diagnosis of ovarian tumors were dichotomous (benign versus malignant). We develop and validate polytomous models (models that predict more than two events) to diagnose ovarian tumors as benign, borderline, primary invasive or metastatic invasive. The main focus is on how different types of models perform and compare. Methods: A multi-center dataset containing 1066 women was used for model development and internal validation, whilst another multi-center dataset of 1938 women was used for temporal and external validation. Models were based on standard logistic regression and on penalized kernel-based algorithms (least squares support vector machines and kernel logistic regression). We used true polytomous models as well as combinations of dichotomous models based on the 'pairwise coupling' technique to produce polytomous risk estimates. Careful variable selection was performed, based largely on cross-validated c-index estimates. Model performance was assessed with the dichotomous c-index (i.e. the area under the ROC curve) and a polytomous extension, and with calibration graphs. Results: For all models, between 9 and 11 predictors were selected. Internal validation was successful, with polytomous c-indexes between 0.64 and 0.69. For the best model, dichotomous c-indexes were between 0.73 (primary invasive vs metastatic) and 0.96 (borderline vs metastatic). On temporal and external validation, overall discrimination performance was good, with polytomous c-indexes between 0.57 and 0.64. However, discrimination between primary and metastatic invasive tumors decreased to near random levels. Standard logistic regression performed well in comparison with advanced algorithms, and combining dichotomous models performed well in comparison with true polytomous models. The best model was a combination of dichotomous logistic regression models. This model is available online.
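The 'pairwise coupling' step mentioned in this record — turning several dichotomous risk estimates into one polytomous distribution — can be illustrated with one simple inversion-and-normalize coupling rule (a sketch; other coupling schemes exist, and the probabilities used here are illustrative):

```python
def pairwise_coupling(r):
    """Combine pairwise conditional probabilities r[i][j] = P(class i | i or j)
    into one polytomous distribution, using a simple inversion-and-normalize
    coupling rule. Diagonal entries of r are ignored."""
    K = len(r)
    raw = []
    for i in range(K):
        s = sum(1.0 / r[i][j] for j in range(K) if j != i)
        raw.append(1.0 / (s - (K - 2)))
    total = sum(raw)
    return [x / total for x in raw]
```

When the pairwise estimates are mutually consistent (r[i][j] = p_i / (p_i + p_j) for some distribution p), this rule recovers p exactly; with real, noisy dichotomous models it gives an approximate reconciliation.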

  9. Development and validation of a novel predictive scoring model for microvascular invasion in patients with hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Zhao, Hui; Hua, Ye; Dai, Tu; He, Jian; Tang, Min; Fu, Xu; Mao, Liang; Jin, Huihan; Qiu, Yudong

    2017-01-01

    Highlights: • This study aimed to establish a novel predictive scoring model of MVI in HCC patients. • Preoperative imaging features on CECT, such as intratumoral arteries, non-nodular type of HCC and absence of radiological tumor capsule, were independent predictors for MVI. • The predictive scoring model is of great value in prediction of MVI regardless of tumor size. - Abstract: Purpose: Microvascular invasion (MVI) in patients with hepatocellular carcinoma (HCC) cannot be accurately predicted preoperatively. This study aimed to establish a predictive scoring model of MVI in solitary HCC patients without macroscopic vascular invasion. Methods: A total of 309 consecutive HCC patients who underwent curative hepatectomy were divided into derivation (n = 206) and validation (n = 103) cohorts. A predictive scoring model of MVI was established according to the valuable predictors in the derivation cohort based on multivariate logistic regression analysis. The performance of the predictive model was evaluated in the derivation and validation cohorts. Results: Preoperative imaging features on CECT, such as intratumoral arteries, non-nodular type of HCC and absence of radiological tumor capsule, were independent predictors for MVI. The predictive scoring model was established according to the β coefficients of the 3 predictors. The area under the receiver operating characteristic curve (AUROC) of the predictive scoring model was 0.872 (95% CI, 0.817-0.928) and 0.856 (95% CI, 0.771-0.940) in the derivation and validation cohorts, respectively. The positive and negative predictive values were 76.5% and 88.0% in the derivation cohort and 74.4% and 88.3% in the validation cohort. The performance of the model was similar between the patients with tumor size ≤5 cm and >5 cm in AUROC (P = 0.910). Conclusions: The predictive scoring model based on intratumoral arteries, non-nodular type of HCC, and absence of the radiological tumor capsule on preoperative CECT is of great value in the prediction of MVI.
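A β-coefficient scoring model of the kind described here, together with its AUROC evaluation, can be sketched as follows. The coefficients and intercept are hypothetical, not the published model's values:

```python
import math

# Hypothetical logistic coefficients for three binary CECT predictors
# (illustrative values only, not those of the published scoring model).
BETA = {"intratumoral_arteries": 1.4, "non_nodular_type": 1.1, "no_tumor_capsule": 0.9}
INTERCEPT = -2.5

def mvi_probability(features):
    """Predicted MVI probability from binary (0/1) imaging features."""
    z = INTERCEPT + sum(BETA[k] * features[k] for k in BETA)
    return 1.0 / (1.0 + math.exp(-z))

def auroc(scores, labels):
    """Rank-based AUROC: probability that a random positive case scores
    higher than a random negative case (ties count one half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

In practice the β coefficients are rounded into integer points to make a bedside score; the AUROC of the score is then computed on derivation and validation cohorts separately, as in the record above.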

  10. Development and validation of a novel predictive scoring model for microvascular invasion in patients with hepatocellular carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Hui [Department of Hepatopancreatobiliary Surgery, Nanjing Drum Tower Hospital Clinical College of Nanjing Medical University, Nanjing, Jiangsu (China); Department of Hepatopancreatobiliary Surgery, Nanjing Medical University Affiliated Wuxi Second People's Hospital, Wuxi, Jiangsu (China); Hua, Ye [Department of Neurology, Nanjing Medical University Affiliated Wuxi Second People's Hospital, Wuxi, Jiangsu (China); Dai, Tu [Department of Hepatopancreatobiliary Surgery, Nanjing Medical University Affiliated Wuxi Second People's Hospital, Wuxi, Jiangsu (China); He, Jian; Tang, Min [Department of Radiology, Drum Tower Hospital, Medical School of Nanjing University, Nanjing, Jiangsu (China); Fu, Xu; Mao, Liang [Department of Hepatopancreatobiliary Surgery, Nanjing Drum Tower Hospital Clinical College of Nanjing Medical University, Nanjing, Jiangsu (China); Jin, Huihan, E-mail: 45687061@qq.com [Department of Hepatopancreatobiliary Surgery, Nanjing Medical University Affiliated Wuxi Second People's Hospital, Wuxi, Jiangsu (China); Qiu, Yudong, E-mail: yudongqiu510@163.com [Department of Hepatopancreatobiliary Surgery, Nanjing Drum Tower Hospital Clinical College of Nanjing Medical University, Nanjing, Jiangsu (China)

    2017-03-15

    Highlights: • This study aimed to establish a novel predictive scoring model of MVI in HCC patients. • Preoperative imaging features on CECT, such as intratumoral arteries, non-nodular type of HCC and absence of radiological tumor capsule, were independent predictors for MVI. • The predictive scoring model is of great value in prediction of MVI regardless of tumor size. - Abstract: Purpose: Microvascular invasion (MVI) in patients with hepatocellular carcinoma (HCC) cannot be accurately predicted preoperatively. This study aimed to establish a predictive scoring model of MVI in solitary HCC patients without macroscopic vascular invasion. Methods: A total of 309 consecutive HCC patients who underwent curative hepatectomy were divided into derivation (n = 206) and validation (n = 103) cohorts. A predictive scoring model of MVI was established according to the valuable predictors in the derivation cohort based on multivariate logistic regression analysis. The performance of the predictive model was evaluated in the derivation and validation cohorts. Results: Preoperative imaging features on CECT, such as intratumoral arteries, non-nodular type of HCC and absence of radiological tumor capsule, were independent predictors for MVI. The predictive scoring model was established according to the β coefficients of the 3 predictors. The area under the receiver operating characteristic curve (AUROC) of the predictive scoring model was 0.872 (95% CI, 0.817-0.928) and 0.856 (95% CI, 0.771-0.940) in the derivation and validation cohorts, respectively. The positive and negative predictive values were 76.5% and 88.0% in the derivation cohort and 74.4% and 88.3% in the validation cohort. The performance of the model was similar between the patients with tumor size ≤5 cm and >5 cm in AUROC (P = 0.910). Conclusions: The predictive scoring model based on intratumoral arteries, non-nodular type of HCC, and absence of the radiological tumor capsule on preoperative CECT is of great value in the prediction of MVI.

  11. Development and validation of a wear model for the analysis of the wheel profile evolution in railway vehicles

    Science.gov (United States)

    Auciello, J.; Ignesti, M.; Malvezzi, M.; Meli, E.; Rindi, A.

    2012-11-01

    The numerical wheel wear prediction in railway applications is of great importance for different aspects, such as the safety against vehicle instability and derailment, the planning of wheelset maintenance interventions and the design of an optimal wheel profile from the wear point of view. For these reasons, this paper presents a complete model aimed at the evaluation of the wheel wear and the wheel profile evolution by means of dynamic simulations, organised in two mutually interacting parts: a vehicle dynamic model and a model for the wear estimation. The first is a 3D multibody model of a railway vehicle implemented in SIMPACK™, a commercial software for the analysis of mechanical systems, where the wheel-rail interaction is entrusted to a C/C++ user routine external to SIMPACK, in which the global contact model is implemented. In this regard, the search for the contact points between the wheel and the rail is based on an innovative algorithm developed by the authors in previous works, while normal and tangential forces in the contact patches are calculated according to Hertz's theory and Kalker's global theory, respectively. Due to the numerical efficiency of the global contact model, the multibody vehicle and the contact model interact directly online during the dynamic simulations. The second is the wear model, written in the MATLAB® environment, mainly based on an experimental relationship between the frictional power developed at the wheel-rail interface and the amount of material removed by wear. Starting from a few outputs of the multibody simulations (position of contact points, contact forces and rigid creepages), it evaluates the local variables, such as the contact pressures and local creepages, using a local contact model (Kalker's FASTSIM algorithm). These data are then passed to another subsystem which evaluates, by means of the considered experimental relationship, both the material to be removed and its distribution along
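The experimental wear relationship at the core of such a model — material removed proportional to the frictional power developed at the wheel-rail interface — can be sketched under strong simplifications (a single contact patch, uniform removal, and a hypothetical wear coefficient; the actual model works with local pressures and creepages from FASTSIM):

```python
def wear_depth_increment(tangential_force_N, creepage, speed_mps, dt_s,
                         contact_area_m2, wear_rate_kg_per_J, density_kg_m3):
    """Depth of material removed from one contact patch in one time step,
    assuming the removed mass is proportional to the frictional work
    (the proportionality constant here is a hypothetical placeholder)."""
    frictional_power_W = abs(tangential_force_N * creepage * speed_mps)
    removed_mass_kg = wear_rate_kg_per_J * frictional_power_W * dt_s
    removed_volume_m3 = removed_mass_kg / density_kg_m3
    return removed_volume_m3 / contact_area_m2  # uniform depth over the patch
```

Accumulating these increments over a simulated mileage, and redistributing them along the wheel profile, yields the profile evolution the record describes.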

  12. The development, validation and initial results of an integrated model for determining the environmental sustainability of biogas production pathways

    NARCIS (Netherlands)

    Pierie, Frank; van Someren, Christian; Benders, René M.J.; Bekkering, Jan; van Gemert, Wim; Moll, Henri C.

    2016-01-01

    Biogas produced through Anaerobic Digestion can be seen as a flexible and storable energy carrier. However, the environmental sustainability and efficiency of biogas production is not fully understood. Within this article the use, operation, structure, validation, and results of a model for the

  13. Development and validation of a new LBM-MRT hybrid model with enthalpy formulation for melting with natural convection

    Energy Technology Data Exchange (ETDEWEB)

    Miranda Fuentes, Johann [Université de Lyon, CNRS, UMR5008, F-69622 Villeurbanne (France); INSA-Lyon, CETHIL, F-69621 Villeurbanne (France); Kuznik, Frédéric, E-mail: frederic.kuznik@insa-lyon.fr [Université de Lyon, CNRS, UMR5008, F-69622 Villeurbanne (France); INSA-Lyon, CETHIL, F-69621 Villeurbanne (France); Johannes, Kévyn; Virgone, Joseph [Université de Lyon, CNRS, UMR5008, F-69622 Villeurbanne (France); Université Lyon 1, CETHIL, F-69622 Villeurbanne (France)

    2014-01-17

    This article presents a new model to simulate melting with natural convection of a phase change material. For the phase change problem, the enthalpy formulation is used. The energy equation is solved by a finite difference method, whereas the fluid flow is solved by the multiple relaxation time (MRT) lattice Boltzmann method. The model is first verified and validated using data from the literature. Then, the model is applied to a tall brick filled with a fatty acid eutectic mixture and the results are presented. The main results are that (1) the spatial convergence rate is of second order, (2) the new model is validated against data from the literature and (3) natural convection plays an important role in the melting process of the fatty acid mixture considered in our work.
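The enthalpy formulation for phase change can be illustrated with a minimal 1D, conduction-only explicit step (no lattice Boltzmann flow solver; properties are hypothetical). Temperatures are measured relative to the melting point, and the latent-heat plateau 0 ≤ H ≤ ρL pins the temperature at T_melt:

```python
import numpy as np

def enthalpy_step(H, dx, dt, k, rho, cp, L, T_melt=0.0):
    """Advance volumetric enthalpy H (J/m^3, zero = solid at T_melt) one
    explicit conduction step; 0 <= H <= rho*L is the mushy (melting) zone."""
    # Temperature from enthalpy: sensible heat outside the latent plateau.
    T = np.where(H < 0.0, T_melt + H / (rho * cp),
                 np.where(H > rho * L, T_melt + (H - rho * L) / (rho * cp),
                          T_melt))
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2  # interior nodes only
    return H + dt * k * lap

# Local liquid fraction at any time: np.clip(H / (rho * L), 0.0, 1.0)
```

In the paper's hybrid scheme the same enthalpy bookkeeping is kept, but advection is handled by the MRT lattice Boltzmann flow field rather than pure conduction.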

  14. Development and Validity Testing of Belief Measurement Model in Buddhism for Junior High School Students at Chiang Rai Buddhist Scripture School: An Application for Multitrait-Multimethod Analysis

    Science.gov (United States)

    Chaidi, Thirachai; Damrongpanich, Sunthorapot

    2016-01-01

    The purposes of this study were to develop a model to measure the belief in Buddhism of junior high school students at Chiang Rai Buddhist Scripture School, and to determine construct validity of the model for measuring the belief in Buddhism by using Multitrait-Multimethod analysis. The samples were 590 junior high school students at Buddhist…

  15. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Science.gov (United States)

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.
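The likelihood-ratio framework used to validate Kongoh can be sketched generically: per-locus LRs comparing Hp ("the POI is a contributor") against Hd ("an unknown person is") are multiplied across independent STR loci, and the result is often mapped to a verbal scale. The thresholds below are illustrative, not a forensic standard:

```python
from math import prod

def combined_lr(per_locus_lrs):
    """Overall likelihood ratio across independent STR loci (product rule)."""
    return prod(per_locus_lrs)

def verbal_scale(lr):
    """Map an LR to a verbal equivalent (thresholds are illustrative only)."""
    if lr >= 1e6:
        return "very strong support for Hp"
    if lr >= 1e4:
        return "strong support for Hp"
    if lr > 1:
        return "support for Hp"
    if lr == 1:
        return "neutral"
    return "support for Hd"
```

A quantitative continuous model such as Kongoh's computes each per-locus LR from peak heights, drop-out and artifact probabilities rather than from allele presence alone.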

  16. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Directory of Open Access Journals (Sweden)

    Sho Manabe

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.

  17. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  18. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached WP. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  19. Development of a tool dedicated to the evaluation of hydrogen term source for technological Wastes: assumptions, physical models, and validation

    Energy Technology Data Exchange (ETDEWEB)

    Lamouroux, C. [CEA Saclay, Nuclear Energy Division /DANS, Department of physico-chemistry, 91191 Gif sur yvette (France); Esnouf, S. [CEA Saclay, DSM/IRAMIS/SIS2M/Radiolysis Laboratory, 91191 Gif sur yvette (France); Cochin, F. [Areva NC, recycling BU, DIRP/RDP tour Areva, 92084 Paris La Defense (France)

    2013-07-01

    In radioactive waste packages, hydrogen is generated, on the one hand, from the radiolysis of the wastes (mainly organic materials) and, on the other hand, from the radiolysis of the water contained in the cement matrix. In order to assess hydrogen generation, two tools based on operational models have been developed. One is dedicated to the determination of the hydrogen source term arising from the radiolysis of the wastes: the STORAGE tool (Simulation Tool Of Emission Radiolysis Gas); the other deals with the hydrogen source term produced by radiolysis of the cement matrices (the Damar tool). The approach used by the STORAGE tool for assessing the production rate of radiolysis gases is divided into five steps: 1) Specification of the package data, in particular inventories and radiological materials defined for a package medium; 2) Determination of radiochemical yields for the different constituents and the associated behavior laws; this determination of radiochemical yields is made from the PRELOG database, in which radiochemical yields under different irradiation conditions have been compiled; 3) Definition of hypotheses concerning the composition and the distribution of contamination inside the package to allow assessment of the power absorbed by the constituents; 4) Sum-up of all the contributions; and finally, 5) Validation calculations by comparison with a reduced sampling of packages. Comparisons with measured values confirm the conservative character of the methodology and give confidence in the safety margins used in safety analysis reports.
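The core radiolysis bookkeeping in such tools — a radiochemical yield (G value) multiplied by the power absorbed by a constituent gives a gas production rate — can be sketched as follows (the physical constants are standard; the function itself is an illustrative assumption, not the STORAGE implementation):

```python
AVOGADRO = 6.022e23       # molecules per mole
EV_PER_JOULE = 6.242e18   # electron-volts per joule

def h2_production_rate(g_value, absorbed_power_W):
    """Hydrogen production rate in mol/s, from a radiochemical yield
    (G value, molecules per 100 eV absorbed) and the radiolytic power
    absorbed by the constituent (W = J/s)."""
    ev_per_second = absorbed_power_W * EV_PER_JOULE
    molecules_per_second = g_value * ev_per_second / 100.0
    return molecules_per_second / AVOGADRO
```

Summing such rates over all constituents, each with its own G value and absorbed power fraction, corresponds to step 4 ("sum-up of all the contributions") above.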

  20. Developing a model of competence in the operating theatre: psychometric validation of the perceived perioperative competence scale-revised.

    Science.gov (United States)

    Gillespie, Brigid M; Polit, Denise F; Hamlin, Lois; Chaboyer, Wendy

    2012-01-01

    This paper describes the development and validation of the Perceived Perioperative Competence Scale - Revised (PPCS-R). There is a lack of psychometrically sound self-assessment tools to measure nurses' perceived competence in the operating room. Content validity was established by a panel of international experts, and the original 98-item scale was pilot tested with 345 nurses in Queensland, Australia. Following the removal of several items, a national sample that included all 3209 nurses who were members of the Australian College of Operating Room Nurses was surveyed using the 94-item version. Psychometric testing assessed the factor structure using exploratory factor analysis, internal consistency using Cronbach's alpha, and construct validity using the "known groups" technique. During item reduction, several preliminary factor analyses were performed on two random halves of the sample (n = 550). Usable data for psychometric assessment were obtained from 1122 nurses. The original 94-item scale was reduced to 40 items. The final factor analysis using the entire sample resulted in a 40-item, six-factor solution. Cronbach's alpha for the 40-item scale was .96. Construct validation demonstrated significant differences in perceived competence scores relative to years of operating room experience and receipt of specialty education. On the basis of these results, the psychometric properties of the PPCS-R were considered encouraging. Further testing of the tool in different samples of operating room nurses is necessary to enable cross-cultural comparisons. Copyright © 2011 Elsevier Ltd. All rights reserved.
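Internal consistency via Cronbach's alpha, as reported for the PPCS-R, has a compact textbook definition that can be computed directly (sample variances with an n−1 denominator; the function is a generic sketch, not the study's analysis code):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha from per-item score lists (same respondent order):
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = len(item_scores)
    n = len(item_scores[0])

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - sum(sample_var(it) for it in item_scores)
                          / sample_var(totals))
```

An alpha of .96, as reported above, indicates that the 40 items covary very strongly; values that high can also signal item redundancy, which is one reason item reduction accompanies such analyses.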

  1. Development and validation of a simple model for cellular and cell cluster dosimetry with practical application in targeted radionuclide therapy

    International Nuclear Information System (INIS)

    Bardies, M.; Myers, M.J.

    1992-01-01

    The authors have developed an analytical technique for calculating the mean absorbed dose to the cell nucleus from a variety of spatial distributions of cells and activities and a wide range of emitted energies and radionuclides. The dose to the nucleus has been calculated using this method from activity distributed (1) on the cell surface, (2) throughout the cytoplasm, (3) throughout a cluster of cells (micrometastasis) and (4) on the surface of the cluster of cells. The derived absorption factors have been based on the latest point kernels of Berger and have been validated against published estimates. They show good agreement, and the model has the advantage of being easily adapted for revisions and extensions of available low energy data. Data sets may be derived with the absorbed fractions or the absorbed dose per emission as a function of the radial extent of the activity, and either the individual energies of the emissions or the totality of the emissions from a particular radionuclide. The practical applications of the model have included: (a) calculation of the absorbed dose to radioimmuno-targeted micrometastases in the peritoneum; (b) calculations of doses to cells labelled on the surface with some novel emitters such as 67Cu, 177Lu, 153Sm, 111Ag, 186Re and 188Re, as well as 131I, 125I and 90Y; (c) comparison of doses to the cell nucleus from MIBG labelled with 125I and 131I and distributed in the cytoplasm of the cell; and (d) estimates of the absorbed dose to the cell nucleus from alpha emitters distributed on the surface of the cell.

  2. Development and validation of risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team.

    Science.gov (United States)

    Harrison, David A; Patel, Krishna; Nixon, Edel; Soar, Jasmeet; Smith, Gary B; Gwinnutt, Carl; Nolan, Jerry P; Rowan, Kathryn M

    2014-08-01

    The National Cardiac Arrest Audit (NCAA) is the UK national clinical audit for in-hospital cardiac arrest. To make fair comparisons among health care providers, clinical indicators require case mix adjustment using a validated risk model. The aim of this study was to develop and validate risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team in UK hospitals. Risk models for two outcomes, return of spontaneous circulation (ROSC) for greater than 20 min and survival to hospital discharge, were developed and validated using data for in-hospital cardiac arrests between April 2011 and March 2013. For each outcome, a full model was fitted and then simplified by testing for non-linearity, combining categories and stepwise reduction. Finally, interactions between predictors were considered. Models were assessed for discrimination, calibration and accuracy. 22,479 in-hospital cardiac arrests in 143 hospitals were included (14,688 development, 7791 validation). The final risk model for ROSC > 20 min included: age (non-linear), sex, prior length of stay in hospital, reason for attendance, location of arrest, presenting rhythm, and interactions between presenting rhythm and location of arrest. The model for hospital survival included the same predictors, excluding sex. Both models had acceptable performance across the range of measures, although discrimination for hospital mortality exceeded that for ROSC > 20 min (c index 0.81 versus 0.72). Validated risk models for ROSC > 20 min and hospital survival following in-hospital cardiac arrest have been developed. These models will strengthen comparative reporting in NCAA and support local quality improvement. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
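Calibration, one of the performance measures assessed for these risk models, is conventionally checked by comparing observed event counts against the sum of predicted risks within risk-ordered groups (deciles are customary). A minimal generic sketch (the function is ours, not the NCAA analysis code):

```python
def calibration_table(probs, outcomes, n_groups=10):
    """Observed vs expected event counts per risk group (sorted by predicted
    probability); close agreement in each row indicates good calibration."""
    pairs = sorted(zip(probs, outcomes))
    size = len(pairs) // n_groups
    table = []
    for g in range(n_groups):
        chunk = pairs[g * size:] if g == n_groups - 1 else pairs[g * size:(g + 1) * size]
        expected = sum(p for p, _ in chunk)   # sum of predicted risks
        observed = sum(y for _, y in chunk)   # actual event count
        table.append((expected, observed))
    return table
```

Discrimination (the c index quoted above) asks whether the model ranks patients correctly; calibration asks whether the predicted probabilities are numerically right, and a model can do well on one while failing the other.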

  3. Development and validation of risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team☆

    Science.gov (United States)

    Harrison, David A.; Patel, Krishna; Nixon, Edel; Soar, Jasmeet; Smith, Gary B.; Gwinnutt, Carl; Nolan, Jerry P.; Rowan, Kathryn M.

    2014-01-01

    Aim The National Cardiac Arrest Audit (NCAA) is the UK national clinical audit for in-hospital cardiac arrest. To make fair comparisons among health care providers, clinical indicators require case mix adjustment using a validated risk model. The aim of this study was to develop and validate risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team in UK hospitals. Methods Risk models for two outcomes—return of spontaneous circulation (ROSC) for greater than 20 min and survival to hospital discharge—were developed and validated using data for in-hospital cardiac arrests between April 2011 and March 2013. For each outcome, a full model was fitted and then simplified by testing for non-linearity, combining categories and stepwise reduction. Finally, interactions between predictors were considered. Models were assessed for discrimination, calibration and accuracy. Results 22,479 in-hospital cardiac arrests in 143 hospitals were included (14,688 development, 7791 validation). The final risk model for ROSC > 20 min included: age (non-linear), sex, prior length of stay in hospital, reason for attendance, location of arrest, presenting rhythm, and interactions between presenting rhythm and location of arrest. The model for hospital survival included the same predictors, excluding sex. Both models had acceptable performance across the range of measures, although discrimination for hospital mortality exceeded that for ROSC > 20 min (c index 0.81 versus 0.72). Conclusions Validated risk models for ROSC > 20 min and hospital survival following in-hospital cardiac arrest have been developed. These models will strengthen comparative reporting in NCAA and support local quality improvement. PMID:24830872

  4. Validation of asphalt mixture pavement skid prediction model and development of skid prediction model for surface treatments.

    Science.gov (United States)

    2017-04-01

    Pavement skid resistance is primarily a function of the surface texture, which includes both microtexture and macrotexture. Earlier, under the Texas Department of Transportation (TxDOT) Research Project 0-5627, the researchers developed a method to p...

  5. New Guideline for the Reporting of Studies Developing, Validating, or Updating a Multivariable Clinical Prediction Model : The TRIPOD Statement

    NARCIS (Netherlands)

    Moons, Karel G. M.; Altman, Douglas G.; Reitsma, Johannes B.; Collins, Gary S.

    Prediction models are developed to aid health care providers in estimating the probability that a specific outcome or disease is present (diagnostic prediction models) or will occur in the future (prognostic prediction models), to inform their decision making. Prognostic models here also include

  6. Development and validation of P-MODTRAN7 and P-MCScene, 1D and 3D polarimetric radiative transfer models

    Science.gov (United States)

    Hawes, Frederick T.; Berk, Alexander; Richtsmeier, Steven C.

    2016-05-01

    A validated, polarimetric 3-dimensional simulation capability, P-MCScene, is being developed by generalizing Spectral Sciences' Monte Carlo-based synthetic scene simulation model, MCScene, to include calculation of all 4 Stokes components. P-MCScene polarimetric optical databases will be generated by a new version (MODTRAN7) of the government-standard MODTRAN radiative transfer algorithm. The conversion of MODTRAN6 to a polarimetric model is being accomplished by (1) introducing polarimetric data, by (2) vectorizing the MODTRAN radiation calculations and by (3) integrating the newly revised and validated vector discrete ordinate model VDISORT3. Early results, presented here, demonstrate a clear pathway to the long-term goal of fully validated polarimetric models.

  7. Predicting Overall Survival After Stereotactic Ablative Radiation Therapy in Early-Stage Lung Cancer: Development and External Validation of the Amsterdam Prognostic Model

    Energy Technology Data Exchange (ETDEWEB)

    Louie, Alexander V., E-mail: Dr.alexlouie@gmail.com [Department of Radiation Oncology, VU University Medical Center, Amsterdam (Netherlands); Department of Radiation Oncology, London Regional Cancer Program, University of Western Ontario, London, Ontario (Canada); Department of Epidemiology, Harvard School of Public Health, Harvard University, Boston, Massachusetts (United States); Haasbeek, Cornelis J.A. [Department of Radiation Oncology, VU University Medical Center, Amsterdam (Netherlands); Mokhles, Sahar [Department of Cardio-Thoracic Surgery, Erasmus University Medical Center, Rotterdam (Netherlands); Rodrigues, George B. [Department of Radiation Oncology, London Regional Cancer Program, University of Western Ontario, London, Ontario (Canada); Stephans, Kevin L. [Department of Radiation Oncology, Taussig Cancer Institute, Cleveland Clinic, Cleveland, Ohio (United States); Lagerwaard, Frank J. [Department of Radiation Oncology, VU University Medical Center, Amsterdam (Netherlands); Palma, David A. [Department of Radiation Oncology, London Regional Cancer Program, University of Western Ontario, London, Ontario (Canada); Videtic, Gregory M.M. [Department of Radiation Oncology, Taussig Cancer Institute, Cleveland Clinic, Cleveland, Ohio (United States); Warner, Andrew [Department of Radiation Oncology, London Regional Cancer Program, University of Western Ontario, London, Ontario (Canada); Takkenberg, Johanna J.M. [Department of Cardio-Thoracic Surgery, Erasmus University Medical Center, Rotterdam (Netherlands); Reddy, Chandana A. [Department of Radiation Oncology, Taussig Cancer Institute, Cleveland Clinic, Cleveland, Ohio (United States); Maat, Alex P.W.M. [Department of Cardio-Thoracic Surgery, Erasmus University Medical Center, Rotterdam (Netherlands); Woody, Neil M. 
[Department of Radiation Oncology, Taussig Cancer Institute, Cleveland Clinic, Cleveland, Ohio (United States); Slotman, Ben J.; Senan, Suresh [Department of Radiation Oncology, VU University Medical Center, Amsterdam (Netherlands)]

    2015-09-01

    Purpose: A prognostic model for 5-year overall survival (OS), consisting of recursive partitioning analysis (RPA) and a nomogram, was developed for patients with early-stage non-small cell lung cancer (ES-NSCLC) treated with stereotactic ablative radiation therapy (SABR). Methods and Materials: A primary dataset of 703 ES-NSCLC SABR patients was randomly divided into a training (67%) and an internal validation (33%) dataset. In the former group, 21 unique parameters consisting of patient, treatment, and tumor factors were entered into an RPA model to predict OS. Univariate and multivariate models were constructed for RPA-selected factors to evaluate their relationship with OS. A nomogram for OS was constructed based on factors significant in multivariate modeling and validated with calibration plots. Both the RPA and the nomogram were externally validated in independent surgical (n=193) and SABR (n=543) datasets. Results: RPA identified 2 distinct risk classes based on tumor diameter, age, World Health Organization performance status (PS), and Charlson comorbidity index. This RPA had moderate discrimination in SABR datasets (c-index range: 0.52-0.60) but was of limited value in the surgical validation cohort. The nomogram predicting OS included smoking history in addition to the RPA-identified factors. In contrast to the RPA, the nomogram performed well in internal validation (r{sup 2}=0.97) and in the external SABR (r{sup 2}=0.79) and surgical (r{sup 2}=0.91) cohorts. Conclusions: The Amsterdam prognostic model is the first externally validated prognostication tool for OS in ES-NSCLC treated with SABR, and it can be used to individualize patient decision making. The nomogram retained strong performance across the surgical and SABR external validation datasets. RPA performance was poor in surgical patients, suggesting that 2 distinct patient populations are being treated with these 2 effective modalities.
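
    The c-index values quoted above (0.52-0.60 for the RPA) are concordance indices: the fraction of usable patient pairs in which the model assigns the higher risk to the patient who dies earlier. A minimal sketch of Harrell's c-index for right-censored survival data, using made-up illustrative values rather than study data:

```python
def c_index(times, events, risk_scores):
    """Harrell's concordance index for right-censored survival data.
    A pair is usable when the shorter follow-up time ends in an
    observed event; it is concordant when that patient also has the
    higher risk score. Ties in score count as half-concordant."""
    concordant, usable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i] == 1:
                usable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / usable

# Illustrative data: follow-up (months), event flag (1 = death), risk score
times  = [6, 12, 18, 24, 30]
events = [1, 1, 0, 1, 0]
scores = [0.9, 0.4, 0.5, 0.6, 0.2]
print(c_index(times, events, scores))  # → 0.75
```

    A c-index of 0.5 is chance-level discrimination, which is why the 0.52-0.60 range reported for the RPA counts as moderate at best.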

  8. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

    The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines, including the social, political, natural, and physical sciences, and engineering. There is, however, a wide range of definitions which give rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework for assessing the adequacy of any simulation study. (author)

  9. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.
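
    Step (3), reconciling the analysis model with test data, is commonly quantified with the Modal Assurance Criterion (MAC), which correlates analytical and experimentally identified mode shapes. The abstract does not name a specific metric, so this is an illustrative sketch:

```python
def mac(phi_a, phi_e):
    """Modal Assurance Criterion between an analytical mode shape phi_a
    and an experimentally identified mode shape phi_e:
    1 = perfectly correlated shapes, 0 = orthogonal shapes."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    return dot(phi_a, phi_e) ** 2 / (dot(phi_a, phi_a) * dot(phi_e, phi_e))

# Illustrative 4-DOF mode shapes (hypothetical values, not project data)
phi_analysis = [1.0, 0.8, 0.5, 0.2]
phi_test     = [0.98, 0.82, 0.47, 0.25]
print(round(mac(phi_analysis, phi_test), 3))
```

    In practice a MAC matrix over all mode pairs flags which analytical modes match which test modes before the model is updated.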

  10. Analysis of various quality attributes of sunflower and soybean plants by near infra-red reflectance spectroscopy: Development and validation of calibration models

    Science.gov (United States)

    Sunflower and soybean are summer annuals that can be grown as an alternative to corn and may be particularly useful in organic production systems. Rapid and low cost methods of analyzing plant quality would be helpful for crop management. We developed and validated calibration models for Near-infrar...

  11. Development and external validation of a risk-prediction model to predict 5-year overall survival in advanced larynx cancer

    NARCIS (Netherlands)

    Petersen, Japke F.; Stuiver, Martijn M.; Timmermans, Adriana J.; Chen, Amy; Zhang, Hongzhen; O'Neill, James P.; Deady, Sandra; Vander Poorten, Vincent; Meulemans, Jeroen; Wennerberg, Johan; Skroder, Carl; Day, Andrew T.; Koch, Wayne; van den Brekel, Michiel W. M.

    2017-01-01

    TNM-classification inadequately estimates patient-specific overall survival (OS). We aimed to improve this by developing a risk-prediction model for patients with advanced larynx cancer. Cohort study. We developed a risk prediction model to estimate the 5-year OS rate based on a cohort of 3,442

  12. Development and experimental validation of a Monte Carlo modeling of the neutron emission from a D-T generator

    Energy Technology Data Exchange (ETDEWEB)

    Remetti, Romolo; Lepore, Luigi [Sapienza University of Rome, Dept. SBAI, Via Antonio Scarpa 14, 00161 Rome (Italy); Cherubini, Nadia [ENEA CRE Casaccia, Nuclear Material Characterization Laboratory and Nuclear Waste Management, Via Anguillarese 301, 00123 Rome (Italy)

    2017-01-11

    An extensive use of Monte Carlo simulations led to the identification of an MCNPX input deck for the Thermo Scientific MP320 neutron generator. This input deck is currently utilized at the ENEA Casaccia Research Center for optimizing all the techniques and applications involving the device, in particular the detection of explosives and drugs by fast neutrons. The working model of the generator was obtained thanks to a detailed representation of the MP320 internal components and to the potentialities offered by the MCNPX code. The model was validated by comparing simulated results against the manufacturer's data and against experimental tests. The aim of this work is to explain all the steps that led to these results, suggesting a procedure that might be extended to other models of neutron generators.

  13. Development and experimental validation of a Monte Carlo modeling of the neutron emission from a D-T generator

    Science.gov (United States)

    Remetti, Romolo; Lepore, Luigi; Cherubini, Nadia

    2017-01-01

    An extensive use of Monte Carlo simulations led to the identification of an MCNPX input deck for the Thermo Scientific MP320 neutron generator. This input deck is currently utilized at the ENEA Casaccia Research Center for optimizing all the techniques and applications involving the device, in particular the detection of explosives and drugs by fast neutrons. The working model of the generator was obtained thanks to a detailed representation of the MP320 internal components and to the potentialities offered by the MCNPX code. The model was validated by comparing simulated results against the manufacturer's data and against experimental tests. The aim of this work is to explain all the steps that led to these results, suggesting a procedure that might be extended to other models of neutron generators.

  14. Development and validation of a new dynamic computer-controlled model of the human stomach and small intestine.

    Science.gov (United States)

    Guerra, Aurélie; Denis, Sylvain; le Goff, Olivier; Sicardi, Vincent; François, Olivier; Yao, Anne-Françoise; Garrait, Ghislain; Manzi, Aimé Pacifique; Beyssac, Eric; Alric, Monique; Blanquet-Diot, Stéphanie

    2016-06-01

    For ethical, regulatory, and economic reasons, in vitro human digestion models are increasingly used as an alternative to in vivo assays. This study presents the new Engineered Stomach and small INtestine (ESIN) model and its validation for pharmaceutical applications. This dynamic computer-controlled system reproduces, according to in vivo data, the complex physiology of the human stomach and small intestine, including pH, transit times, chyme mixing, digestive secretions, and passive absorption of digestion products. Its innovative design allows a progressive meal intake and the differential gastric emptying of solids and liquids. The pharmaceutical behavior of two model drugs (a paracetamol immediate release form and a theophylline sustained release tablet) was studied in ESIN during liquid digestion. The results were compared to those found with a classical compendial method (paddle apparatus) and in human volunteers. Paracetamol and theophylline tablets showed similar absorption profiles in ESIN and in healthy subjects. For theophylline, a level A in vitro-in vivo correlation could be established between the results obtained in ESIN and in humans. Interestingly, using a pharmaceutical basket, the swelling and erosion of the theophylline sustained release form was followed during transit throughout ESIN. ESIN emerges as a relevant tool for pharmaceutical studies and, once further validated, may find many other applications in the nutritional, toxicological, and microbiological fields. Biotechnol. Bioeng. 2016;113: 1325-1335. © 2015 Wiley Periodicals, Inc.
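
    The level A in vitro-in vivo correlation established for theophylline is, in essence, a point-to-point linear relationship between the fraction dissolved in vitro and the fraction absorbed in vivo at matched times. A sketch with hypothetical fractions (not the ESIN data):

```python
# Level A IVIVC: linear relation between the fraction of drug dissolved
# in vitro and the fraction absorbed in vivo at matched time points.
# The fractions below are hypothetical, not the ESIN results.
f_diss = [0.10, 0.25, 0.45, 0.65, 0.80, 0.95]
f_abs  = [0.08, 0.22, 0.43, 0.66, 0.83, 0.97]

n = len(f_diss)
x_bar = sum(f_diss) / n
y_bar = sum(f_abs) / n
sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(f_diss, f_abs))
sxx = sum((x - x_bar) ** 2 for x in f_diss)
syy = sum((y - y_bar) ** 2 for y in f_abs)

slope = sxy / sxx               # ideally close to 1
r2 = sxy ** 2 / (sxx * syy)     # coefficient of determination
print(round(slope, 2), round(r2, 3))
```

    A slope near 1 and a high r2, as in this toy example, are what justify using the in vitro profile as a surrogate for in vivo absorption.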

  15. SU-E-T-479: Development and Validation of Analytical Models Predicting Secondary Neutron Radiation in Proton Therapy Applications

    International Nuclear Information System (INIS)

    Farah, J; Bonfrate, A; Donadille, L; Martinetti, F; Trompier, F; Clairand, I; De Olivera, A; Delacroix, S; Herault, J; Piau, S; Vabre, I

    2014-01-01

    Purpose: Test and validation of analytical models predicting leakage neutron exposure in passively scattered proton therapy. Methods: Taking inspiration from the literature, this work attempts to build an analytical model predicting neutron ambient dose equivalents, H*(10), within the local 75 MeV ocular proton therapy facility. MC simulations were first used to model H*(10) in the beam axis plane while considering a closed final collimator and pristine Bragg peak delivery. Next, the MC-based analytical model was tested against simulation results and experimental measurements. The model was also extended in the vertical direction to enable a full 3D mapping of H*(10) inside the treatment room. Finally, the work focused on upgrading the literature model to clinically relevant configurations considering modulated beams, open collimators, patient-induced neutron fluctuations, etc. Results: The MC-based analytical model efficiently reproduced simulated H*(10) values with a maximum difference below 10%. In addition, it succeeded in predicting measured H*(10) values with differences <40%. The highest differences were registered at the closest and farthest positions from isocenter, where the analytical model failed to faithfully reproduce the high neutron fluence and energy variations. The differences remain acceptable, however, taking into account the high measurement/simulation uncertainties and the end use of this model, i.e. radiation protection. Moreover, the model was successfully (differences < 20% on simulations and < 45% on measurements) extended to predict neutrons in the vertical direction with respect to the beam line, as patients are in the upright seated position during ocular treatments. Accounting for the impact of beam modulation, collimation, and the presence of a patient in the beam path is far more challenging, and conversion coefficients are currently being defined to predict stray neutrons in clinically representative treatment configurations. Conclusion

  16. Development and Validation of 3D-CFD Injection and Combustion Models for Dual Fuel Combustion in Diesel Ignited Large Gas Engines

    Directory of Open Access Journals (Sweden)

    Lucas Eder

    2018-03-01

    This paper focuses on improving the 3D-Computational Fluid Dynamics (CFD) modeling of diesel ignited gas engines, with an emphasis on injection and combustion modeling. The challenges of modeling are stated and possible solutions are provided. A specific approach for modeling injection is proposed that improves the modeling of the ballistic region of the needle lift. Experimental results from an inert spray chamber are used for model validation. Two-stage ignition methods are described along with improvements in ignition delay modeling of the diesel ignited gas engine. The improved models are used in the Extended Coherent Flame Model with the 3 Zones approach (ECFM-3Z). The predictive capability of the models is investigated using data from single cylinder engine (SCE) tests conducted at the Large Engines Competence Center (LEC). The results are discussed and further steps for development are identified.

  17. The development and validation of the Blended Socratic Method of Teaching (BSMT): An instructional model to enhance critical thinking skills of undergraduate business students

    Directory of Open Access Journals (Sweden)

    Eugenia Arazo Boa

    2018-01-01

    Enhancing critical thinking skills is one of the paramount goals of many educational institutions. This study presents the development and validation of the Blended Socratic Method of Teaching (BSMT), a teaching model intended to foster critical thinking skills of business students at the undergraduate level. The main objectives of the study were (1) to survey the critical thinking skills of undergraduate business students, and (2) to develop and validate the BSMT model designed to enhance critical thinking skills. The research procedure comprised two phases related to the two research objectives: (1) surveying the critical thinking skills of 371 undergraduate business students at Naresuan University International College, focusing on the three critical thinking competencies of the RED model (recognize assumptions, evaluate arguments, and draw conclusions) and determining the level of their critical thinking; and (2) developing the instructional model, followed by validation of the model by five experts. The results of the study were: (1) the undergraduate business students have deficient critical thinking based on the RED model competencies, as they scored "below average" on the critical thinking appraisal; and (2) the developed model comprised six elements: focus, syntax, principles of reaction, the social system, the support system, and application. The experts were in complete agreement that the model is "highly appropriate" for improving the critical thinking skills of the business students. The main essence of the model is the syntax, comprising five steps: group assignment, analysis and writing of case studies; group presentation of the business case analysis in class; Socratic discussion/questioning in class; posting of the case study on the class Facebook account; and online Socratic discussion/questioning. The BSMT model is an authentic and comprehensive model combining the Socratic method of teaching, information and

  18. Simulation of the adsorption capacity of polar organic compounds and dyes from water onto activated carbons: Model development and validation

    Directory of Open Access Journals (Sweden)

    Warisa Bunmahotama

    2018-03-01

    A model approach is developed to simulate the adsorption isotherms of low-molecular-weight polar organic compounds (LMWPOCs), halogenated LMWPOCs, and dye molecules onto activated carbons (ACs). The models were based on the Dubinin–Astakhov equation, with the limiting pore volume of the adsorbent estimated from pore size distribution data and the adsorption affinity of the adsorbate described by the molecular connectivity index. The models were used to simulate the adsorption data of 87 LMWPOCs onto six ACs, 25 halogenated LMWPOCs onto two ACs, and 22 dyes onto three ACs. The developed models follow the experimental data fairly well, with errors of 49, 33, and 43% for the tested LMWPOCs, halogenated LMWPOCs, and dyes, respectively. This study shows that the developed model approach may provide a simple means for the estimation of adsorption capacity for LMWPOCs and dyes onto ACs in water.
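
    The Dubinin–Astakhov equation at the core of these models predicts the adsorbed volume W = W0·exp(-(A/E)^n) from the adsorption potential A = RT·ln(Cs/Ce), where W0 is the limiting pore volume (estimated here from the pore size distribution) and E the adsorption affinity (tied here to the molecular connectivity index). A sketch with hypothetical parameter values:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def da_capacity(Ce, Cs, W0, E, n=2, T=298.15):
    """Dubinin-Astakhov adsorption capacity (same units as W0).
    Ce: equilibrium concentration, Cs: aqueous solubility (same units),
    W0: limiting pore volume, E: adsorption affinity (J/mol).
    The parameter values used below are illustrative, not fitted."""
    A = R * T * math.log(Cs / Ce)      # adsorption potential, J/mol
    return W0 * math.exp(-((A / E) ** n))

# Hypothetical adsorbate: solubility 1000 mg/L, affinity 12 kJ/mol
for Ce in (1.0, 10.0, 100.0):
    print(Ce, round(da_capacity(Ce, 1000.0, 0.45, 12e3), 4))
```

    As Ce approaches the solubility limit, A goes to zero and the capacity approaches the limiting pore volume W0, which is why W0 can be read off the pore size distribution.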

  19. Prediction and validation of pool fire development in enclosures by means of CFD Models for risk assessment of nuclear power plants (Poolfire) - Report year 2

    International Nuclear Information System (INIS)

    Van Hees, P.; Wahlqvist, J.; Kong, D.; Hostikka, S.; Sikanen, T.; Husted, B.; Magnusson, T.; Joerud, F.

    2013-05-01

    Fires in nuclear power plants can be an important hazard for the overall safety of the facility. One of the typical fire sources is a pool fire. It is therefore important to have good knowledge of the fire behaviour of pool fires and to be able to predict the heat release rate via prediction of the mass loss rate. This project envisages developing a pyrolysis model to be used in CFD models. This report covers the activities of the second year: an overview of the experiments conducted, further development and validation of the models, and the case studies to be selected in year 3. (Author)

  20. Prediction and validation of pool fire development in enclosures by means of CFD Models for risk assessment of nuclear power plants (Poolfire) - Report year 2

    Energy Technology Data Exchange (ETDEWEB)

    van Hees, P.; Wahlqvist, J.; Kong, D. [Lund Univ., Lund (Sweden); Hostikka, S.; Sikanen, T. [VTT Technical Research Centre of Finland (Finland); Husted, B. [Haugesund Univ. College, Stord (Norway); Magnusson, T. [Ringhals AB, Vaeroebacka (Sweden); Joerud, F. [European Spallation Source (ESS), Lund (Sweden)

    2013-05-15

    Fires in nuclear power plants can be an important hazard for the overall safety of the facility. One of the typical fire sources is a pool fire. It is therefore important to have good knowledge of the fire behaviour of pool fires and to be able to predict the heat release rate via prediction of the mass loss rate. This project envisages developing a pyrolysis model to be used in CFD models. This report covers the activities of the second year: an overview of the experiments conducted, further development and validation of the models, and the case studies to be selected in year 3. (Author)

  1. Development of boiling transition analysis code TCAPE-INS/B based on mechanistic methods for BWR fuel bundles. Models and validations with boiling transition experimental data

    International Nuclear Information System (INIS)

    Ishida, Naoyuki; Utsuno, Hideaki; Kasahara, Fumio

    2003-01-01

    The Boiling Transition (BT) analysis code TCAPE-INS/B, based on mechanistic methods coupled with subchannel analysis, has been developed for evaluating the integrity of Boiling Water Reactor (BWR) fuel rod bundles under abnormal operations. The objective of the development is to evaluate BT without using the empirical BT and rewetting correlations that current analysis methods require for each bundle design. TCAPE-INS/B consists mainly of a drift-flux model, a film flow model, a cross-flow model, a thermal conductivity model, and heat transfer correlations. These models were validated systematically against experimental data. The validations evaluated the accuracy of the predicted steady-state Critical Heat Flux (CHF) and of the transient fuel rod surface temperature after the occurrence of BT. Calculations of single-tube and bundle experiments were carried out to validate the models incorporated in the code. The results showed that the steady-state CHF was predicted within about 6% average error. In the transient calculations, the BT timing and the fuel rod surface temperature gradient agreed well with experimental results, but rewetting was predicted late. The modeling of heat transfer phenomena during post-BT is therefore being modified. (author)
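
    The drift-flux model listed first above relates the void fraction to the superficial velocities through a distribution parameter and a drift velocity (the Zuber–Findlay form). The sketch below uses generic round-tube coefficients, not the constitutive relations of TCAPE-INS/B:

```python
def void_fraction(jg, jf, C0=1.13, Vgj=0.23):
    """Zuber-Findlay drift-flux relation for area-averaged void fraction.
    jg, jf: gas and liquid superficial velocities (m/s);
    C0: distribution parameter; Vgj: drift velocity (m/s).
    The default coefficients are illustrative bubbly-flow values."""
    return jg / (C0 * (jg + jf) + Vgj)

print(round(void_fraction(0.5, 1.5), 3))  # → 0.201
```

    In a subchannel code, C0 and Vgj become flow-regime-dependent correlations rather than constants.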

  2. Development and validation of a predictive model for the growth of Vibrio parahaemolyticus in post-harvest shellstock oysters.

    Science.gov (United States)

    Parveen, Salina; DaSilva, Ligia; DePaola, Angelo; Bowers, John; White, Chanelle; Munasinghe, Kumudini Apsara; Brohawn, Kathy; Mudoh, Meshack; Tamplin, Mark

    2013-01-15

    Information is limited about the growth and survival of naturally-occurring Vibrio parahaemolyticus in live oysters harvested from different regions and in different oyster species under commercially relevant storage conditions. This study produced a predictive model for the growth of naturally-occurring V. parahaemolyticus in live Eastern oysters (Crassostrea virginica) harvested from the Chesapeake Bay, MD, USA and stored at 5-30 °C until oysters gapped. The model was validated with model-independent data collected from Eastern oysters harvested from the Chesapeake Bay and Mobile Bay, AL, USA and Asian (C. ariakensis) oysters from the Chesapeake Bay, VA, USA. The effect of harvest season, region and water condition on growth rate (GR) was also tested. At each time interval, two samples consisting of six oysters each were analyzed by a direct-plating method for total V. parahaemolyticus. The Baranyi D-model was fitted to the total V. parahaemolyticus growth and survival data. A secondary model was produced using the square root model. V. parahaemolyticus was slowly inactivated at 5 and 10 °C, with average rates of -0.002 and -0.001 log cfu/h, respectively. The average GRs at 15, 20, 25, and 30 °C were 0.038, 0.082, 0.228, and 0.219 log cfu/h, respectively. The bias and accuracy factors of the secondary model for model-independent data were 1.36 and 1.46 for Eastern oysters from Mobile Bay and the Chesapeake Bay, respectively. V. parahaemolyticus GRs were markedly lower in Asian oysters. Harvest temperature, salinity, region and season had no effect on GRs. The observed GRs were less than those predicted by the U.S. Food and Drug Administration's V. parahaemolyticus quantitative risk assessment. Copyright © 2012 Elsevier B.V. All rights reserved.
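
    The secondary square root model referred to above is the Ratkowsky form, sqrt(GR) = b(T − T0). Fitting it to the mean growth rates quoted in the abstract (excluding 30 °C, where growth plateaus) is a simple least-squares problem; the resulting parameters are illustrative, not the study's published fit:

```python
import math

# Mean growth rates (log cfu/h) at 15, 20 and 25 degC from the abstract;
# the 30 degC point is excluded here because growth plateaus there.
T  = [15.0, 20.0, 25.0]
GR = [0.038, 0.082, 0.228]

# Ratkowsky square root model: sqrt(GR) = b * (T - T0).
# Ordinary least squares of sqrt(GR) against T:
y = [math.sqrt(g) for g in GR]
n = len(T)
t_bar, y_bar = sum(T) / n, sum(y) / n
b = sum((t - t_bar) * (v - y_bar) for t, v in zip(T, y)) \
    / sum((t - t_bar) ** 2 for t in T)
T0 = t_bar - y_bar / b          # notional minimum growth temperature

def predict_gr(temp):
    """Predicted growth rate (log cfu/h); zero at or below T0."""
    return (b * (temp - T0)) ** 2 if temp > T0 else 0.0

print(round(b, 4), round(T0, 1))
```

    The fitted T0 is a regression parameter, not a measured growth limit; the abstract's inactivation at 5-10 °C is consistent with no growth below it.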

  3. DEVELOPMENT AND VALIDATION OF AN AIR-TO-BEEF FOOD CHAIN MODEL FOR DIOXIN-LIKE COMPOUNDS

    Science.gov (United States)

    A model for predicting concentrations of dioxin-like compounds in beef is developed and tested. The key premise of the model is that concentrations of these compounds in air are the source term, or starting point, for estimating beef concentrations. Vapor-phase concentrations t...

  4. Towards an Explanation of Overeating Patterns Among Normal Weight College Women: Development and Validation of a Structural Equation Model

    OpenAIRE

    Russ, Christine Runyan II

    1998-01-01

    Although research describing relationships between psychosocial factors and various eating patterns is growing, a model which explains the mechanisms through which these factors may operate is lacking. A model to explain overeating patterns among normal weight college females was developed and tested. The model contained the following variables: global adjustment, eating and weight cognitions, emotional eating, and self-efficacy. Three hundred ninety-o...

  5. Development and validation of a radial turbine efficiency and mass flow model at design and off-design conditions

    International Nuclear Information System (INIS)

    Serrano, José Ramón; Arnau, Francisco José; García-Cuevas, Luis Miguel; Dombrovsky, Artem; Tartoussi, Hadi

    2016-01-01

    Highlights: • A procedure for performance map extrapolation of any radial turbine is presented. • Non-measured VGT positions, speeds and blade to jet speed ratios can be extrapolated. • Calibration coefficients that can be fitted with a limited set of map data are used. • Experimental points at high blade to jet speed ratios have been used for validation. • The extrapolation accuracy is good in different map ranges and variables. - Abstract: Turbine performance at extreme off-design conditions is growing in importance for properly computing the behaviour of turbocharged reciprocating internal combustion engines during urban driving conditions in current and future homologation cycles. In these cases, the turbine operates at very low flow rates and power outputs and at very high blade to jet speed ratios during transitory periods, due to turbocharger wheel inertia and the high pulsation level of the engine exhaust flow. This paper presents a physically based method that is able to extrapolate radial turbine reduced mass flow and adiabatic efficiency in blade speed ratio, turbine rotational speed and stator vane position. The model uses a very narrow range of experimental data from turbine maps to fit the necessary coefficients. By using a special experimental turbocharger gas stand, experimental data have been obtained at extremely low turbine power outputs for the sake of model validation. Even though the data used for fitting only cover the turbine's normal operation zone, the extrapolation model provides very good agreement with the experiments at very high blade speed ratio points, and also produces good results when extrapolating in rotational speed and stator vane position.
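
    The blade to jet speed ratio used as an extrapolation variable is the rotor tip speed divided by the isentropic spouting velocity of the total-to-static expansion. A sketch of its computation with illustrative operating-point values (ideal-gas air properties assumed):

```python
import math

def blade_to_jet_speed_ratio(N_rpm, D, T_in, pr, cp=1005.0, gamma=1.4):
    """Blade to jet speed ratio sigma = U / c_s for a radial turbine.
    N_rpm: shaft speed (rpm), D: rotor inlet diameter (m),
    T_in: turbine inlet temperature (K),
    pr: total-to-static expansion ratio p_in/p_out.
    All numbers used below are illustrative, not from the paper."""
    U = math.pi * D * N_rpm / 60.0                          # blade tip speed
    dh_s = cp * T_in * (1.0 - pr ** (-(gamma - 1.0) / gamma))
    c_s = math.sqrt(2.0 * dh_s)                             # spouting velocity
    return U / c_s

print(round(blade_to_jet_speed_ratio(120000, 0.04, 600.0, 1.8), 3))
```

    At fixed speed, a shrinking expansion ratio drives sigma up, which is exactly the pulsating, low-power regime the extrapolation targets.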

  6. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels, and for temperature gradients with respect to space in the flow direction, that are investigated by direct infrared imaging, showing that even at low current operation such gradients are present in fuel cell operation, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited due to induced high pressure drops in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones with good qualitative and quantitative agreement. The combination of accuracy and low computational time allow for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)
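
    Polarization and power curves such as those validated here are commonly approximated by an open-circuit voltage minus activation (Tafel), ohmic, and concentration losses. The sketch below is that generic textbook (Larminie–Dicks) form with illustrative constants, not the model of this paper:

```python
import math

def cell_voltage(i, E_oc=1.0, A=0.05, i0=1e-4, R=0.2, m=3e-5, n=8.0):
    """Generic PEMFC polarization curve. i: current density (A/cm2),
    R: area-specific resistance (ohm cm2); all constants illustrative."""
    v_act  = A * math.log(i / i0)   # activation (Tafel) loss
    v_ohm  = i * R                  # ohmic loss
    v_conc = m * math.exp(n * i)    # empirical concentration loss
    return E_oc - v_act - v_ohm - v_conc

# Voltage falls monotonically with current density, as on a measured
# polarization curve; power i*V peaks before the voltage collapses.
for i in (0.1, 0.5, 1.0):
    print(i, round(cell_voltage(i), 3))
```

    Fitting these few constants to measured polarization data is a common lightweight alternative when the full spatially resolved model is not needed.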

  7. A method for simulating sediment incipient motion varying with time and space in an ocean model (FVCOM): development and validation

    Science.gov (United States)

    Zhu, Zichen; Wang, Yongzhi; Bian, Shuhua; Hu, Zejian; Liu, Jianqiang; Liu, Lejun

    2017-11-01

    We modified the sediment incipient motion in a numerical model and evaluated the impact of this modification using a study case of the coastal area around Weihai, China. The modified and unmodified versions of the model were validated by comparing simulated and observed data of currents, waves, and suspended sediment concentrations (SSC) measured from July 25th to July 26th, 2006. A fitted Shields diagram was introduced into the sediment model so that the critical erosional shear stress could vary with time. Thus, the simulated SSC patterns were improved to more closely reflect the observed values: the relative error of the variation range decreased by up to 34.5% and the relative error of the simulated temporally averaged SSC decreased by up to 36%. In the modified model, the critical shear stress values of the simulated silt with a diameter of 0.035 mm and mud with a diameter of 0.004 mm varied from 0.05 to 0.13 N/m2 and from 0.05 to 0.14 N/m2, respectively, instead of remaining constant as in the unmodified model. In addition, a method of applying spatially varying fractions of the mixed grain size sediment improved the simulated SSC distribution to better fit the remote sensing map, and reproduced the zonal area with high SSC between Heini Bay and the erosion groove in the modified model. The Relative Mean Absolute Error was reduced by between 6% and 79%, depending on the regional attributes, when the modified method was used to simulate incipient sediment motion. This higher accuracy, however, came at the cost of a 1.52% decrease in computation speed.
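
    A fitted Shields curve supplies a grain-size-dependent critical Shields parameter, from which the critical erosional shear stress follows as tau_c = theta_c (rho_s − rho) g d. The sketch below uses the Soulsby–Whitehouse fit, one common parameterization and not necessarily the one adopted in the paper:

```python
import math

def soulsby_theta_c(d, rho_s=2650.0, rho=1025.0, g=9.81, nu=1.0e-6):
    """Critical Shields parameter from the Soulsby-Whitehouse fit.
    d: grain diameter (m); quartz-in-seawater defaults are illustrative."""
    s = rho_s / rho
    d_star = d * (g * (s - 1.0) / nu ** 2) ** (1.0 / 3.0)  # dimensionless grain size
    return 0.30 / (1.0 + 1.2 * d_star) + 0.055 * (1.0 - math.exp(-0.020 * d_star))

def critical_shear_stress(d, rho_s=2650.0, rho=1025.0, g=9.81):
    """Critical erosional shear stress tau_c (N/m2)."""
    return soulsby_theta_c(d, rho_s, rho, g) * (rho_s - rho) * g * d

# 0.035 mm silt, as in the abstract
print(round(critical_shear_stress(35e-6), 3))
```

    With these illustrative defaults the 0.035 mm silt lands at roughly 0.08 N/m2, inside the 0.05-0.13 N/m2 range the abstract reports for the time-varying model.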

  8. Development and validation of a detailed TRNSYS-Matlab model for large solar collector fields for district heating applications

    DEFF Research Database (Denmark)

    Bava, Federico; Furbo, Simon

    2017-01-01

    This study describes the development of a detailed TRNSYS-Matlab model to simulate the behavior of a large solar collector field for district heating applications. The model includes and investigates aspects which are not always considered by simpler models, such as the flow distribution in the different rows, the effect of the flow regime on the collector efficiency, the thermal capacity of the components, and the effect of shadows from row to row. The model was compared with measurements from a solar collector field, and the impact of each aspect was evaluated in terms of accuracy as well as programming and computing time. Thermal capacity was worth being considered only for the bulkier components, such as the longer distribution and transmission pipes. The actual control strategy, which regulates the flow rates in the solar heating plant, was accurately reproduced in the model, as proved by the good agreement between model and measurements.
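
    The row-level building block of a collector field model of this kind is typically the steady-state efficiency equation (EN ISO 9806 form): zero-loss efficiency minus first- and second-order heat-loss terms. The coefficients below are illustrative, not those of the simulated field:

```python
def collector_power(G, Tm, Ta, area, eta0=0.8, a1=2.7, a2=0.01):
    """Useful power (W) of a flat-plate collector row.
    G: irradiance (W/m2), Tm: mean fluid temperature (degC),
    Ta: ambient temperature (degC), area: aperture area (m2);
    eta0, a1 (W/m2K) and a2 (W/m2K2) are illustrative coefficients."""
    dT = Tm - Ta
    eta = eta0 - (a1 * dT + a2 * dT ** 2) / G
    return max(eta, 0.0) * G * area         # clamp: no negative output

print(round(collector_power(900.0, 60.0, 15.0, 13.57), 1))
```

    Aspects the paper adds on top of this, such as flow-regime-dependent efficiency and pipe thermal capacity, modify the coefficients or add dynamic terms around this steady-state core.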

  9. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg{sup -1}

  10. Development and validation of bubble breakup and coalescence constitutive models for the one-group interfacial area transport equation

    International Nuclear Information System (INIS)

    Pellacani, Filippo

    2012-01-01

    A local mechanistic model for bubble coalescence and breakup for the one-group interfacial area transport equation has been developed, in agreement with and within the limits of the current understanding, based on an exhaustive survey of the theory and of the state of the art models for bubble dynamics simulation. The new model has been tested using the commercial 3D CFD code ANSYS CFX. Upward adiabatic turbulent air-water bubbly flow has been simulated and the results have been compared with the data obtained in the experimental facility PUMA. The available experimental data span liquid velocities of 0.5 to 2 m/s and volume fractions of 5 to 15%. For the implementation of the models, both the monodispersed and the interfacial area transport equation approaches have been used, the first to perform a detailed analysis of the forces and models needed to adequately reproduce the dynamics of the dispersed phase and to be used in the next phases of the work. Two different bubble induced turbulence models have also been tested to consider the effect of the presence of the gas phase on the turbulence of the liquid phase. The interfacial area transport equation has been successfully implemented into the CFD code and the state of the art breakup and coalescence models have been used for simulation. The limitations of the current theory have been shown and a new bubble interaction model has been developed. The simulations showed that a considerable improvement is achieved compared to the state of the art closure models. Limits in the implementation derive from the current understanding and formulation of bubbly flow dynamics. A strong dependency on the interfacial non-drag force models and coefficients has been shown. More experimental and theoretical work needs to be done in this field to increase the prediction capability of the simulation tools regarding the distribution of the phases along the pipe radius.

  11. Development and validation of the Brazilian version of the Attitudes to Aging Questionnaire (AAQ): An example of merging classical psychometric theory and the Rasch measurement model

    Directory of Open Access Journals (Sweden)

    Trentini Clarissa M

    2008-01-01

    Abstract Background Aging has driven a demographic shift worldwide, which is considered both a major societal achievement and a challenge. Aging is primarily a subjective experience, shaped by factors such as gender and culture. There is a lack of instruments to assess attitudes to aging adequately. In addition, no instrument has been developed or validated in developing-region contexts, so the particularities of aging in these areas are not captured by the available measures. This paper aims to develop and validate a reliable attitudes-to-aging instrument by combining the classical psychometric approach and Rasch analysis. Methods The pilot study and field trial are described in detail. Statistical analysis included classical psychometric theory (EFA and CFA) and the Rasch measurement model. The latter was applied to examine unidimensionality, the response scale and item fit. Results The sample was composed of 424 Brazilian older adults, which was compared to an international sample (n = 5238). The final instrument shows excellent psychometric performance (discriminant validity, confirmatory factor analysis and Rasch fit statistics). Rasch analysis indicated that modifications in the response scale and item deletions improved the initial solution derived from the classical approach. Conclusion The combination of classical and modern psychometric theories in a complementary way is fruitful for the development and validation of instruments. The construction of a reliable Brazilian Attitudes to Aging Questionnaire is important for assessing cultural specificities of aging in a transcultural perspective and can be applied in international cross-cultural investigations with less risk of cultural bias.
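
As a small illustration of the classical-psychometric side of such an analysis, the snippet below computes Cronbach's alpha, a standard internal-consistency statistic, on made-up Likert responses. The items and scores are invented for illustration; the actual AAQ items and data are not reproduced here.

```python
# Classical test theory sketch: Cronbach's alpha for a small, made-up set of
# 1-5 Likert items.  Alpha compares summed item variances to the variance of
# respondents' total scores; values near 1 indicate internally consistent items.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(items):
    """items: list of per-item score lists, one score per respondent."""
    k = len(items)
    item_var = sum(variance(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]       # per-respondent sums
    return k / (k - 1) * (1 - item_var / variance(totals))

items = [  # 4 hypothetical items x 6 respondents
    [4, 5, 3, 2, 4, 5],
    [4, 4, 3, 2, 5, 5],
    [3, 5, 2, 2, 4, 4],
    [4, 4, 3, 1, 4, 5],
]
alpha = cronbach_alpha(items)   # high here, since the toy items co-vary strongly
```

Rasch analysis goes beyond this kind of aggregate statistic by modelling item difficulty and person ability jointly, which is what allowed the response-scale and item-fit refinements described above.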

  12. Development and External Validation of Prognostic Model for 2-Year Survival of Non-Small-Cell Lung Cancer Patients Treated With Chemoradiotherapy

    International Nuclear Information System (INIS)

    Dehing-Oberije, Cary; Yu Shipeng; De Ruysscher, Dirk; Meersschout, Sabine; Van Beek, Karen; Lievens, Yolande; Van Meerbeeck, Jan; De Neve, Wilfried; Rao, Bharat Ph.D.; Weide, Hiska van der; Lambin, Philippe

    2009-01-01

    Purpose: Radiotherapy, combined with chemotherapy, is the treatment of choice for a large group of non-small-cell lung cancer (NSCLC) patients. Recent developments in the treatment of these patients have led to improved survival. However, the clinical TNM stage is highly inaccurate for the prediction of survival, and alternatives are lacking. The objective of this study was to develop and validate a prediction model for survival of NSCLC patients, treated with chemoradiotherapy. Patients and Methods: The clinical data from 377 consecutive inoperable NSCLC patients, Stage I-IIIB, treated radically with chemoradiotherapy were collected. A prognostic model for 2-year survival was developed, using 2-norm support vector machines. The performance of the model was expressed as the area under the curve of the receiver operating characteristic and assessed using leave-one-out cross-validation, as well as two external data sets. Results: The final multivariate model consisted of gender, World Health Organization performance status, forced expiratory volume in 1 s, number of positive lymph node stations, and gross tumor volume. The area under the curve, assessed by leave-one-out cross-validation, was 0.74, and application of the model to the external data sets yielded an area under the curve of 0.75 and 0.76. A high- and low-risk group could be clearly identified using a risk score based on the model. Conclusion: The multivariate model performed very well and was able to accurately predict the 2-year survival of NSCLC patients treated with chemoradiotherapy. The model could support clinicians in the treatment decision-making process.
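
The validation idea used here, leave-one-out cross-validation summarised by the area under the ROC curve, can be sketched as follows. The toy classifier (distance to class centroids of the training fold) stands in for the 2-norm support vector machine of the study, and the patient features are invented.

```python
# Hedged sketch of leave-one-out cross-validation with an AUC summary.
# A nearest-centroid risk score substitutes for the study's SVM; the
# "patients" below are made up.

def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def loo_scores(X, y):
    """Risk score per patient, each computed with that patient held out."""
    scores = []
    for i in range(len(X)):
        Xtr, ytr = X[:i] + X[i + 1:], y[:i] + y[i + 1:]
        c0 = centroid([x for x, lab in zip(Xtr, ytr) if lab == 0])
        c1 = centroid([x for x, lab in zip(Xtr, ytr) if lab == 1])
        scores.append(dist2(X[i], c0) - dist2(X[i], c1))  # >0: closer to class 1
    return scores

def auc(scores, y):
    """Mann-Whitney form: P(score_pos > score_neg), ties count one half."""
    pos = [s for s, lab in zip(scores, y) if lab == 1]
    neg = [s for s, lab in zip(scores, y) if lab == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical patients: (performance status, positive nodal stations, log GTV)
X = [[0, 1, 2.1], [1, 3, 3.9], [0, 2, 2.5], [2, 4, 4.2],
     [1, 1, 2.0], [2, 5, 4.8], [0, 2, 3.0], [1, 4, 4.0]]
y = [0, 1, 0, 1, 0, 1, 0, 1]   # 1 = death within 2 years
auc_loo = auc(loo_scores(X, y), y)
```

On real, noisy clinical data the LOO AUC would of course be well below 1; the study's reported 0.74 reflects the genuine overlap between survivor groups.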

  13. Development and Validation of an Older Occupant Finite Element Model of a Mid-Sized Male for Investigation of Age-related Injury Risk.

    Science.gov (United States)

    Schoell, Samantha L; Weaver, Ashley A; Urban, Jillian E; Jones, Derek A; Stitzel, Joel D; Hwang, Eunjoo; Reed, Matthew P; Rupp, Jonathan D; Hu, Jingwen

    2015-11-01

    The aging population is a growing concern as the increased fragility and frailty of the elderly results in an elevated incidence of injury as well as an increased risk of mortality and morbidity. To assess elderly injury risk, age-specific computational models can be developed to directly calculate biomechanical metrics for injury. The first objective was to develop an older occupant Global Human Body Models Consortium (GHBMC) average male model (M50) representative of a 65 year old (YO) and to perform regional validation tests to investigate predicted fractures and injury severity with age. Development of the GHBMC M50 65 YO model involved implementing geometric, cortical thickness, and material property changes with age. Regional validation tests included a chest impact, a lateral impact, a shoulder impact, a thoracoabdominal impact, an abdominal bar impact, a pelvic impact, and a lateral sled test. The second objective was to investigate age-related injury risks by performing a frontal US NCAP simulation test with the GHBMC M50 65 YO and the GHBMC M50 v4.2 models. Simulation results were compared to the GHBMC M50 v4.2 to evaluate the effect of age on occupant response and risk for head injury, neck injury, thoracic injury, and lower extremity injury. Overall, the GHBMC M50 65 YO model predicted higher probabilities of AIS 3+ injury for the head and thorax.

  14. Development and validation of a stochastic model for potential growth of Listeria monocytogenes in naturally contaminated lightly preserved seafood

    DEFF Research Database (Denmark)

    Mejlholm, Ole; Bøknæs, Niels; Dalgaard, Paw

    2015-01-01

    added acetic and/or lactic acids. The stochastic model was developed from an existing deterministic model including the effect of 12 environmental parameters and microbial interaction (O. Mejlholm and P. Dalgaard, Food Microbiology, submitted for publication). Observed maximum population density (MPD...... of the least and most preserved sample of CSGH and CSS were used as input for the existing deterministic model. For both modelling approaches, it was shown that lag time and the effect of microbial interaction need to be included to accurately predict MPD values of L. monocytogenes. Addition of organic acids...... to CSGH and CSS was confirmed as a suitable mitigation strategy against the risk of growth by L. monocytogenes as both types of products were in compliance with the EU regulation on ready-to-eat foods....
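
The two effects highlighted in the abstract, a lag phase and microbial interaction, can be sketched with a minimal simulation in which Listeria monocytogenes stops growing once the background lactic acid bacteria (LAB) reach their own maximum population density (a Jameson-type interaction). All rates and densities below are made up, not the parameter values of the cited model.

```python
# Minimal sketch (illustrative parameters) of lag time plus Jameson-type
# interaction: L. monocytogenes (lm) grows only after its lag phase and only
# while the LAB population (lab) is still below its own MPD.

def simulate(days, dt=0.01, lag_lm=2.0, mu_lm=0.3, mu_lab=0.6,
             lm0=1.0, lab0=3.0, mpd_lm=7.5, mpd_lab=8.5):
    """Return final log10 CFU/g of (L. monocytogenes, LAB) after `days`."""
    lm, lab = lm0, lab0
    t = 0.0
    while t < days:
        if lab < mpd_lab:                       # LAB grow toward their MPD
            lab = min(mpd_lab, lab + dt * mu_lab)
        if t >= lag_lm and lm < mpd_lm and lab < mpd_lab:
            lm = min(mpd_lm, lm + dt * mu_lm)   # halts once LAB hit their MPD
        t += dt
    return lm, lab

lm_final, lab_final = simulate(14)
# lm stalls far below its own MPD because the LAB reach theirs first
```

Dropping either the lag term or the interaction switch makes the predicted L. monocytogenes density substantially higher, which is the qualitative point the abstract makes about both terms being needed.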

  15. Prediction of early death among patients enrolled in phase I trials: development and validation of a new model based on platelet count and albumin.

    Science.gov (United States)

    Ploquin, A; Olmos, D; Lacombe, D; A'Hern, R; Duhamel, A; Twelves, C; Marsoni, S; Morales-Barrera, R; Soria, J-C; Verweij, J; Voest, E E; Schöffski, P; Schellens, J H; Kramar, A; Kristeleit, R S; Arkenau, H-T; Kaye, S B; Penel, N

    2012-09-25

    Selecting patients with 'sufficient life expectancy' for Phase I oncology trials remains challenging. The Royal Marsden Hospital Score (RMS) previously identified high-risk patients as those with ≥ 2 of the following: albumin < 35 g/L; lactate dehydrogenase > upper limit of normal; > 2 metastatic sites. This study developed an alternative prognostic model and compared its performance with that of the RMS. The primary end point was the 90-day mortality rate. The new model was developed from the same database as the RMS, but used Chi-squared Automatic Interaction Detection (CHAID). The ROC characteristics of both methods were then validated in an independent database of 324 patients enrolled in European Organisation for Research and Treatment of Cancer Phase I trials of cytotoxic agents between 2000 and 2009. The CHAID method identified high-risk patients on the basis of albumin and platelet count. The negative predictive values (NPV) were similar for the CHAID model and the RMS. The CHAID model and the RMS provided a similarly high level of NPV, but the CHAID model gave better accuracy in the validation set. Both the CHAID model and the RMS may improve the screening process in Phase I trials.
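
The two headline metrics, negative predictive value and overall accuracy, come straight from a 2x2 table of predicted risk versus observed 90-day death. The counts below are hypothetical and chosen only to illustrate how two models can have similar NPV yet different accuracy; they are not the study's data.

```python
# Sketch: NPV and accuracy from a made-up confusion table of predicted
# high/low risk (rows) vs died/survived by day 90 (columns).

def npv_and_accuracy(tp, fp, fn, tn):
    """tp: high-risk & died, fp: high-risk & survived,
    fn: low-risk & died,  tn: low-risk & survived."""
    npv = tn / (tn + fn)                     # survivors among predicted low-risk
    acc = (tp + tn) / (tp + fp + fn + tn)    # overall correct classifications
    return npv, acc

# hypothetical validation counts for two competing models
npv_a, acc_a = npv_and_accuracy(tp=20, fp=40, fn=10, tn=254)  # CHAID-like
npv_b, acc_b = npv_and_accuracy(tp=22, fp=70, fn=8, tn=224)   # RMS-like
# npv_a and npv_b are close; acc_a exceeds acc_b because model A flags
# fewer false-positive "high-risk" patients
```

High NPV is what matters for screening here: a patient classified low-risk can be enrolled with reasonable confidence of surviving the first 90 days.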

  16. On the development of a coupled regional climate-vegetation model RCM-CLM-CN-DV and its validation in Tropical Africa

    Science.gov (United States)

    Wang, Guiling; Yu, Miao; Pal, Jeremy S.; Mei, Rui; Bonan, Gordon B.; Levis, Samuel; Thornton, Peter E.

    2016-01-01

    This paper presents a regional climate system model RCM-CLM-CN-DV and its validation over Tropical Africa. The model development involves the initial coupling between the ICTP regional climate model RegCM4.3.4 (RCM) and the Community Land Model version 4 (CLM4) including models of carbon-nitrogen dynamics (CN) and vegetation dynamics (DV), and further improvements of the models. Model improvements derive from the new parameterization from CLM4.5 that addresses the well-documented overestimation of gross primary production (GPP), a refinement of the stress-deciduous phenology scheme in CN that addresses a spurious LAI fluctuation for drought-deciduous plants, and the incorporation of a survival rule into the DV model to prevent tropical broadleaf evergreen trees from growing in areas with a prolonged drought season. The impact of the modifications on model results is documented based on numerical experiments using various subcomponents of the model. The performance of the coupled model is then validated against observational data based on three configurations with increasing capacity: RCM-CLM with prescribed leaf area index and fractional coverage of different plant functional types (PFTs); RCM-CLM-CN with prescribed PFT coverage but prognostic plant phenology; and RCM-CLM-CN-DV in which both the plant phenology and PFT coverage are simulated by the model. Results from these three models are compared against the FLUXNET up-scaled GPP and ET data, LAI and PFT coverage from remote sensing data including MODIS and GIMMS, University of Delaware precipitation and temperature data, and surface radiation data from MVIRI and SRB. Our results indicate that the models perform well in reproducing the physical climate and surface radiative budgets in the domain of interest. However, PFT coverage is significantly underestimated by the model over the arid and semi-arid regions of Tropical Africa, caused by an underestimation of LAI in these regions by the CN model that gets exacerbated

  17. Development, validation and application of a fixed district heating model structure that requires small amounts of input data

    International Nuclear Information System (INIS)

    Aberg, Magnus; Widén, Joakim

    2013-01-01

    Highlights: • A fixed model structure for cost-optimisation studies of DH systems is developed. • A method for approximating heat demands using outdoor temperature data is developed. • Six different Swedish district heating systems are modelled and studied. • The impact of heat demand change on heat and electricity production is examined. • Reduced heat demand leads to less use of fossil fuels and biomass in the modelled systems. - Abstract: Reducing the energy use of buildings is an important part of reaching the European energy efficiency targets. Consequently, local energy systems need to adapt to a lower demand for heating. About 90% of Swedish multi-family residential buildings use district heating (DH), produced in Sweden’s over 400 DH systems, which use different heat production technologies and fuels. DH system modelling results obtained until now are mostly for particular DH systems and cannot easily be generalised. Here, a fixed model structure (FMS) based on linear programming for cost-optimisation studies of DH systems is developed, requiring only general DH system information. A method for approximating heat demands based on local outdoor temperature data is also developed. A scenario is studied in which the FMS is applied to six Swedish DH systems and heat demands are reduced due to energy efficiency improvements in buildings. The results show that the FMS is a useful tool for DH system optimisation studies and that building energy efficiency improvements lead to reduced use of fossil fuels and biomass in DH systems. Also, the share of CHP in the production mix increases in five of the six DH systems when the heat demand is reduced
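
The heat-demand approximation mentioned in the highlights can be sketched with a degree-hour model: demand rises linearly as the outdoor temperature drops below a balance point, on top of a constant base load for hot water. The balance temperature and coefficients below are illustrative assumptions, not the paper's fitted values.

```python
# Degree-hour sketch of district heating demand from outdoor temperature.
# t_balance, slope and base are hypothetical; a real application would fit
# them to metered DH production data for each system.

def heat_demand(t_out, t_balance=17.0, slope=2.5, base=5.0):
    """DH demand in MW for one hour at outdoor temperature t_out (deg C)."""
    return base + slope * max(0.0, t_balance - t_out)

temps = [-10, -5, 0, 5, 10, 15, 20, 25]          # sample outdoor temperatures
demands = [heat_demand(t) for t in temps]        # nonincreasing with warmth
# Above the balance temperature only the hot-water base load remains
```

A demand series built this way is exactly the kind of input a linear-programming production model needs: one heat load per time step, with fuel and plant constraints supplying the rest of the optimisation.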

  18. Attempted development and cross-validation of predictive models of individual-level and organizational-level turnover of nuclear power operators

    International Nuclear Information System (INIS)

    Vasa-Sideris, S.J.

    1989-01-01

    Nuclear power accounts for 20% of the electric power generated in the U.S., by 107 nuclear plants which employ over 8,700 operators. Operator turnover is significant to utilities from the economic point of view, since it costs almost three hundred thousand dollars to train and qualify one operator, and because turnover affects plant operability and therefore plant safety. The purpose of the study was to develop and cross-validate individual-level and organizational-level models of turnover of nuclear power plant operators. Data were obtained by questionnaires and from published data for 1983 and 1984 on a number of individual, organizational, and environmental predictors. Plants had been in operation for two or more years. Questionnaires were returned by 29 out of 50 plants on over 1600 operators. The objectives were to examine the reliability of the turnover criterion, to determine the classification accuracy of the multivariate predictive models and of categories of predictors (individual, organizational, and environmental), and to determine whether a homology existed between the individual-level and organizational-level models. The method was to examine the shrinkage that occurred between foldback design (in which the predictive models were reapplied to the data used to develop them) and cross-validation. The results did not support the hypotheses. Turnover data were accurate but not stable between the two years. No significant differences were detected between the low- and high-turnover groups at the organizational or individual level in cross-validation. Lack of stability in the criterion, restriction of range, and small sample size at the organizational level were serious limitations of this study. The results did support the methods: considerable shrinkage occurred between foldback and cross-validation of the models

  19. The Incidence Patterns Model to Estimate the Distribution of New HIV Infections in Sub-Saharan Africa: Development and Validation of a Mathematical Model.

    Directory of Open Access Journals (Sweden)

    Annick Bórquez

    2016-09-01

    Programmatic planning in HIV requires estimates of the distribution of new HIV infections according to identifiable characteristics of individuals. In sub-Saharan Africa, robust routine data sources and historical epidemiological observations are available to inform and validate such estimates. We developed a predictive model, the Incidence Patterns Model (IPM), representing populations according to factors that have been demonstrated to be strongly associated with HIV acquisition risk: gender, marital/sexual activity status, geographic location, "key populations" based on risk behaviours (sex work, injecting drug use, and male-to-male sex), HIV and ART status within married or cohabiting unions, and circumcision status. The IPM estimates the distribution of new infections acquired by group based on these factors within a Bayesian framework accounting for regional prior information on demographic and epidemiological characteristics from trials or observational studies. We validated and trained the model against direct observations of HIV incidence by group in seven rounds of cohort data from four studies ("sites") conducted in Manicaland, Zimbabwe; Rakai, Uganda; Karonga, Malawi; and Kisesa, Tanzania. The IPM performed well, with the projections' credible intervals for the proportion of new infections per group overlapping the data's confidence intervals for all groups in all rounds of data. In terms of geographical distribution, the projections' credible intervals overlapped the confidence intervals for four out of seven rounds, which were used as proxies for administrative divisions in a country. We assessed model performance after internal training (within one site) and external training (between sites) by comparing mean posterior log-likelihoods and used the best model to estimate the distribution of HIV incidence in six countries (Gabon, Kenya, Malawi, Rwanda, Swaziland, and Zambia) in the region. We subsequently inferred the potential

  20. The Incidence Patterns Model to Estimate the Distribution of New HIV Infections in Sub-Saharan Africa: Development and Validation of a Mathematical Model.

    Science.gov (United States)

    Bórquez, Annick; Cori, Anne; Pufall, Erica L; Kasule, Jingo; Slaymaker, Emma; Price, Alison; Elmes, Jocelyn; Zaba, Basia; Crampin, Amelia C; Kagaayi, Joseph; Lutalo, Tom; Urassa, Mark; Gregson, Simon; Hallett, Timothy B

    2016-09-01

    Programmatic planning in HIV requires estimates of the distribution of new HIV infections according to identifiable characteristics of individuals. In sub-Saharan Africa, robust routine data sources and historical epidemiological observations are available to inform and validate such estimates. We developed a predictive model, the Incidence Patterns Model (IPM), representing populations according to factors that have been demonstrated to be strongly associated with HIV acquisition risk: gender, marital/sexual activity status, geographic location, "key populations" based on risk behaviours (sex work, injecting drug use, and male-to-male sex), HIV and ART status within married or cohabiting unions, and circumcision status. The IPM estimates the distribution of new infections acquired by group based on these factors within a Bayesian framework accounting for regional prior information on demographic and epidemiological characteristics from trials or observational studies. We validated and trained the model against direct observations of HIV incidence by group in seven rounds of cohort data from four studies ("sites") conducted in Manicaland, Zimbabwe; Rakai, Uganda; Karonga, Malawi; and Kisesa, Tanzania. The IPM performed well, with the projections' credible intervals for the proportion of new infections per group overlapping the data's confidence intervals for all groups in all rounds of data. In terms of geographical distribution, the projections' credible intervals overlapped the confidence intervals for four out of seven rounds, which were used as proxies for administrative divisions in a country. We assessed model performance after internal training (within one site) and external training (between sites) by comparing mean posterior log-likelihoods and used the best model to estimate the distribution of HIV incidence in six countries (Gabon, Kenya, Malawi, Rwanda, Swaziland, and Zambia) in the region. We subsequently inferred the potential contribution of each

  1. Development and Validation of a Biodynamic Model for Mechanistically Predicting Metal Accumulation in Fish-Parasite Systems.

    Directory of Open Access Journals (Sweden)

    T T Yen Le

    Because of different reported effects of parasitism on the accumulation of metals in fish, it is important to consider parasites while interpreting bioaccumulation data from biomonitoring programmes. Accordingly, the first step is to take parasitism into consideration when simulating metal bioaccumulation in the fish host under laboratory conditions. In the present study, the accumulation of metals in fish-parasite systems was simulated by a one-compartment toxicokinetic model and compared to uninfected conspecifics. As such, metal accumulation in fish was assumed to result from a balance of different uptake and loss processes depending on the infection status. The uptake by parasites was considered an efflux from the fish host, similar to elimination. Physiological rate constants for the uninfected fish were parameterised based on the covalent index and the species weight while the parameterisation for the infected fish was carried out based on the reported effects of parasites on the uptake kinetics of the fish host. The model was then validated for the system of the chub Squalius cephalus and the acanthocephalan Pomphorhynchus tereticollis following 36-day exposure to waterborne Pb. The dissolved concentration of Pb in the exposure tank water fluctuated during the exposure, ranging from 40 to 120 μg/L. Generally, the present study shows that the one-compartment model can be an effective method for simulating the accumulation of metals in fish, taking into account effects of parasitism. In particular, the predicted concentrations of Cu, Fe, Zn, and Pb in the uninfected chub as well as in the infected chub and the acanthocephalans were within one order of magnitude of the measurements. The variation in the absorption efficiency and the elimination rate constant of the uninfected chub resulted in variations of about one order of magnitude in the predicted concentrations of Pb. Inclusion of further assumptions for simulating metal accumulation
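
The one-compartment balance the abstract describes can be sketched directly: uptake from water minus losses, with the parasite's uptake treated as an extra efflux from the host. All rate constants and the water concentration below are illustrative, not the study's fitted parameters.

```python
# One-compartment toxicokinetic sketch:  dC/dt = k_u * C_w - (k_e + k_p) * C,
# where k_p is the loss of metal from the host to the parasite.
# All parameter values are hypothetical.

def simulate_fish(days, c_w=0.08, k_u=2.0, k_e=0.05, k_p=0.03, dt=0.01):
    """Metal concentration in fish tissue (ug/g) after `days` of exposure.

    c_w: dissolved metal in water (ug/mL, assumed constant here)
    k_u: uptake rate constant; k_e: elimination; k_p: efflux to the parasite
    """
    c = 0.0
    for _ in range(round(days / dt)):
        c += dt * (k_u * c_w - (k_e + k_p) * c)   # explicit Euler step
    return c

infected = simulate_fish(36)               # host loses part of its burden
uninfected = simulate_fish(36, k_p=0.0)    # no parasite efflux
# With these parameters the infected host accumulates less than the
# uninfected conspecific, one of the patterns reported for acanthocephalans
```

In the study the water concentration fluctuated (40-120 μg/L), so a real simulation would drive `c_w` with the measured time series rather than a constant.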

  2. Multi-body simulation of a canine hind limb: model development, experimental validation and calculation of ground reaction forces

    Directory of Open Access Journals (Sweden)

    Wefstaedt Patrick

    2009-11-01

    Abstract Background Among other causes, the long-term result of hip prostheses in dogs is determined by aseptic loosening. Prevention of prosthesis complications can be achieved by an optimization of the tribological system, which finally results in improved implant duration. In this context a computerized model for the calculation of hip joint loadings during different motions would be of benefit. As a first step in the development of such an inverse dynamic multi-body simulation (MBS) model, we here present the setup of a canine hind limb model applicable for the calculation of ground reaction forces. Methods The anatomical geometries of the MBS model have been established using computed tomography (CT) and magnetic resonance imaging (MRI) data. The CT data were collected from the pelvis, femora, tibiae and pads of a mixed-breed adult dog. Geometric information about 22 muscles of the pelvic extremity of 4 mixed-breed adult dogs was determined using MRI. Kinematic and kinetic data obtained by motion analysis of a clinically healthy dog during a gait cycle (1 m/s) on an instrumented treadmill were used to drive the model in the multi-body simulation. Results and Discussion The vertical ground reaction forces (z-direction) calculated by the MBS system show a maximum deviation of 1.75%BW for the left and 4.65%BW for the right hind limb from the treadmill measurements. The calculated peak ground reaction forces in the z- and y-directions were found to be comparable to the treadmill measurements, whereas the curve characteristics of the forces in the y-direction were not in complete alignment. Conclusion In conclusion, it could be demonstrated that the developed MBS model is suitable for simulating ground reaction forces of dogs during walking. In forthcoming investigations the model will be developed further for the calculation of forces and moments acting on the hip joint during different movements, which can be of help in context with the in

  3. Development and validation of a prognostic model incorporating texture analysis derived from standardised segmentation of PET in patients with oesophageal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Foley, Kieran G. [Cardiff University, Division of Cancer and Genetics, Cardiff (United Kingdom); Hills, Robert K. [Cardiff University, Haematology Clinical Trials Unit, Cardiff (United Kingdom); Berthon, Beatrice; Marshall, Christopher [Wales Research and Diagnostic PET Imaging Centre, Cardiff (United Kingdom); Parkinson, Craig; Spezi, Emiliano [Cardiff University, School of Engineering, Cardiff (United Kingdom); Lewis, Wyn G. [University Hospital of Wales, Department of Upper GI Surgery, Cardiff (United Kingdom); Crosby, Tom D.L. [Department of Oncology, Velindre Cancer Centre, Cardiff (United Kingdom); Roberts, Stuart Ashley [University Hospital of Wales, Department of Clinical Radiology, Cardiff (United Kingdom)

    2018-01-15

    This retrospective cohort study developed a prognostic model incorporating PET texture analysis in patients with oesophageal cancer (OC). Internal validation of the model was performed. Consecutive OC patients (n = 403) were chronologically separated into development (n = 302, September 2010-September 2014, median age = 67.0, males = 227, adenocarcinomas = 237) and validation cohorts (n = 101, September 2014-July 2015, median age = 69.0, males = 78, adenocarcinomas = 79). Texture metrics were obtained using a machine-learning algorithm for automatic PET segmentation. A Cox regression model including age, radiological stage, treatment and 16 texture metrics was developed. Patients were stratified into quartiles according to a prognostic score derived from the model. A p-value < 0.05 was considered statistically significant. Primary outcome was overall survival (OS). Six variables were significantly and independently associated with OS: age [HR = 1.02 (95% CI 1.01-1.04), p < 0.001], radiological stage [1.49 (1.20-1.84), p < 0.001], treatment [0.34 (0.24-0.47), p < 0.001], log(TLG) [5.74 (1.44-22.83), p = 0.013], log(Histogram Energy) [0.27 (0.10-0.74), p = 0.011] and Histogram Kurtosis [1.22 (1.04-1.44), p = 0.017]. The prognostic score demonstrated significant differences in OS between quartiles in both the development (χ² = 143.14, df = 3, p < 0.001) and validation cohorts (χ² = 20.621, df = 3, p < 0.001). This prognostic model can risk-stratify patients and demonstrates the additional benefit of PET texture analysis in OC staging. (orig.)
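
The risk-stratification step, combining covariates into a linear prognostic score with Cox log-hazard-ratio weights and splitting patients into quartiles, can be sketched as below. The weights echo the per-unit hazard ratios quoted in the abstract, but the patients and the reduced covariate set are invented for illustration.

```python
# Sketch of Cox-style risk stratification: linear predictor = sum of
# log(HR) weights times covariates, then quartile split.  Patients and the
# choice of covariates are hypothetical.
import math

def prognostic_score(age, stage, treated, log_tlg, weights):
    w_age, w_stage, w_treat, w_tlg = weights
    return w_age * age + w_stage * stage + w_treat * treated + w_tlg * log_tlg

# log(HR)-style weights, e.g. log(1.02) per year of age (illustrative subset)
weights = (math.log(1.02), math.log(1.49), math.log(0.34), math.log(5.74))

# hypothetical patients: (age, stage, treated 0/1, log TLG)
patients = [(62, 2, 1, 1.1), (71, 3, 0, 1.8), (58, 1, 1, 0.7), (80, 3, 0, 2.2),
            (66, 2, 1, 1.3), (69, 2, 0, 1.5), (75, 3, 1, 1.9), (60, 1, 0, 0.9)]
scores = sorted(prognostic_score(*p, weights) for p in patients)
n = len(scores)
quartiles = [scores[i * n // 4:(i + 1) * n // 4] for i in range(4)]
# quartiles[0] holds the lowest-risk scores, quartiles[3] the highest
```

The survival comparison between quartiles (the χ² log-rank tests quoted above) then checks that the score actually separates the groups' outcomes.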

  4. Development of Simplified and Dynamic Model for Double Glazing Unit Validated with Full-Scale Facade Element

    DEFF Research Database (Denmark)

    Liu, Mingzhe; Wittchen, Kim Bjarne; Heiselberg, Per

    2012-01-01

    The project aims at developing simplified calculation methods for the different features that influence energy demand and indoor environment behind “intelligent” glazed façades. This paper describes how to set up a simplified model to calculate the thermal and solar properties (U and g value......) together with comfort performance (internal surface temperature of the glazing) of a double glazing unit. The double glazing unit is defined as a 1D model with nodes representing different layers of material. Several models with different numbers of nodes and positions of these are compared and verified in order...... to find a simplified method which can calculate the performance as accurately as possible. The calculated performance in terms of internal surface temperature is verified with experimental data collected in a full-scale façade element test facility at Aalborg University (DK). The advantage...
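
The 1D nodal idea can be sketched as a series thermal-resistance network: summing the layer resistances gives the U-value, and the flux through the network gives the internal surface temperature. The resistance values below are typical textbook magnitudes, not the values identified from the Aalborg test facility.

```python
# Double glazing as a series resistance network (steady state, 1D).
# Resistances in m2K/W are illustrative textbook-order values.

R_SE = 0.04            # external surface film
R_PANE = 0.006 / 1.0   # 6 mm glass pane, k = 1.0 W/mK
R_CAVITY = 0.18        # sealed air cavity
R_SI = 0.13            # internal surface film

resistances = [R_SE, R_PANE, R_CAVITY, R_PANE, R_SI]
u_value = 1.0 / sum(resistances)            # W/m2K for the whole unit

def internal_surface_temp(t_in, t_out):
    """Temperature of the inner glass face in steady state (deg C)."""
    q = u_value * (t_in - t_out)            # heat flux through the unit, W/m2
    return t_in - q * R_SI                  # drop across the internal film
```

A dynamic nodal model adds heat capacities at the nodes, so the surface temperature lags the boundary conditions instead of following them instantaneously; that is the behaviour validated against the full-scale façade measurements.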

  5. Molecular Dynamics Studies of Liposomes as Carriers for Photosensitizing Drugs: Development, Validation, and Simulations with a Coarse-Grained Model.

    Science.gov (United States)

    Jämbeck, Joakim P M; Eriksson, Emma S E; Laaksonen, Aatto; Lyubartsev, Alexander P; Eriksson, Leif A

    2014-01-14

    Liposomes are proposed as drug delivery systems and can in principle be designed so as to cohere with specific tissue types or local environments. However, little detail is known about the exact mechanisms for drug delivery and the distributions of drug molecules inside the lipid carrier. In the current work, a coarse-grained (CG) liposome model is developed, consisting of over 2500 lipids, with varying degrees of drug loading. For the drug molecule, we chose hypericin, a natural compound proposed for use in photodynamic therapy, for which a CG model was derived and benchmarked against corresponding atomistic membrane bilayer model simulations. Liposomes with 21-84 hypericin molecules were generated and subjected to 10 microsecond simulations. Distribution of the hypericins, their orientations within the lipid bilayer, and the potential of mean force for transferring a hypericin molecule from the interior aqueous "droplet" through the liposome bilayer are reported herein.

  6. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
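
The statistical sampling mentioned for variability and randomness can be sketched with plain Monte Carlo: draw the uncertain inputs from assumed distributions, push each draw through the deterministic model, and summarise the spread of the predictions. The model and distributions below are illustrative.

```python
# Monte Carlo propagation of input variability through a toy deterministic
# model (deflection of a linear spring).  Distributions are assumptions.
import random
import statistics

def model(stiffness, load):
    """Deterministic prediction: deflection = load / stiffness."""
    return load / stiffness

random.seed(42)                        # reproducible sampling
predictions = []
for _ in range(20000):
    k = random.gauss(100.0, 5.0)       # aleatoric variability in stiffness
    f = random.gauss(50.0, 2.0)        # variability in the applied load
    predictions.append(model(k, f))

mean_pred = statistics.mean(predictions)     # central prediction
spread = statistics.stdev(predictions)       # propagated variability
```

This addresses only the aleatoric sources; numerical (discretization) error and model-form uncertainty require separate treatment, which is exactly the distinction the presentation draws.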

  7. Development and validation of a logistic regression model to distinguish transition zone cancers from benign prostatic hyperplasia on multi-parametric prostate MRI

    Energy Technology Data Exchange (ETDEWEB)

    Iyama, Yuji [Kumamoto Chuo Hospital, Department of Diagnostic Radiology, Kumamoto, Kumamoto (Japan); Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto, Kumamoto (Japan); Nakaura, Takeshi; Nagayama, Yasunori; Utsunomiya, Daisuke; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto, Kumamoto (Japan); Katahira, Kazuhiro; Oda, Seitaro [Kumamoto Chuo Hospital, Department of Diagnostic Radiology, Kumamoto, Kumamoto (Japan); Iyama, Ayumi [National Hospital Organization Kumamoto Medical Center, Department of Diagnostic Radiology, Kumamoto, Kumamoto (Japan)

    2017-09-15

    To develop a prediction model to distinguish between transition zone (TZ) cancers and benign prostatic hyperplasia (BPH) on multi-parametric prostate magnetic resonance imaging (mp-MRI). This retrospective study enrolled 60 patients with either BPH or TZ cancer who had undergone 3-T MRI. We generated ten parameters for T2-weighted images (T2WI), diffusion-weighted images (DWI) and dynamic MRI. Using a t-test and multivariate logistic regression (LR) analysis to evaluate the parameters' accuracy, we developed LR models. We calculated the area under the receiver operating characteristic (ROC) curve of the LR models by a leave-one-out cross-validation procedure, and the LR model's performance was compared with the radiologists' subjective assessments and with the Prostate Imaging Reporting and Data System (Pi-RADS v2) score. Multivariate LR analysis showed that only the standardized T2WI signal and the mean apparent diffusion coefficient (ADC) maintained their independent values (P < 0.001). The validation analysis showed that the AUC of the final LR model was comparable to that of board-certified radiologists, and superior to that of Pi-RADS scores. Standardized T2WI signal and mean ADC were independent factors for distinguishing between BPH and TZ cancer. The performance of the LR model was comparable to that of experienced radiologists. (orig.)

  8. Innovation, Product Development, and New Business Models in Networks: How to come from case studies to a valid and operational theory

    DEFF Research Database (Denmark)

    Rasmussen, Erik Stavnsager; Jørgensen, Jacob Høj; Goduscheit, René Chester

    2007-01-01

    We have in the research project NEWGIBM (New Global ICT based Business Models) during 2005 and 2006 closely cooperated with a group of firms. The focus in the project has been development of new business models (and innovation) in close cooperation with multiple partners. These partners have been...... customers, suppliers, R&D partners, and others. The methodological problem is thus, how to come from e.g. one in-depth case study to a more formalized theory or model on how firms can develop new projects and be innovative in a network. The paper is structured so that it starts with a short presentation...... of the two key concepts in our research setting and theoretical models: Innovation and networks. It is not our intention in this paper to present a lengthy discussion of the two concepts, but a short presentation is necessary to understand the validity and interpretation discussion later in the paper. Next...

  9. Laboratory research program to aid in developing and testing the validity of conceptual models for flow and transport through unsaturated porous media

    International Nuclear Information System (INIS)

    Glass, R.J.

    1991-01-01

    As part of the Yucca Mountain Project, a laboratory research program is being developed at Sandia National Laboratories that will integrate fundamental physical experimentation with conceptual model formulation and mathematical modeling and aid in subsequent model validation for unsaturated zone water and contaminant transport. Experimental systems are being developed to explore flow and transport processes and assumptions of fundamental importance to various conceptual models. Experimentation will run concurrently in two types of systems: fractured and nonfractured tuffaceous systems; and analogue systems having specific characteristics of the tuff systems but designed to maximize experimental control and resolution of data measurement. Areas in which experimentation currently is directed include infiltration flow instability, water and solute movement in unsaturated fractures, fracture-matrix interaction, and scaling laws to define effective large-scale properties for heterogeneous, fractured media. 16 refs

  10. Developing and validating a tablet version of an illness explanatory model interview for a public health survey in Pune, India.

    Directory of Open Access Journals (Sweden)

    Joseph G Giduthuri

    Full Text Available BACKGROUND: Mobile electronic devices are replacing paper-based instruments and questionnaires for epidemiological and public health research. The elimination of a data-entry step after an interview is a notable advantage over paper, saving investigator time, decreasing the time lags in managing and analyzing data, and potentially improving data quality by removing the error-prone data-entry step. Research has not yet provided adequate evidence, however, to substantiate the claim of fewer errors for computerized interviews. METHODOLOGY: We developed an Android-based illness explanatory interview for influenza vaccine acceptance and tested the instrument in a field study in Pune, India, for feasibility and acceptability. Error rates for tablet and paper were compared with reference to the voice recording of the interview as the gold standard to assess discrepancies. We also examined whether interviewers preferred the classical paper-based or the electronic version of the interview, and compared the costs of research with both data collection devices. RESULTS: In 95 interviews with household respondents, total error rates with paper and tablet devices were nearly the same (2.01% and 1.99%, respectively). Most interviewers indicated no preference for a particular device, but those with a preference opted for tablets. The initial investment in tablet-based interviews was higher compared to paper, while the recurring costs per interview were lower with the use of tablets. CONCLUSION: An Android-based tablet version of a complex interview was developed and successfully validated. Its advantages were not compromised by increased errors, and field research assistants who expressed a preference favored the Android device. Use of tablets may be more costly than paper for small samples and less costly for large studies.

  11. Development of Prediction Model and Experimental Validation in Predicting the Curcumin Content of Turmeric (Curcuma longa L.).

    Science.gov (United States)

    Akbar, Abdul; Kuanar, Ananya; Joshi, Raj K; Sandeep, I S; Mohanty, Sujata; Naik, Pradeep K; Mishra, Antaryami; Nayak, Sanghamitra

    2016-01-01

    The drug yielding potential of turmeric (Curcuma longa L.) is largely due to the presence of the phyto-constituent 'curcumin.' Curcumin has been found to possess a myriad of therapeutic activities ranging from anti-inflammatory to neuroprotective. Lack of requisite high-curcumin-containing genotypes and variation in the curcumin content of turmeric across different agro-climatic regions are the major stumbling blocks in commercial production of turmeric. Curcumin content of turmeric is greatly influenced by environmental factors. Hence, a prediction model based on an artificial neural network (ANN) was developed to map the genome-environment interaction, based on curcumin content and soil and climatic factors from different agro-climatic regions, for prediction of maximum curcumin content at various sites to facilitate the selection of suitable regions for commercial cultivation of turmeric. The ANN model was developed and tested using a data set of 119 samples collected from 8 different agro-climatic regions of Odisha. The curcumin content measured from these samples varied from 0.4% to 7.2%. The ANN model was trained with 11 parameters of soil and climatic factors as input and curcumin content as output. The results showed that a feed-forward ANN model with 8 nodes (MLFN-8) was the most suitable one, with an R2 value of 0.91. Sensitivity analysis revealed that minimum relative humidity, altitude, soil nitrogen content and soil pH had the greatest effect on curcumin content. This ANN model has shown proven efficiency for predicting and optimizing the curcumin content at a specific site.
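A rough sketch of the setup described above (one hidden layer of 8 nodes, 11 soil/climate inputs, curcumin content as output, fit assessed by R2), using scikit-learn's MLPRegressor on synthetic stand-in data; the training algorithm, the synthetic relationship and the train/test split are assumptions, not the paper's:

```python
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# 119 samples x 11 soil/climate factors (synthetic stand-ins)
X = rng.normal(size=(119, 11))
w = rng.normal(size=11)
# Synthetic curcumin content, clipped to the reported 0.4-7.2% range
y = np.clip(3.0 + 0.5 * (X @ w) + rng.normal(0.0, 0.2, 119), 0.4, 7.2)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=1)
# One hidden layer of 8 nodes, mirroring the MLFN-8 architecture
model = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                     max_iter=5000, random_state=1).fit(X_tr, y_tr)
r2 = r2_score(y_te, model.predict(X_te))
print(f"test R2: {r2:.2f}")
```

With only 119 samples, the held-out R2 is sensitive to the split; the paper's reported 0.91 would normally come from the model's own validation protocol.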

  12. Development of prediction model and experimental validation in predicting the curcumin content of turmeric (Curcuma longa L.

    Directory of Open Access Journals (Sweden)

    Abdul Akbar

    2016-10-01

    Full Text Available The drug yielding potential of turmeric (Curcuma longa L.) is largely due to the presence of the phyto-constituent ‘curcumin’. Curcumin has been found to possess a myriad of therapeutic activities ranging from anti-inflammatory to neuroprotective. Lack of requisite high-curcumin-containing genotypes and variation in the curcumin content of turmeric across different agro-climatic regions are the major stumbling blocks in commercial production of turmeric. Curcumin content of turmeric is greatly influenced by environmental factors. Hence, a prediction model based on an artificial neural network (ANN) was developed to map the genome-environment interaction, based on curcumin content and soil and climatic factors from different agro-climatic regions, for prediction of maximum curcumin content at various sites to facilitate the selection of suitable regions for commercial cultivation of turmeric. The ANN model was developed and tested using a data set of 119 samples collected from 8 different agro-climatic regions of Odisha. The curcumin content measured from these samples varied from 0.4% to 7.2%. The ANN model was trained with 11 parameters of soil and climatic factors as input and curcumin content as output. The results showed that a feed-forward ANN model with 8 nodes (MLFN-8) was the most suitable one, with an R2 value of 0.91. Sensitivity analysis revealed that minimum relative humidity, altitude, soil nitrogen content and soil pH had the greatest effect on curcumin content. This ANN model has shown proven efficiency for predicting and optimizing the curcumin content at a specific site.

  13. Development and Validation of a Risk-Score Model for Type 2 Diabetes: A Cohort Study of a Rural Adult Chinese Population.

    Directory of Open Access Journals (Sweden)

    Ming Zhang

    Full Text Available Some global models to predict the risk of diabetes may not be applicable to local populations. We aimed to develop and validate a score to predict type 2 diabetes mellitus (T2DM) in a rural adult Chinese population. Data for a cohort of 12,849 participants were randomly divided into derivation (n = 11,564) and validation (n = 1285) datasets. A questionnaire interview and physical and blood biochemical examinations were performed at baseline (July to August 2007 and July to August 2008) and follow-up (July to August 2013 and July to October 2014). A Cox regression model was used to weigh each variable in the derivation dataset. For each significant variable, a score was calculated by multiplying β by 100 and rounding to the nearest integer. Age, body mass index, triglycerides and fasting plasma glucose (scores 3, 12, 24 and 76, respectively) were predictors of incident T2DM. The model accuracy was assessed by the area under the receiver operating characteristic curve (AUC), with an optimal cut-off value of 936. With the derivation dataset, sensitivity, specificity and AUC of the model were 66.7%, 74.0% and 0.768 (95% CI 0.760-0.776), respectively. With the validation dataset, the performance of the model was superior to the Chinese (simple), FINDRISC, Oman and IDRS models of T2DM risk but equivalent to the Framingham model, which is widely applicable in a variety of populations. Our model for predicting 6-year risk of T2DM could be used in a rural adult Chinese population.
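The scoring rule above (each Cox coefficient β multiplied by 100 and rounded to an integer) can be illustrated directly. The β values here are back-derived from the published scores (3, 12, 24, 76), and the way the points are combined with a hypothetical subject's measurements is an illustrative assumption, not the study's published algorithm:

```python
def risk_scores(betas):
    """Convert Cox regression coefficients to integer point scores."""
    return {name: round(beta * 100) for name, beta in betas.items()}

# Betas back-derived from the published per-variable scores
betas = {"age": 0.03, "bmi": 0.12, "triglycerides": 0.24, "fpg": 0.76}
points = risk_scores(betas)
print(points)  # {'age': 3, 'bmi': 12, 'triglycerides': 24, 'fpg': 76}

# Hypothetical subject: total = sum of score x measured value,
# compared against the reported optimal cut-off of 936.
values = {"age": 50, "bmi": 25.0, "triglycerides": 1.5, "fpg": 5.5}
total = sum(points[k] * values[k] for k in points)
print(total)  # 904.0 -> below the 936 cut-off
```

Integer point scores like these trade a little discrimination for a rule that can be applied at the bedside without software.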

  14. Liver stiffness value-based risk estimation of late recurrence after curative resection of hepatocellular carcinoma: development and validation of a predictive model.

    Directory of Open Access Journals (Sweden)

    Kyu Sik Jung

    Full Text Available Preoperative liver stiffness (LS) measurement using transient elastography (TE) is useful for predicting late recurrence after curative resection of hepatocellular carcinoma (HCC). We developed and validated a novel LS value-based predictive model for late recurrence of HCC. Patients who were due to undergo curative resection of HCC between August 2006 and January 2010 were prospectively enrolled, and TE was performed prior to operations by study protocol. The predictive model of late recurrence was constructed based on a multiple logistic regression model. Discrimination and calibration were used to validate the model. Among a total of 139 patients who were finally analyzed, late recurrence occurred in 44 patients, with a median follow-up of 24.5 months (range, 12.4-68.1). We developed a predictive model for late recurrence of HCC using LS value, activity grade II-III, presence of multiple tumors, and indocyanine green retention rate at 15 min (ICG R15), which showed fairly good discrimination capability, with an area under the receiver operating characteristic curve (AUROC) of 0.724 (95% confidence intervals [CIs], 0.632-0.816). In the validation, using a bootstrap method to assess discrimination, the AUROC remained largely unchanged between iterations, with an average AUROC of 0.722 (95% CIs, 0.718-0.724). When we plotted a calibration chart for predicted and observed risk of late recurrence, the predicted risk correlated well with observed risk, with a correlation coefficient of 0.873 (P<0.001). A simple LS value-based predictive model could estimate the risk of late recurrence in patients who underwent curative resection of HCC.
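The bootstrap discrimination check described above can be sketched as follows: resample the cohort with replacement, recompute the AUROC on each resample, and average across iterations. Cohort size and event count match the abstract, but the risk scores and labels are synthetic stand-ins, not the study's data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 139  # the study analyzed 139 patients
y = (rng.random(n) < 44 / 139).astype(int)  # late recurrence flags
# Synthetic predicted risks that separate the two groups modestly
risk = np.clip(0.3 * y + rng.normal(0.4, 0.2, n), 0.0, 1.0)

aucs = []
for _ in range(200):
    idx = rng.integers(0, n, n)       # resample patients with replacement
    if y[idx].min() == y[idx].max():  # skip degenerate one-class samples
        continue
    aucs.append(roc_auc_score(y[idx], risk[idx]))

print(f"bootstrap mean AUROC: {np.mean(aucs):.3f} "
      f"(from {len(aucs)} iterations)")
```

The spread of the resampled AUROCs, not just their mean, is what tells you how stable the model's discrimination is in a cohort this small.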

  15. A proposed best practice model validation framework for banks

    Directory of Open Access Journals (Sweden)

    Pieter J. (Riaan) de Jongh

    2017-06-01

    Full Text Available Background: With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. The credit crisis of 2008–2009 provoked added concern about the use of models in finance. Measuring and managing model risk has subsequently come under scrutiny from regulators, supervisors, banks and other financial institutions. Regulatory guidance indicates that meticulous monitoring of all phases of model development and implementation is required to mitigate this risk. Considerable resources must be mobilised for this purpose. The exercise must embrace model development, assembly, implementation, validation and effective governance. Setting: Model validation practices are generally patchy, disparate and sometimes contradictory, and although the Basel Accord and some regulatory authorities have attempted to establish guiding principles, no definite set of global standards exists. Aim: Assessing the available literature for the best validation practices. Methods: This comprehensive literature study provided a background to the complexities of effective model management and focussed on model validation as a component of model risk management. Results: We propose a coherent ‘best practice’ framework for model validation. Scorecard tools are also presented to evaluate if the proposed best practice model validation framework has been adequately assembled and implemented. Conclusion: The proposed best practice model validation framework is designed to assist firms in the construction of an effective, robust and fully compliant model validation programme and comprises three principal elements: model validation governance, policy and process.

  16. Development and validation of a new virtual source model for portal image prediction and treatment quality control

    International Nuclear Information System (INIS)

    Chabert, Isabelle

    2015-01-01

    Intensity-modulated radiation therapy (IMRT) requires extensive verification procedures to ensure correct dose delivery. Electronic portal imaging devices (EPIDs) are widely used for quality assurance in radiotherapy, and also for dosimetric verifications. For this latter application, the images obtained during the treatment session can be compared to a pre-calculated reference image in order to highlight dose delivery errors. The quality control performance depends (1) on the accuracy of the pre-calculated reference image and (2) on the ability of the image-comparison tool to detect errors. These two key points were studied during this PhD work. We chose to use a Monte Carlo (MC)-based method developed in the laboratory, based on the DPGLM (Dirichlet process generalized linear model) de-noising technique, to predict high-resolution reference images. A model of the studied linear accelerator (linac Synergy, Elekta, Crawley, UK) was first developed using the PENELOPE MC codes, and then commissioned using measurements acquired at the Hôpital Nord in Marseille. A 71 GB phase space file (PSF) stored under the flattening filter was then analyzed to build a new kind of virtual source model (VSM) based on correlated histograms (200 MB). This new and compact VSM is as accurate as the PSF for calculating dose distributions in water if histogram sampling is based on an adaptive method. The associated EPID modelling in PENELOPE suggests that the hypotheses about the linac primary source were too simple and should be reconsidered. The use of the VSM to predict high-resolution portal images, however, led to excellent results. The VSM associated with the linac and EPID MC models was used to detect errors in IMRT treatment plans. A preliminary study was conducted by introducing treatment errors on purpose into portal image calculations (primary source parameters, phantom position and morphology changes). The γ-index commonly used in clinical routine appears to be less effective than the

  17. Development and Validation of a New Blade Element Momentum Skewed-Wake Model within AeroDyn: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ning, S. A.; Hayman, G.; Damiani, R.; Jonkman, J.

    2014-12-01

    Blade element momentum methods, though conceptually simple, are highly useful for analyzing wind turbine aerodynamics and are widely used in many design and analysis applications. A new version of AeroDyn is being developed to take advantage of new robust solution methodologies, conform to a new modularization framework for the National Renewable Energy Laboratory's FAST, utilize advanced skewed-wake analysis methods, fix limitations of previous implementations, and enable modeling of highly flexible and nonstraight blades. This paper reviews blade element momentum theory and several of the options available for analyzing skewed inflow. AeroDyn implementation details are described for the benefit of users and developers. These new options are compared to solutions from the previous version of AeroDyn and to experimental data. Finally, recommendations are given on how one might select from the various available solution approaches.
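One family of skewed-inflow options for BEM codes modulates the axial induction factor with blade azimuth. A minimal sketch, assuming the common Pitt-Peters-style textbook form (the exact variant and coefficients implemented in AeroDyn may differ):

```python
import math

def skewed_wake_induction(a, r_over_R, chi, psi):
    """Axial induction factor corrected for wake skew (Pitt-Peters form).

    a        -- uncorrected axial induction factor
    r_over_R -- normalized radial station of the blade element
    chi      -- wake skew angle in radians
    psi      -- blade azimuth angle in radians
    """
    return a * (1.0 + (15.0 * math.pi / 32.0)
                * r_over_R * math.tan(chi / 2.0) * math.cos(psi))

# With no skew (chi = 0) the correction vanishes:
print(skewed_wake_induction(0.3, 0.7, 0.0, 0.0))  # 0.3
# With 20 deg of skew, induction rises on one side of the rotor disk:
print(skewed_wake_induction(0.3, 0.7, math.radians(20.0), 0.0))
```

The azimuthal cosine term is what produces the once-per-revolution loading variation that an unskewed BEM model cannot capture.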

  18. Computational Model Prediction and Biological Validation Using Simplified Mixed Field Exposures for the Development of a GCR Reference Field

    Science.gov (United States)

    Hada, M.; Rhone, J.; Beitman, A.; Saganti, P.; Plante, I.; Ponomarev, A.; Slaba, T.; Patel, Z.

    2018-01-01

    The yield of chromosomal aberrations has been shown to increase in the lymphocytes of astronauts after long-duration missions of several months in space. Chromosome exchanges, especially translocations, are positively correlated with many cancers and are therefore a potential biomarker of cancer risk associated with radiation exposure. Although extensive studies have been carried out on the induction of chromosomal aberrations by low- and high-LET radiation in human lymphocytes, fibroblasts, and epithelial cells exposed in vitro, there is a lack of data on chromosome aberrations induced by low-dose-rate chronic exposure and by mixed field beams such as those expected in space. Chromosome aberration studies at NSRL will provide the biological validation needed to extend the computational models over a broader range of experimental conditions (more complicated mixed fields leading up to the galactic cosmic ray (GCR) simulator), helping to reduce uncertainties in radiation quality effects and dose-rate dependence in cancer risk models. These models can then be used to answer some of the open questions regarding requirements for a full GCR reference field, including particle type and number, energy, dose rate, and delivery order. In this study, we designed a simplified mixed field beam with a combination of proton, helium, oxygen, and iron ions with shielding, or proton, helium, oxygen, and titanium ions without shielding. Human fibroblast cells were irradiated with these mixed field beams as well as with each single beam at acute and chronic dose rates, and chromosome aberrations (CA) were measured with 3-color fluorescence in situ hybridization (FISH) chromosome painting methods. The frequencies and types of CA induced at acute and chronic dose rates with single and mixed field beams will be discussed. A computational chromosome and radiation-induced DNA damage model, BDSTRACKS (Biological Damage by Stochastic Tracks), was updated to simulate various types of CA induced by

  19. [Risk Prediction Using Routine Data: Development and Validation of Multivariable Models Predicting 30- and 90-day Mortality after Surgical Treatment of Colorectal Cancer].

    Science.gov (United States)

    Crispin, Alexander; Strahwald, Brigitte; Cheney, Catherine; Mansmann, Ulrich

    2018-06-04

    Quality control, benchmarking, and pay for performance (P4P) require valid indicators and statistical models allowing adjustment for differences in the risk profiles of the patient populations of the respective institutions. Using hospital remuneration data for measuring quality and modelling patient risks has been criticized by clinicians. Here we explore the potential of prediction models for 30- and 90-day mortality after colorectal cancer surgery based on routine data. Full census of a major statutory health insurer. Surgical departments throughout the Federal Republic of Germany. 4283 and 4124 insurants with major surgery for treatment of colorectal cancer during 2013 and 2014, respectively. Age, sex, primary and secondary diagnoses as well as tumor locations as recorded in the hospital remuneration data according to §301 SGB V. 30- and 90-day mortality. Elixhauser comorbidities, Charlson conditions, and Charlson scores were generated from the ICD-10 diagnoses. Multivariable prediction models were developed using a penalized logistic regression approach (logistic ridge regression) in a derivation set (patients treated in 2013). Calibration and discrimination of the models were assessed in an internal validation sample (patients treated in 2014) using calibration curves, Brier scores, receiver operating characteristic (ROC) curves and the areas under the ROC curves (AUC). 30- and 90-day mortality rates in the derivation sample were 5.7% and 8.4%, respectively. The corresponding values in the validation sample were 5.9% and again 8.4%. Models based on Elixhauser comorbidities exhibited the highest discriminatory power, with AUC values of 0.804 (95% CI: 0.776-0.832) and 0.805 (95% CI: 0.782-0.828) for 30- and 90-day mortality. The Brier scores for these models were 0.050 (95% CI: 0.044-0.056) and 0.067 (95% CI: 0.060-0.074), similar to the models based on Charlson conditions. Regardless of the model, low predicted probabilities were well calibrated, while
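The validation pattern described above, fitting a ridge-penalized logistic regression on one year's cohort and scoring discrimination (AUC) and overall accuracy (Brier score) on the next year's, can be sketched with scikit-learn. Cohort sizes match the abstract; the covariates and outcome mechanism are synthetic assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import brier_score_loss, roc_auc_score

rng = np.random.default_rng(3)

def make_cohort(n):
    """Synthetic stand-in for age/sex/comorbidity covariates and deaths."""
    X = rng.normal(size=(n, 10))
    logit = -3.0 + X[:, 0] + 0.5 * X[:, 1]  # rare-event baseline risk
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)
    return X, y

X_2013, y_2013 = make_cohort(4283)  # derivation set
X_2014, y_2014 = make_cohort(4124)  # internal validation set

# penalty="l2" makes this a ridge-penalized (logistic ridge) regression
model = LogisticRegression(penalty="l2", C=1.0).fit(X_2013, y_2013)
p = model.predict_proba(X_2014)[:, 1]
brier = brier_score_loss(y_2014, p)
auc = roc_auc_score(y_2014, p)
print(f"Brier: {brier:.3f}  AUC: {auc:.3f}")
```

Using a later year as the validation sample, as the study does, tests temporal transportability rather than just random-split generalization.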

  20. Development and Validation of the Total HUman Model for Safety (THUMS) Version 5 Containing Multiple 1D Muscles for Estimating Occupant Motions with Muscle Activation During Side Impacts.

    Science.gov (United States)

    Iwamoto, Masami; Nakahira, Yuko

    2015-11-01

    Accurate prediction of occupant head kinematics is critical for better understanding of head/face injury mechanisms in side impacts, especially for far-side occupants. Because researchers have demonstrated that muscle activations, especially in the neck muscles, can affect occupant head kinematics, a human body finite element (FE) model that considers muscle activation is useful for predicting occupant head kinematics in real-world automotive accidents. In this study, we developed a human body FE model called THUMS (Total HUman Model for Safety) Version 5 that contains 262 one-dimensional (1D) Hill-type muscle models over the entire body. The THUMS was validated against 36 series of PMHS (Post Mortem Human Surrogate) and volunteer test data in this study, and 16 series of PMHS and volunteer test data on side impacts are presented. Validation results with force-time curves were also evaluated quantitatively using the CORA (CORrelation and Analysis) method. The validation results suggest that the THUMS has good biofidelity in regional and full-body responses for side impacts, but relatively poor biofidelity at the local level, such as brain displacements. Occupant kinematics predicted by the THUMS with a muscle controller using 22 PID (Proportional-Integral-Derivative) controllers were compared with volunteer test data on low-speed lateral impacts. The THUMS with the muscle controller reproduced the head kinematics of the volunteer data more accurately than that without muscle activation, although further studies on validation of torso kinematics are needed for more accurate predictions of occupant head kinematics.
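A PID controller of the kind used to drive muscle activation in the model above (the paper uses 22 of them) has a simple generic form. The gains, target and toy plant below are illustrative assumptions, not the paper's values:

```python
class PID:
    """Textbook PID controller; gains here are illustrative."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured):
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# Drive a toy first-order "head angle" plant toward a target posture
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
angle = 0.0
for _ in range(3000):  # 30 s of simulated time
    activation = pid.update(target=10.0, measured=angle)
    angle += (activation - 0.2 * angle) * 0.01  # toy plant dynamics

print(f"final angle: {angle:.2f}")
```

In the FE model the "plant" is the musculoskeletal body and the controller output is a muscle activation level, but the error-feedback structure is the same.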

  1. Laboratory research program to aid in developing and testing the validity of conceptual models for flow and transport through unsaturated porous media

    International Nuclear Information System (INIS)

    Glass, R.J.

    1990-01-01

    As part of the Yucca Mountain Project, a laboratory research program is being developed at Sandia National Laboratories that will integrate fundamental physical experimentation with conceptual formulation and mathematical modeling and aid in subsequent model validation for unsaturated zone water and contaminant transport. Experimental systems are being developed to explore flow and transport processes and assumptions of fundamental importance to various conceptual models. Experimentation will run concurrently in two types of systems: fractured and nonfractured tuffaceous systems; and analogue systems having specific characteristics of the tuff systems but designed to maximize experimental control and resolution of data measurement. Questions to which experimentation currently is directed include infiltration flow instability, water and solute movement in unsaturated fractures, fracture-matrix interaction, and the definition of effective large-scale properties for heterogeneous, fractured media. 16 refs

  2. Development of multi-component diesel surrogate fuel models – Part I: Validation of reduced mechanisms of diesel fuel constituents in 0-D kinetic simulations

    DEFF Research Database (Denmark)

    Poon, Hiew Mun; Pang, Kar Mun; Ng, Hoon Kiat

    2016-01-01

    In the present work, development and validation of reduced chemical kinetic mechanisms for several different hydrocarbons are performed. These hydrocarbons are potential representatives of practical diesel fuel constituents. n-Hexadecane (HXN), 2,2,4,4,6,8,8-heptamethylnonane (HMN), cyclohexane...... (CHX) and toluene are selected to represent straight-alkane, branched-alkane, cyclo-alkane and aromatic compounds in the diesel fuel. A five-stage chemical kinetic mechanism reduction scheme formulated in the previous work is applied to develop the reduced HMN and CHX models based on their respective...... detailed mechanisms. Alongside the development of the reduced CHX model, a skeletal toluene sub-mechanism is constructed, since the elementary reactions for toluene are a subset of the detailed CHX mechanism. The final reduced HMN mechanism comprises 89 species with 319 elementary reactions, while...

  3. Assessment of leaf carotenoids content with a new carotenoid index: Development and validation on experimental and model data

    Science.gov (United States)

    Zhou, Xianfeng; Huang, Wenjiang; Kong, Weiping; Ye, Huichun; Dong, Yingying; Casa, Raffaele

    2017-05-01

    Leaf carotenoids content (LCar) is an important indicator of plant physiological status. Accurate estimation of LCar provides valuable insight into early detection of stress in vegetation. With spectroscopy techniques, a semi-empirical approach based on spectral indices was extensively used for carotenoids content estimation. However, established spectral indices for carotenoids that generally rely on limited measured data, might lack predictive accuracy for carotenoids estimation in various species and at different growth stages. In this study, we propose a new carotenoid index (CARI) for LCar assessment based on a large synthetic dataset simulated from the leaf radiative transfer model PROSPECT-5, and evaluate its capability with both simulated data from PROSPECT-5 and 4SAIL and extensive experimental datasets: the ANGERS dataset and experimental data acquired in field experiments in China in 2004. Results show that CARI was the index most linearly correlated with carotenoids content at the leaf level using a synthetic dataset (R2 = 0.943, RMSE = 1.196 μg/cm2), compared with published spectral indices. Cross-validation results with CARI using ANGERS data achieved quite an accurate estimation (R2 = 0.545, RMSE = 3.413 μg/cm2), though the RBRI performed as the best index (R2 = 0.727, RMSE = 2.640 μg/cm2). CARI also showed good accuracy (R2 = 0.639, RMSE = 1.520 μg/cm2) for LCar assessment with leaf level field survey data, though PRI performed better (R2 = 0.710, RMSE = 1.369 μg/cm2). Whereas RBRI, PRI and other assessed spectral indices showed a good performance for a given dataset, overall their estimation accuracy was not consistent across all datasets used in this study. Conversely CARI was more robust showing good results in all datasets. Further assessment of LCar with simulated and measured canopy reflectance data indicated that CARI might not be very sensitive to LCar changes at low leaf area index (LAI) value, and in these conditions soil moisture

  4. Development and validation of double and single Wiebe function for multi-injection mode Diesel engine combustion modelling for hardware-in-the-loop applications

    International Nuclear Information System (INIS)

    Maroteaux, Fadila; Saad, Charbel; Aubertin, Fabrice

    2015-01-01

    Highlights: • Modelling of Diesel engine combustion with multi-injection mode was conducted. • Double and single Wiebe correlations for pilot, main and post combustion processes were calibrated. • Ignition delay time correlations were developed and calibrated using experimental data for each injection. • The complete in-cylinder model has been applied successfully to real-time simulations on a HiL test bed. - Abstract: The improvement of Diesel engine performance in terms of fuel consumption and pollutant emissions has a huge impact on the management system and diagnostic procedure. Validation and testing of engine performance can benefit from the use of theoretical models, for the reduction of development time and costs. A hardware-in-the-loop (HiL) test bench is a suitable way to achieve these objectives. However, the increasing complexity of management systems raises challenges for the development of very reduced physical models able to run in real-time applications. This paper presents an extension of a previously developed phenomenological Diesel combustion model suitable for real-time applications on a HiL test bench. In the earlier study, the modelling efforts were targeted at high engine speeds with a very short computational time window, where the engine operates with a single injection. In the present work, modelling of in-cylinder processes at low and medium engine speeds with multi-injection is performed. In order to reach an adequate computational time, the combustion progress during the pilot and main injection periods has been treated through a double Wiebe function, while the post combustion period has required a single Wiebe function. This paper describes the basic system models and their calibration and validation against experimental data. The use of the developed correlations of Wiebe coefficients and ignition delay times for each combustion phase, included in the in-cylinder crank angle global model, is applied for the prediction
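The single Wiebe function gives the cumulative mass fraction burned as x(θ) = 1 − exp(−a·((θ − θ0)/Δθ)^(m+1)), and a double Wiebe is a weighted sum of two such curves (pilot + main). The coefficient values below are common textbook defaults (a ≈ 6.9 for ~99.9% burn-out), not the paper's calibrated values:

```python
import math

def wiebe(theta, theta0, dtheta, a=6.9, m=2.0):
    """Single Wiebe cumulative burn fraction; zero before combustion start.

    theta0 -- start of combustion (deg crank angle)
    dtheta -- combustion duration (deg crank angle)
    """
    if theta < theta0:
        return 0.0
    return 1.0 - math.exp(-a * ((theta - theta0) / dtheta) ** (m + 1.0))

def double_wiebe(theta, frac_pilot, pilot, main):
    """Weighted sum of pilot and main Wiebe curves.

    pilot, main -- (theta0, dtheta) tuples for each combustion phase.
    """
    return (frac_pilot * wiebe(theta, *pilot)
            + (1.0 - frac_pilot) * wiebe(theta, *main))

# Burn fraction at 30 deg ATDC: pilot starts at -10 deg, main at 0 deg
x = double_wiebe(30.0, frac_pilot=0.15, pilot=(-10.0, 15.0),
                 main=(0.0, 40.0))
print(f"{x:.3f}")
```

In the full model, each Wiebe curve's θ0 comes from the corresponding ignition delay correlation, which is what ties the injection events to the burn profile.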

  5. Development and validation of predictive simulation model of multi-layer repair welding process by temper bead technique

    International Nuclear Information System (INIS)

    Okano, Shigetaka; Miyasaka, Fumikazu; Mochizuki, Masahito; Tanaka, Manabu

    2015-01-01

    Stress corrosion cracking (SCC) has recently been observed in the nickel-base alloy weld metal of dissimilar pipe joints used in pressurized water reactors (PWR). The temper bead technique has been developed as a repair procedure against SCC, applicable in cases where post-weld heat treatment (PWHT) is difficult to carry out. It is essential, however, to pass the property and performance qualification test to confirm the effect of tempering on the mechanical properties of repair welds before the temper bead technique is actually used in practice. Thus the appropriate welding procedure conditions in the temper bead technique are determined on the basis of property and performance qualification testing. This is necessary for certifying the structural soundness and reliability of repair welds, but it takes a lot of work and time in the present circumstances. It is therefore desirable to establish reasonable alternatives for qualifying the property and performance of repair welds. In this study, mathematical modeling and numerical simulation procedures were developed for predicting weld bead configuration and temperature distribution during the multi-layer repair welding process by the temper bead technique. In the developed simulation technique, the characteristics of the heat source in temper bead welding are calculated from weld heat input conditions through an arc plasma simulation, and the weld bead configuration and temperature distribution during temper bead welding are then calculated from the obtained heat source characteristics through a coupled analysis of bead surface shape and thermal conduction. The simulation results were compared with experimental results under the same welding heat input conditions. The bead surface shape and temperature distribution, such as the Ac1 lines, were in good agreement between simulation and experiment. It was concluded that the developed simulation technique has the potential to become useful for

  6. Development and validation of a modified Hybrid-III six-year-old dummy model for simulating submarining in motor-vehicle crashes.

    Science.gov (United States)

    Hu, Jingwen; Klinich, Kathleen D; Reed, Matthew P; Kokkolaras, Michael; Rupp, Jonathan D

    2012-06-01

    In motor-vehicle crashes, young school-aged children restrained by vehicle seat belt systems often suffer from abdominal injuries due to submarining. However, the current anthropomorphic test device, the so-called "crash dummy," is not adequate for properly simulating submarining. In this study, a modified Hybrid-III six-year-old dummy model capable of simulating and predicting submarining was developed using MADYMO (TNO Automotive Safety Solutions). The model incorporated improved pelvis and abdomen geometry and properties previously tested in a modified physical dummy. The model was calibrated and validated against four sled tests under two test conditions, with and without submarining, using a multi-objective optimization method. A sensitivity analysis using this validated child dummy model showed that dummy knee excursion, torso rotation angle, and the difference between head and knee excursions were good predictors of submarining status. It was also shown that restraint system design variables, such as lap belt angle, D-ring height, and seat coefficient of friction (COF), may have opposite effects on head and abdomen injury risks; therefore, child dummies and dummy models capable of simulating submarining are crucial for future restraint system design optimization for young school-aged children. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.

  7. SU-E-J-244: Development and Validation of a Knowledge Based Planning Model for External Beam Radiation Therapy of Locally Advanced Non-Small Cell Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Z; Kennedy, A [Sarah Cannon, Nashville, TN (United States); Larsen, E; Hayes, C; Grow, A [North Florida Cancer Center, Gainesville, FL (United States); Bahamondes, S.; Zheng, Y; Wu, X [JFK Comprehensive Cancer Institute, Lake Worth, FL (United States); Choi, M; Pai, S [Good Samaritan Hospital, Los Gatos, CA (United States); Li, J [Doctors Hospital of Augusta, Augusta, GA (United States); Cranford, K [Trident Medical Center, Charleston, SC (United States)

    2015-06-15

    Purpose: The study aims to develop and validate a knowledge based planning (KBP) model for external beam radiation therapy of locally advanced non-small cell lung cancer (LA-NSCLC). Methods: RapidPlan™ technology was used to develop a lung KBP model. Plans from 65 patients with LA-NSCLC were used to train the model; 25 patients had been treated with VMAT and the remaining 40 with IMRT. Organs-at-risk (OARs) included the right lung, left lung, heart, esophagus, and spinal cord. DVHs and geometric distribution DVHs (GEDVHs) were extracted from the treated plans. The model was trained using principal component analysis and step-wise multiple regression. Box plot and regression plot tools were used to identify geometric and dosimetric outliers and to help fine-tune the model. Validation was performed by (a) comparing predicted DVH boundaries to the actual DVHs of 63 patients and (b) using an independent set of treatment planning data. Results: 63 of the 65 plans were included in the final KBP model, with PTV volumes ranging from 102.5 cc to 1450.2 cc. Total prescribed treatment dose varied from 50 Gy to 70 Gy based on institutional guidelines. One patient was excluded as a geometric outlier because 2.18 cc of spinal cord was included in the PTV; the other was excluded as a dosimetric outlier because dose sparing of the spinal cord was heavily enforced in the clinical plan. Target volume, OAR volume, OAR overlap volume percentage with the target, and OAR out-of-field volume were included in the trained model. The lungs and heart had two principal component scores of GEDVH, whereas the spinal cord and esophagus had three in the final model. The predicted DVH band (mean ± 1 standard deviation) covered 66.2 ± 3.6% of all DVHs. Conclusion: A KBP model was developed and validated for radiotherapy of LA-NSCLC in a commercial treatment planning system. Its clinical implementation may improve the consistency of IMRT/VMAT planning.
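
The core mechanics of a KBP model of this kind, reducing each OAR DVH curve to a few principal-component scores and regressing those scores on geometric features, can be sketched as below. This is a minimal illustration with synthetic data, not the RapidPlan™ algorithm; the feature (overlap percentage), DVH shapes, and dimensions are all invented.

```python
import numpy as np

# Hypothetical KBP sketch: reduce DVH curves to principal-component scores,
# then regress the first score on a geometric feature (OAR-target overlap).
rng = np.random.default_rng(0)

n_plans, n_dose_bins = 40, 50
overlap_pct = rng.uniform(0, 30, n_plans)       # invented OAR-target overlap (%)
dose_axis = np.linspace(0, 70, n_dose_bins)     # dose (Gy)

# Synthetic DVHs: volume fraction receiving >= dose, shaped by the overlap
dvhs = np.array([1 / (1 + np.exp((dose_axis - 20 - 0.8 * o) / 5))
                 for o in overlap_pct])

# PCA via SVD on the mean-centred DVH matrix
mean_dvh = dvhs.mean(axis=0)
U, s, Vt = np.linalg.svd(dvhs - mean_dvh, full_matrices=False)
scores = U[:, :2] * s[:2]                       # first two PC scores per plan

# Step-wise regression reduced to a single step: PC1 score vs. overlap
A = np.column_stack([np.ones(n_plans), overlap_pct])
coef, *_ = np.linalg.lstsq(A, scores[:, 0], rcond=None)

# Predict a new plan's DVH from its geometry alone
new_overlap = 15.0
pc1_pred = coef[0] + coef[1] * new_overlap
predicted_dvh = mean_dvh + pc1_pred * Vt[0]
print(predicted_dvh[0], predicted_dvh[-1])      # near 1 at 0 Gy, small at 70 Gy
```

A real system would use several geometric features, more components per OAR (two or three, as in the abstract), and would predict DVH bands rather than a single curve.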

  8. Development and validation of an in vitro pharmacokinetic/pharmacodynamic model to test the antibacterial efficacy of antibiotic polymer conjugates.

    Science.gov (United States)

    Azzopardi, Ernest A; Ferguson, Elaine L; Thomas, David W

    2015-04-01

    This study describes the use of a novel, two-compartment, static dialysis bag model to study the release, diffusion, and antibacterial activity of a novel, bioresponsive dextrin-colistin polymer conjugate against multidrug-resistant (MDR) wild-type Acinetobacter baumannii. In this model, colistin sulfate, at its MIC, produced a rapid and extensive drop in viable bacterial counts, whereas the dextrin-colistin conjugate (1× MIC) suppressed bacterial growth for up to 48 h, with 3 log10 CFU/ml lower bacterial counts after 48 h than those of controls. Doubling the concentration of dextrin-colistin conjugate (to 2× MIC) led to an initial bacterial killing of 3 log10 CFU/ml at 8 h, with a regrowth profile thereafter similar to that of the 1× MIC treatment. The addition of colistin sulfate (1× MIC) to dextrin-colistin conjugate (1× MIC) resulted in undetectable bacterial counts after 4 h, followed by suppressed bacterial growth (3.5 log10 CFU/ml lower than that of the control at 48 h). Incubation of dextrin-colistin conjugates with infected wound exudate from a series of burn patients (n = 6) revealed an increasing concentration of unmasked colistin in the outer compartment (OC) over time (up to 86.3% of the initial dose at 48 h), confirming that colistin would be liberated from the conjugate by endogenous α-amylase within the wound environment. These studies confirm the utility of this model system to simulate the pharmacokinetics of colistin liberation in humans administered dextrin-colistin conjugates and further support the development of antibiotic polymer conjugates for the treatment of MDR infections. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  9. Development and validation of a multilevel model for predicting workload under routine and nonroutine conditions in an air traffic management center.

    Science.gov (United States)

    Neal, Andrew; Hannah, Sam; Sanderson, Penelope; Bolland, Scott; Mooij, Martijn; Murphy, Sean

    2014-03-01

    The aim of this study was to develop a model capable of predicting variability in the mental workload experienced by frontline operators under routine and nonroutine conditions. Excess workload is a risk that needs to be managed in safety-critical industries. Predictive models are needed to manage this risk effectively, yet they are difficult to develop. Much of the difficulty stems from the fact that workload prediction is a multilevel problem. A multilevel workload model was developed in Study 1 with data collected from an en route air traffic management center. Dynamic density metrics were used to predict variability in workload within and between work units while controlling for variability among raters. The model was cross-validated in Studies 2 and 3 with the use of a high-fidelity simulator. Reported workload generally remained within the bounds of the 90% prediction interval in Studies 2 and 3. Workload crossed the upper bound of the prediction interval only under nonroutine conditions. Qualitative analyses suggest that nonroutine events caused workload to cross the upper bound of the prediction interval because the controllers could not manage their workload strategically. The model performed well under both routine and nonroutine conditions and over different patterns of workload variation. Workload prediction models can be used to support both strategic and tactical workload management. Strategic uses include the analysis of historical and projected workflows and the assessment of staffing needs. Tactical uses include the dynamic reallocation of resources to meet changes in demand.
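
The prediction-interval logic described above can be sketched with a toy regression: fit workload on a dynamic-density metric under routine conditions, form a 90% interval, and flag observations that fall outside it. The data, coefficients, and interval construction below are invented for illustration and are not the paper's multilevel model.

```python
import numpy as np

# Invented routine-traffic data: workload driven by one dynamic-density metric
rng = np.random.default_rng(1)
traffic_count = rng.uniform(5, 25, 200)
workload = 1.2 + 0.3 * traffic_count + rng.normal(0, 0.8, 200)

# Ordinary least squares fit and residual spread
A = np.column_stack([np.ones_like(traffic_count), traffic_count])
coef, *_ = np.linalg.lstsq(A, workload, rcond=None)
resid_sd = np.std(workload - A @ coef, ddof=2)

Z90 = 1.645  # normal quantile for a two-sided 90% interval (simplification)

def in_interval(count, observed):
    """True if an observed workload lies inside the 90% prediction interval."""
    pred = coef[0] + coef[1] * count
    return abs(observed - pred) <= Z90 * resid_sd

print(in_interval(15.0, 1.2 + 0.3 * 15.0))        # routine observation
print(in_interval(15.0, 1.2 + 0.3 * 15.0 + 5.0))  # nonroutine spike
```

A proper implementation would widen the interval for the uncertainty of the fitted coefficients and, as in the paper, account for rater-level and unit-level variance components.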

  10. The development and validation of a clinical prediction model to determine the probability of MODY in patients with young-onset diabetes.

    Science.gov (United States)

    Shields, B M; McDonald, T J; Ellard, S; Campbell, M J; Hyde, C; Hattersley, A T

    2012-05-01

    Diagnosing MODY is difficult. To date, selection for molecular genetic testing for MODY has used discrete cut-offs of limited clinical characteristics with varying sensitivity and specificity. We aimed to use multiple, weighted, clinical criteria to determine an individual's probability of having MODY, as a crucial tool for rational genetic testing. We developed prediction models using logistic regression on data from 1,191 patients with MODY (n = 594), type 1 diabetes (n = 278) and type 2 diabetes (n = 319). Model performance was assessed by receiver operating characteristic (ROC) curves, cross-validation and validation in a further 350 patients. The models defined an overall probability of MODY using a weighted combination of the most discriminative characteristics. For MODY, compared with type 1 diabetes, these were: lower HbA1c, parent with diabetes, female sex and older age at diagnosis. MODY was discriminated from type 2 diabetes by: lower BMI, younger age at diagnosis, female sex, lower HbA1c, parent with diabetes, and not being treated with oral hypoglycaemic agents or insulin. Both models showed excellent discrimination (c-statistic = 0.95 and 0.98, respectively), low rates of cross-validated misclassification (9.2% and 5.3%), and good performance on the external test dataset (c-statistic = 0.95 and 0.94). Using the optimal cut-offs, the probability models improved the sensitivity (91% vs 72%) and specificity (94% vs 91%) for identifying MODY compared with standard criteria for diagnosing MODY. This allows an improved and more rational approach to determining who should have molecular genetic testing.
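
The "weighted combination of characteristics" is a logistic regression: a linear score over clinical features mapped through the logistic function. The sketch below shows the mechanics only; the coefficients are invented for illustration and are not the published model's weights.

```python
import math

# Hypothetical MODY-vs-type-1 probability model. Signs follow the abstract
# (lower HbA1c, affected parent, female sex, older age at diagnosis all point
# toward MODY); the numeric weights are invented.
def mody_probability(hba1c, parent_with_diabetes, female, age_at_diagnosis):
    z = (2.0
         - 0.6 * hba1c
         + 1.5 * (1 if parent_with_diabetes else 0)
         + 0.4 * (1 if female else 0)
         + 0.05 * age_at_diagnosis)
    return 1 / (1 + math.exp(-z))  # logistic link: score -> probability

p = mody_probability(hba1c=6.5, parent_with_diabetes=True, female=True,
                     age_at_diagnosis=14)
print(round(p, 3))
```

Applying an optimal probability cut-off to such a score, rather than hard cut-offs on individual characteristics, is what yields the improved sensitivity and specificity reported above.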

  11. Development and validation of an in vitro–in vivo correlation (IVIVC) model for propranolol hydrochloride extended-release matrix formulations

    Directory of Open Access Journals (Sweden)

    Chinhwa Cheng

    2014-06-01

    The objective of this study was to develop an in vitro–in vivo correlation (IVIVC) model for hydrophilic matrix extended-release (ER) propranolol dosage formulations. The in vitro release characteristics of the drug were determined using USP apparatus I at 100 rpm, in a medium of varying pH (from pH 1.2 to pH 6.8). In vivo plasma concentrations and pharmacokinetic parameters in male beagle dogs were obtained after administering oral ER formulations and immediate-release (IR) commercial products. The similarity factor f2 was used to compare the dissolution data. The IVIVC model was developed using the pooled fraction dissolved and fraction absorbed of propranolol ER formulations ER-F and ER-S, which have different release rates. An additional formulation, ER-V, with a different release rate of propranolol, was prepared to evaluate external predictability. The results showed that the percentage prediction error (%PE) values of Cmax and AUC0–∞ were 0.86% and 5.95%, respectively, for the external validation study. The observed low prediction errors for Cmax and AUC0–∞ demonstrated that the propranolol IVIVC model was valid.
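
Two quantities used in this study have standard definitions that are easy to compute: the f2 similarity factor for comparing dissolution profiles (f2 > 50 is conventionally taken as "similar") and the percentage prediction error used in external IVIVC validation. The profile values below are invented example data.

```python
import math

def f2_similarity(reference, test):
    """Standard f2 similarity factor between two dissolution profiles
    (% dissolved at matched time points)."""
    n = len(reference)
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50 * math.log10(100 / math.sqrt(1 + mse))

def percent_pe(observed, predicted):
    """Percentage prediction error, %PE = |obs - pred| / obs * 100."""
    return abs(observed - predicted) / observed * 100

ref  = [18, 35, 52, 68, 80, 90]   # % dissolved, reference profile (invented)
test = [15, 32, 50, 66, 79, 89]   # % dissolved, test profile (invented)
print(round(f2_similarity(ref, test), 1))
print(round(percent_pe(observed=210.0, predicted=197.5), 2))
```

Identical profiles give f2 = 100, and f2 falls as the profiles diverge; the %PE form matches the 0.86% and 5.95% external-validation figures quoted in the abstract.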

  12. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen when testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by the steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, illustrating how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
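
The basic balance behind a hydrostatic column model is that wellhead pressure equals the pressure at depth minus the weight of the stacked fluid columns in the well. The sketch below illustrates only that balance; the densities, column heights, and cavern pressure are invented, and the real HCM additionally tracks interface movement and nitrogen compressibility.

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(cavern_pressure_pa, columns):
    """columns: list of (density kg/m^3, height m), wellhead downward.
    Returns wellhead pressure = pressure at depth minus hydrostatic head."""
    hydrostatic = sum(rho * G * h for rho, h in columns)
    return cavern_pressure_pa - hydrostatic

# Invented example: nitrogen over crude oil over brine
cols = [(180.0, 300.0),    # compressed nitrogen column
        (850.0, 200.0),    # crude oil column
        (1200.0, 100.0)]   # brine column
p_wh = wellhead_pressure(12.0e6, cols)
print(round(p_wh / 1e6, 3), "MPa")
```

In leak detection, a measured wellhead pressure drifting away from this hydrostatic prediction (with the interfaces accounted for) is the signature that distinguishes a small leak from normal tight-well behavior.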

  13. Natural analogues and radionuclide transport model validation

    International Nuclear Information System (INIS)

    Lever, D.A.

    1987-08-01

    In this paper, some possible roles for natural analogues are discussed from the point of view of those involved with the development of mathematical models for radionuclide transport and with the use of these models in repository safety assessments. The characteristic features of a safety assessment are outlined in order to address the questions of where natural analogues can be used to improve our understanding of the processes involved and where they can assist in validating the models that are used. Natural analogues have the potential to provide useful information about some critical processes, especially long-term chemical processes and migration rates. There is likely to be considerable uncertainty and ambiguity associated with the interpretation of natural analogues, and thus it is their general features which should be emphasized, and models with appropriate levels of sophistication should be used. Experience gained in modelling the Koongarra uranium deposit in northern Australia is drawn upon. (author)

  14. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  15. Development and validation of models for bubble coalescence and breakup. Final report; Entwicklung und Validierung von Modellen fuer Blasenkoaleszenz und -zerfall. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Y.; Lucas, D.

    2013-02-15

    A new generalized model for bubble coalescence and breakup has been developed. It is based on physical considerations and takes into account the various mechanisms that can lead to bubble coalescence and breakup. First, in a detailed literature review, the available models were compiled and analyzed. It turned out that many of them show contradictory behaviour, and none allows the prediction of the evolution of bubble size distributions along a pipe flow for a wide range of combinations of gas and liquid flow rates. The new model has been extensively studied in a simplified Test-Solver. Although this does not cover all details of a developing flow along the pipe, it allows, in contrast to a CFD code, a large number of variational calculations to investigate the influence of individual quantities and submodels. Coalescence and breakup cannot be considered separately from other phenomena and the models that reflect them: there are close interactions with the turbulence of the liquid phase and the momentum exchange between phases. Since the dissipation rate of turbulent kinetic energy is a direct input parameter for the new model, the turbulence modelling has been studied very carefully. To validate the model, a special experimental series for air-water flows was used, conducted at the TOPFLOW facility in an 8-meter long DN200 pipe. The data are characterized by high quality and were produced within the TOPFLOW-II project; the test series aims to provide a basis for the work presented here. The prediction of the evolution of the bubble size distribution along the pipe was improved significantly in comparison to the previous standard models for bubble coalescence and breakup implemented in CFX, although some quantitative discrepancies remain. The full model equations as well as an implementation as "User-FORTRAN" in CFX are available and can be used for further work on the simulation of poly-disperse bubbly flows.
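
Models of this type evolve a bubble size distribution through a population balance: coalescence moves bubbles to larger classes, breakup to smaller ones. The toy below uses just two size classes and invented rate constants, purely to illustrate the bookkeeping (in particular, that gas volume is conserved); it is not the paper's model, whose rates depend on turbulence dissipation.

```python
# Toy two-class population balance: small + small -> large (coalescence),
# large -> 2 small (breakup). Rate constants are invented.
def step(n_small, n_large, k_coal, k_break, dt):
    coal = k_coal * n_small * n_small   # coalescence events per unit time
    brk = k_break * n_large             # breakup events per unit time
    n_small += (-2 * coal + 2 * brk) * dt
    n_large += (coal - brk) * dt
    return n_small, n_large

ns, nl = 1000.0, 0.0                    # start with small bubbles only
for _ in range(1000):                   # integrate 10 s with dt = 0.01 s
    ns, nl = step(ns, nl, k_coal=1e-6, k_break=0.01, dt=0.01)
print(ns, nl, ns + 2 * nl)              # total gas volume stays ~1000
```

A practical solver uses many size classes (or method-of-moments closures) and couples the rates to the local turbulent dissipation, which is why the turbulence modelling mattered so much in the study above.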

  16. Development and validation of a 3D-printed model of the ostiomeatal complex and frontal sinus for endoscopic sinus surgery training.

    Science.gov (United States)

    Alrasheed, Abdulaziz S; Nguyen, Lily H P; Mongeau, Luc; Funnell, W Robert J; Tewfik, Marc A

    2017-08-01

    Endoscopic sinus surgery poses unique training challenges due to complex and variable anatomy, and the risk of major complications. We sought to create and provide validity evidence for a novel 3D-printed simulator of the nose and paranasal sinuses. Sinonasal computed tomography (CT) images of a patient were imported into 3D visualization software. Segmentation of bony and soft tissue structures was then performed. The model was printed using simulated bone and soft tissue materials. Rhinologists and otolaryngology residents completed 6 prespecified tasks including maxillary antrostomy and frontal recess dissection on the simulator. Participants evaluated the model using survey ratings based on a 5-point Likert scale. The average time to complete each task was calculated. Descriptive analysis was used to evaluate ratings, and thematic analysis was done for qualitative questions. A total of 20 participants (10 rhinologists and 10 otolaryngology residents) tested the model and answered the survey. Overall the participants felt that the simulator would be useful as a training/educational tool (4.6/5), and that it should be integrated as part of the rhinology training curriculum (4.5/5). The following responses were obtained: visual appearance 4.25/5; realism of materials 3.8/5; and surgical experience 3.9/5. The average time to complete each task was lower for the rhinologist group than for the residents. We describe the development and validation of a novel 3D-printed model for the training of endoscopic sinus surgery skills. Although participants found the simulator to be a useful training and educational tool, further model development could improve the outcome. © 2017 ARS-AAOA, LLC.

  17. DTU PMU Laboratory Development - Testing and Validation

    OpenAIRE

    Garcia-Valle, Rodrigo; Yang, Guang-Ya; Martin, Kenneth E.; Nielsen, Arne Hejde; Østergaard, Jacob

    2010-01-01

    This is a report of the results of phasor measurement unit (PMU) laboratory development and testing done at the Centre for Electric Technology (CET), Technical University of Denmark (DTU). Analysis of the PMU performance first required the development of tools to convert the DTU PMU data into IEEE standard, and the validation is done for the DTU-PMU via a validated commercial PMU. The commercial PMU has been tested from the authors' previous efforts, where the response can be expected to foll...

  18. Development, Parameterization, and Validation of a Visco-Plastic Material Model for Sand with DifferentLevels of Water Saturation

    Science.gov (United States)

    2009-01-01

    in essential physics of the tire–sand interactions. Towards that end, a simpler ribbed-tread tire model (described below) of the type often used for... i.e. the deflection and the contact area) on a rigid surface. The tire was modelled in the present work using a ribbed-tread tire model similar to... with material properties representing the composite behaviour through the carcass thickness. The tread-cap is constructed using linear, hybrid

  19. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹ TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs

  20. Theoretical development and validation of a Sharp Front model of the dewatering of a slurry by an absorbent substrate

    International Nuclear Information System (INIS)

    Collier, N C; Wilson, M A; Carter, M A; Hoff, W D; Hall, Christopher; Ball, R J; El-Turki, A; Allen, G C

    2007-01-01

    The absorption of water from a slurry into an absorbent substrate is analysed using Sharp Front theory. The analysis describes the relationship between the sorptivity S of the substrate, the desorptivity R of the slurry and the transfer sorptivity A between slurry and substrate, and leads to the relationship 1/A² = 1/R² + 1/S². Experimental data are presented which validate this equation for the practically important case of the absorption of water from soft mortar mixes by fired clay bricks. A unique feature of the experimental work is the measurement of the desorptivity of the mortars at a pressure equal to the wetting front capillary pressure of the clay brick substrate. Analysis of the experimental data also enables, for the first time, the calculation of the capillary potential at the slurry/substrate interface. The analysis has relevance to many aspects of ceramic and mineral processing, industrial filtration and construction engineering
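
The validated relation 1/A² = 1/R² + 1/S² combines the two resistances in a reciprocal-square form, so the transfer sorptivity A is always smaller than both R and S. A quick numerical check (with illustrative values, not the paper's measured ones):

```python
import math

def transfer_sorptivity(R, S):
    """Sharp Front relation: 1/A^2 = 1/R^2 + 1/S^2, solved for A."""
    return 1 / math.sqrt(1 / R**2 + 1 / S**2)

R, S = 0.8, 1.5   # illustrative desorptivity and sorptivity, mm/min^0.5
A = transfer_sorptivity(R, S)
print(round(A, 4))
```

Note the symmetry in R and S, and that A is dominated by the smaller of the two: whichever of the slurry's desorption or the substrate's absorption is slower limits the transfer.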

  1. First Steps to Develop and Validate a CFPD Model in Order to Support the Design of Nose-to-Brain Delivered Biopharmaceuticals.

    Science.gov (United States)

    Engelhardt, Lucas; Röhm, Martina; Mavoungou, Chrystelle; Schindowski, Katharina; Schafmeister, Annette; Simon, Ulrich

    2016-06-01

    Aerosol particle deposition in the human nasal cavity is of high interest in particular for intranasal central nervous system (CNS) drug delivery via the olfactory cleft. The objective of this study was the development and comparison of a numerical and experimental model to investigate various parameters for olfactory particle deposition within the complex anatomical nasal geometry. Based on a standardized nasal cavity, a computational fluid and particle dynamics (CFPD) model was developed that enables the variation and optimization of different parameters, which were validated by in vitro experiments using a constructed rapid-prototyped human nose model. For various flow rates (5 to 40 l/min) and particle sizes (1 to 10 μm), the airflow velocities, the calculated particle airflow patterns and the particle deposition correlated very well with the experiment. Particle deposition was investigated numerically by varying particle sizes at constant flow rate and vice versa assuming the particle size distribution of the used nebulizer. The developed CFPD model could be directly translated to the in vitro results. Hence, it can be applied for parameter screening and will contribute to the improvement of aerosol particle deposition at the olfactory cleft for CNS drug delivery in particular for biopharmaceuticals.

  2. Fuel temperature influence on the performance of a last generation common-rail diesel ballistic injector. Part II: 1D model development, validation and analysis

    International Nuclear Information System (INIS)

    Payri, R.; Salvador, F.J.; Carreres, M.; De la Morena, J.

    2016-01-01

    Highlights: • A 1D model of a solenoid common-rail ballistic injector is implemented in AMESim. • A detailed dimensional and a hydraulic characterization lead to a fair validation. • Fuel temperature influence on injector dynamics is assessed through 1D simulations. • Temperature impacts through changes in inlet orifice regime and viscous friction. • Cold fuel temperature leads to a slower injection opening due to high viscosity. - Abstract: A one-dimensional model of a solenoid-driven common-rail diesel injector has been developed in order to study the influence of fuel temperature on the injection process. The model has been implemented after a thorough characterization of the injector, both from the dimensional and the hydraulic point of view. In this sense, experimental tools for the determination of the geometry of the injector lines and orifices have been described in the paper, together with the hydraulic setup introduced to characterize the flow behaviour through the calibrated orifices. An extensive validation of the model has been performed by comparing the modelled mass flow rate against the experimental results introduced in the first part of the paper, which were performed for different engine-like operating conditions involving a wide range of fuel temperatures, injection pressures and energizing times. In that first part of the study, an important influence of the fuel temperature was reported, especially in terms of the dynamic behaviour of the injector, due to its ballistic nature. The results from the model have made it possible to explain and further extend the findings of the experimental study by analyzing key features of the injector dynamics, such as the pressure drop established in the control volume due to the control orifices performance or the forces due to viscous friction, also assessing their influence on the needle lift laws.

  3. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models.
    The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  4. Development and validation of a prediction model for long-term sickness absence based on occupational health survey variables

    DEFF Research Database (Denmark)

    Roelen, Corné; Thorsen, Sannie; Heymans, Martijn

    2018-01-01

    LTSA during follow-up. Results: The 15-predictor model was reduced to a 9-predictor model including age, gender, education, self-rated health, mental health, prior LTSA, work ability, emotional job demands, and recognition by the management. Discrimination by the 9-predictor model was significant (AUC...... population. Implications for rehabilitation Long-term sickness absence risk predictions would enable healthcare providers to refer high-risk employees to rehabilitation programs aimed at preventing or reducing work disability. A prediction model based on health survey variables discriminates between...... employees at high and low risk of long-term sickness absence, but discrimination was not practically useful. Health survey variables provide insufficient information to determine long-term sickness absence risk profiles. There is a need for new variables, based on the knowledge and experience...

  5. Development and validation of a numerical model for cross-section optimization of a multi-part probe for soft tissue intervention.

    Science.gov (United States)

    Frasson, L; Neubert, J; Reina, S; Oldfield, M; Davies, B L; Rodriguez Y Baena, F

    2010-01-01

    The popularity of minimally invasive surgical procedures is driving the development of novel, safer and more accurate surgical tools. In this context a multi-part probe for soft tissue surgery is being developed in the Mechatronics in Medicine Laboratory at Imperial College, London. This study reports an optimization procedure using finite element methods, for the identification of an interlock geometry able to limit the separation of the segments composing the multi-part probe. An optimal geometry was obtained and the corresponding three-dimensional finite element model validated experimentally. Simulation results are shown to be consistent with the physical experiments. The outcome of this study is an important step in the provision of a novel miniature steerable probe for surgery.

  6. Development and validation of a quasi-dimensional combustion model for SI engines fuelled by HCNG with variable hydrogen fractions

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Fanhua; Wang, Yu; Wang, Mingyue; Liu, Haiquan; Wang, Junjun; Ding, Shangfen; Zhao, Shuli [State Key Laboratory of Automobile Safety and Energy, Tsinghua University, Beijing 100084 (China)

    2008-09-15

Spark ignition engines fuelled by hydrogen-enriched compressed natural gas (HCNG) have many advantages over traditional gasoline, diesel and natural gas engines, especially in emission control. Experimental research has been conducted continuously to improve HCNG engine configurations and control strategies, aimed at making full use of this new fuel. With the same target, this work presents a predictive model that simulates the working cycle of HCNG engines and is applicable to variable hydrogen blending ratios. The fundamentals of the thermodynamic model, the turbulent flame propagation model and the related equations are introduced. Because the factor that most strongly influences the model's applicability across hydrogen blending ratios is the laminar flame speed, the treatment of the laminar burning velocity in the model is then described in more detail. After the model constants were determined by calibration, simulation results were compared with experimental cylinder pressure data for various hydrogen blending ratios, spark timings and equivalence ratios. The results show that simulation and experimental results match well except under extremely fuel-lean conditions, where incomplete combustion becomes severe. (author)
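The laminar-flame-speed treatment the abstract highlights can be sketched generically. The snippet below uses a power-law scaling of the Metghalchi-Keck form together with a crude linear blend of methane and hydrogen reference speeds; this is an illustrative assumption, not the correlation developed in the paper, and every coefficient is a placeholder.

```python
# Illustrative sketch only: a generic power-law laminar flame speed scaling
# (Metghalchi-Keck form), NOT the correlation fitted in the paper.

def laminar_flame_speed(s_ref, t_unburned, pressure,
                        t_ref=298.0, p_ref=101325.0, alpha=2.0, beta=-0.5):
    """Scale a reference flame speed s_ref [m/s] to unburned-gas conditions."""
    return s_ref * (t_unburned / t_ref) ** alpha * (pressure / p_ref) ** beta

def s_ref_hcng(h2_vol_fraction, s_ch4=0.38, s_h2=2.9):
    """Crude linear blend of stoichiometric CH4 and H2 reference speeds [m/s];
    real blending rules (e.g. Le Chatelier-type laws) are nonlinear."""
    return (1.0 - h2_vol_fraction) * s_ch4 + h2_vol_fraction * s_h2

# Example: 20% hydrogen by volume at compressed, preheated engine conditions.
s = laminar_flame_speed(s_ref_hcng(0.2), t_unburned=600.0, pressure=10e5)
```

Raising the hydrogen fraction or the unburned-gas temperature raises the flame speed while higher pressure lowers it, which is why the hydrogen blending ratio enters the turbulent flame propagation model through this laminar term.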

  7. Towards a viscoelastic model for the unfused midpalatal suture: development and validation using the midsagittal suture in New Zealand white rabbits.

    Science.gov (United States)

    Romanyk, D L; Liu, S S; Lipsett, M G; Toogood, R W; Lagravère, M O; Major, P W; Carey, J P

    2013-06-21

Maxillary expansion treatment is a commonly used procedure by orthodontists to widen a patient's upper jaw. As this is typically performed in adolescent patients, the midpalatal suture, the connective tissue adjoining the two maxilla halves, remains unfused. Studies that have investigated patient response to expansion treatment, generally through finite element analysis, have considered this suture to behave in a linear elastic manner or have left it vacant. The purpose of the study presented here was to develop a model that could represent the midpalatal suture's viscoelastic behavior. Quasilinear viscoelastic, modified superposition, Schapery's, and Burgers modeling approaches were all considered. Raw data from a previously published study using New Zealand White rabbits were utilized for model parameter estimation and validation. In that study, Sentalloy(®) coil springs at load levels of 0.49N (50g), 0.98N (100g), and 1.96N (200g) were used to widen the midsagittal suture of live rabbits over a period of 6 weeks. Evaluation was based on a model's ability to represent the experimental data well over all three load sets. Ideally, a single set of model constants could be used to represent data over all loads tested. Upon completion of the analysis it was found that the modified superposition method was able to replicate the experimental data within one standard deviation of the means using a single set of constants for all loads. Future work should focus on model improvement as well as prediction of treatment outcomes. Copyright © 2013 Elsevier Ltd. All rights reserved.
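Of the four candidate formulations the abstract lists, the Burgers model has a closed-form creep response that is easy to sketch. The parameter values below are hypothetical placeholders, not the constants fitted to the rabbit suture data.

```python
import math

def burgers_creep_strain(sigma0, t, e1, eta1, e2, eta2):
    """Creep strain of a four-parameter Burgers model (Maxwell and Kelvin-Voigt
    elements in series) under a constant stress sigma0 applied at t = 0."""
    elastic = sigma0 / e1                                        # instantaneous spring
    viscous_flow = sigma0 * t / eta1                             # Maxwell dashpot
    delayed = (sigma0 / e2) * (1.0 - math.exp(-e2 * t / eta2))   # Kelvin-Voigt element
    return elastic + viscous_flow + delayed

# Hypothetical constants; strain grows monotonically with time under load.
strain_day1 = burgers_creep_strain(1.0, t=1.0, e1=2.0, eta1=50.0, e2=4.0, eta2=8.0)
```

Fitting any of the candidate forms then reduces to least-squares estimation of such constants against the measured suture widening, which is how a single parameter set can be tested across all three spring loads.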

  8. Three-dimensional all-speed CFD code for safety analysis of nuclear reactor containment: Status of GASFLOW parallelization, model development, validation and application

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Jianjun, E-mail: jianjun.xiao@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Travis, John R., E-mail: jack_travis@comcast.com [Engineering and Scientific Software Inc., 3010 Old Pecos Trail, Santa Fe, NM 87505 (United States); Royl, Peter, E-mail: peter.royl@partner.kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Necker, Gottfried, E-mail: gottfried.necker@partner.kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Svishchev, Anatoly, E-mail: anatoly.svishchev@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Jordan, Thomas, E-mail: thomas.jordan@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany)

    2016-05-15

Highlights: • 3-D scalable semi-implicit pressure-based CFD code for containment safety analysis. • Robust solution algorithm valid for all-speed flows. • Well validated and widely used CFD code for hydrogen safety analysis. • Code applied in various types of nuclear reactor containments. • Parallelization enables high-fidelity models in large-scale containment simulations. - Abstract: GASFLOW is a three-dimensional semi-implicit all-speed CFD code which can be used to predict fluid dynamics, chemical kinetics, heat and mass transfer, aerosol transport and other related phenomena involved in postulated accidents in nuclear reactor containments. The main purpose of this paper is to give a brief review of recent GASFLOW code development, validation and applications in the field of nuclear safety. The GASFLOW code has been validated against international experimental benchmarks and widely applied to hydrogen safety analysis in various types of nuclear power plants in European and Asian countries, as summarized in this paper. Furthermore, four benchmark tests, a lid-driven cavity flow, a low-Mach-number jet flow, a 1-D shock tube and supersonic flow over a forward-facing step, are presented in order to demonstrate the accuracy and wide-ranging capability of the ICE’d ALE solution algorithm for all-speed flows. GASFLOW has been successfully parallelized using the paradigms of Message Passing Interface (MPI) and domain decomposition. The parallel version, GASFLOW-MPI, adds great value to large-scale containment simulations by enabling high-fidelity models, including more geometric details and more complex physics. It will help nuclear safety engineers to better understand the hydrogen-related physical phenomena during a severe accident, to optimize the design of hydrogen risk mitigation systems and to fulfil the licensing requirements of the nuclear regulatory authorities. GASFLOW-MPI is targeting a high

  9. Blood pressure and blood flow variation during postural change from sitting to standing: model development and validation

    DEFF Research Database (Denmark)

    Olufsen, M.S.; Ottesen, Johnny T.; Tran, H.T.

    2005-01-01

    Short-term cardiovascular responses to postural change from sitting to standing involve complex interactions between the autonomic nervous system, which regulates blood pressure, and cerebral autoregulation, which maintains cerebral perfusion. We present a mathematical model that can predict...... dynamic changes in beat-to-beat arterial blood pressure and middle cerebral artery blood flow velocity during postural change from sitting to standing. Our cardiovascular model utilizes 11 compartments to describe blood pressure, blood flow, compliance, and resistance in the heart and systemic circulation....... To include dynamics due to the pulsatile nature of blood pressure and blood flow, resistances in the large systemic arteries are modeled using nonlinear functions of pressure. A physiologically based submodel is used to describe effects of gravity on venous blood pooling during postural change. Two types...

  10. Development and Validation of a Deep Neural Network Model for Prediction of Postoperative In-hospital Mortality.

    Science.gov (United States)

    Lee, Christine K; Hofer, Ira; Gabel, Eilon; Baldi, Pierre; Cannesson, Maxime

    2018-04-17

The authors tested the hypothesis that deep neural networks trained on intraoperative features can predict postoperative in-hospital mortality. The data used to train and validate the algorithm consist of 59,985 patients with 87 features extracted at the end of surgery. Feed-forward networks with a logistic output were trained using stochastic gradient descent with momentum. The deep neural networks were trained on 80% of the data, with 20% reserved for testing. The authors assessed improvement of the deep neural network by adding American Society of Anesthesiologists (ASA) Physical Status Classification and robustness of the deep neural network to a reduced feature set. The networks were then compared to ASA Physical Status, logistic regression, and other published clinical scores including the Surgical Apgar, Preoperative Score to Predict Postoperative Mortality, Risk Quantification Index, and the Risk Stratification Index. In-hospital mortality in the training and test sets was 0.81% and 0.73%, respectively. The deep neural network with a reduced feature set and ASA Physical Status classification had the highest area under the receiver operating characteristics curve, 0.91 (95% CI, 0.88 to 0.93). The highest logistic regression area under the curve was found with a reduced feature set and ASA Physical Status (0.90, 95% CI, 0.87 to 0.93). The Risk Stratification Index had the highest area under the receiver operating characteristics curve, at 0.97 (95% CI, 0.94 to 0.99). Deep neural networks can predict in-hospital mortality based on automatically extractable intraoperative data, but are not (yet) superior to existing methods.
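The training setup described (a feed-forward network with a logistic output, optimized by gradient descent with momentum) can be sketched on toy data. Everything below is illustrative: the synthetic features, the tiny architecture and the hyperparameters are assumptions, the update is full-batch for brevity, and nothing reproduces the authors' 87-feature model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy stand-in for intraoperative features: 200 cases, 5 features.
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200) > 0).astype(float)

# One hidden layer; the logistic output yields a mortality-style probability.
W1 = rng.normal(scale=0.3, size=(5, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.3, size=(8, 1)); b2 = np.zeros(1)
velocity = [np.zeros_like(p) for p in (W1, b1, W2, b2)]
lr, momentum = 0.1, 0.9

for _ in range(200):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()         # predicted probability
    d_out = ((p - y) / len(y))[:, None]      # cross-entropy gradient wrt output logits
    d_hidden = (d_out @ W2.T) * (1.0 - h**2)
    grads = [X.T @ d_hidden, d_hidden.sum(0), h.T @ d_out, d_out.sum(0)]
    for i, (param, g) in enumerate(zip((W1, b1, W2, b2), grads)):
        velocity[i] = momentum * velocity[i] - lr * g   # momentum update
        param += velocity[i]

train_accuracy = float(((p > 0.5) == y).mean())
```

The paper evaluates discrimination with the area under the ROC curve rather than accuracy; the accuracy here is only a smoke check that the toy optimizer learns.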

  11. Psychological determinants of paying attention to eco-labels in purchase decisions: Model development and multinational validation

    DEFF Research Database (Denmark)

    Thøgersen, John

    Environmental labels are useful from an environmental policy perspective only if they are noticed by the consumer in the shopping situation and next - and related - understood, trusted, and valued as a tool for decision-making. In this paper, a psychological model explaining variations in consumer...

  12. Development and Validation of Spatially Explicit Habitat Models for Cavity-nesting Birds in Fishlake National Forest, Utah

    Science.gov (United States)

Randall A. Schultz, Jr.; Thomas C. Edwards, Jr.; Gretchen G. Moisen; Tracey S. Frescino

    2005-01-01

    The ability of USDA Forest Service Forest Inventory and Analysis (FIA) generated spatial products to increase the predictive accuracy of spatially explicit, macroscale habitat models was examined for nest-site selection by cavity-nesting birds in Fishlake National Forest, Utah. One FIA-derived variable (percent basal area of aspen trees) was significant in the habitat...

  13. Development and Validation of a Simulation Model for the Temperature Field during High-Frequency Heating of Wood

    Directory of Open Access Journals (Sweden)

    Haojie Chai

    2018-06-01

Full Text Available When high-frequency heating technology is applied to wood drying, control of the material temperature affects both drying speed and drying quality. Therefore, research on the heat transfer mechanism of high-frequency heating of wood is of great significance. To study this mechanism, the finite element method was used to establish and solve a wood high-frequency heating model, and experimental verification was carried out. With a decrease in moisture content, the heating rate decreased, then increased, and then decreased again. There was no obvious linear relationship between the moisture content and heating rate; the simulation accuracy of the heating rate was higher in the early and later drying stages and slightly lower near the fiber saturation point. For the central section temperature distribution, the simulation and actual measurement results matched poorly in the early drying stage because the model did not fully consider the differences in the moisture content distribution of the actual test materials. In the later drying stage, the moisture content distribution of the test materials became uniform, which was consistent with the model assumptions. Considering the changes in heating rate and temperature distribution, the accuracy of the model is good below the fiber saturation point, and it can be used to predict the high-frequency heating process of wood.

  14. Highly sensitive measurements of disease progression in rare disorders: Developing and validating a multimodal model of retinal degeneration in Stargardt disease.

    Science.gov (United States)

    Lambertus, Stanley; Bax, Nathalie M; Fakin, Ana; Groenewoud, Joannes M M; Klevering, B Jeroen; Moore, Anthony T; Michaelides, Michel; Webster, Andrew R; van der Wilt, Gert Jan; Hoyng, Carel B

    2017-01-01

    Each inherited retinal disorder is rare, but together, they affect millions of people worldwide. No treatment is currently available for these blinding diseases, but promising new options-including gene therapy-are emerging. Arguably, the most prevalent retinal dystrophy is Stargardt disease. In each case, the specific combination of ABCA4 variants (> 900 identified to date) and modifying factors is virtually unique. It accounts for the vast phenotypic heterogeneity including variable rates of functional and structural progression, thereby potentially limiting the ability of phase I/II clinical trials to assess efficacy of novel therapies with few patients. To accommodate this problem, we developed and validated a sensitive and reliable composite clinical trial endpoint for disease progression based on structural measurements of retinal degeneration. We used longitudinal data from early-onset Stargardt patients from the Netherlands (development cohort, n = 14) and the United Kingdom (external validation cohort, n = 18). The composite endpoint was derived from best-corrected visual acuity, fundus autofluorescence, and spectral-domain optical coherence tomography. Weighting optimization techniques excluded visual acuity from the composite endpoint. After optimization, the endpoint outperformed each univariable outcome, and showed an average progression of 0.41° retinal eccentricity per year (95% confidence interval, 0.30-0.52). Comparing with actual longitudinal values, the model accurately predicted progression (R2, 0.904). These properties were largely preserved in the validation cohort (0.43°/year [0.33-0.53]; prediction: R2, 0.872). We subsequently ran a two-year trial simulation with the composite endpoint, which detected a 25% decrease in disease progression with 80% statistical power using only 14 patients. These results suggest that a multimodal endpoint, reflecting structural macular changes, provides a sensitive measurement of disease progression in

  15. Highly sensitive measurements of disease progression in rare disorders: Developing and validating a multimodal model of retinal degeneration in Stargardt disease.

    Directory of Open Access Journals (Sweden)

    Stanley Lambertus

Full Text Available Each inherited retinal disorder is rare, but together, they affect millions of people worldwide. No treatment is currently available for these blinding diseases, but promising new options-including gene therapy-are emerging. Arguably, the most prevalent retinal dystrophy is Stargardt disease. In each case, the specific combination of ABCA4 variants (> 900 identified to date) and modifying factors is virtually unique. It accounts for the vast phenotypic heterogeneity including variable rates of functional and structural progression, thereby potentially limiting the ability of phase I/II clinical trials to assess efficacy of novel therapies with few patients. To accommodate this problem, we developed and validated a sensitive and reliable composite clinical trial endpoint for disease progression based on structural measurements of retinal degeneration. We used longitudinal data from early-onset Stargardt patients from the Netherlands (development cohort, n = 14) and the United Kingdom (external validation cohort, n = 18). The composite endpoint was derived from best-corrected visual acuity, fundus autofluorescence, and spectral-domain optical coherence tomography. Weighting optimization techniques excluded visual acuity from the composite endpoint. After optimization, the endpoint outperformed each univariable outcome, and showed an average progression of 0.41° retinal eccentricity per year (95% confidence interval, 0.30-0.52). Comparing with actual longitudinal values, the model accurately predicted progression (R2, 0.904). These properties were largely preserved in the validation cohort (0.43°/year [0.33-0.53]; prediction: R2, 0.872). We subsequently ran a two-year trial simulation with the composite endpoint, which detected a 25% decrease in disease progression with 80% statistical power using only 14 patients. These results suggest that a multimodal endpoint, reflecting structural macular changes, provides a sensitive measurement of disease

  16. Highly sensitive measurements of disease progression in rare disorders: Developing and validating a multimodal model of retinal degeneration in Stargardt disease

    Science.gov (United States)

    Bax, Nathalie M.; Fakin, Ana; Groenewoud, Joannes M. M.; Klevering, B. Jeroen; Moore, Anthony T.; Michaelides, Michel; Webster, Andrew R.; van der Wilt, Gert Jan; Hoyng, Carel B.

    2017-01-01

    Background Each inherited retinal disorder is rare, but together, they affect millions of people worldwide. No treatment is currently available for these blinding diseases, but promising new options—including gene therapy—are emerging. Arguably, the most prevalent retinal dystrophy is Stargardt disease. In each case, the specific combination of ABCA4 variants (> 900 identified to date) and modifying factors is virtually unique. It accounts for the vast phenotypic heterogeneity including variable rates of functional and structural progression, thereby potentially limiting the ability of phase I/II clinical trials to assess efficacy of novel therapies with few patients. To accommodate this problem, we developed and validated a sensitive and reliable composite clinical trial endpoint for disease progression based on structural measurements of retinal degeneration. Methods and findings We used longitudinal data from early-onset Stargardt patients from the Netherlands (development cohort, n = 14) and the United Kingdom (external validation cohort, n = 18). The composite endpoint was derived from best-corrected visual acuity, fundus autofluorescence, and spectral-domain optical coherence tomography. Weighting optimization techniques excluded visual acuity from the composite endpoint. After optimization, the endpoint outperformed each univariable outcome, and showed an average progression of 0.41° retinal eccentricity per year (95% confidence interval, 0.30–0.52). Comparing with actual longitudinal values, the model accurately predicted progression (R2, 0.904). These properties were largely preserved in the validation cohort (0.43°/year [0.33–0.53]; prediction: R2, 0.872). We subsequently ran a two-year trial simulation with the composite endpoint, which detected a 25% decrease in disease progression with 80% statistical power using only 14 patients. Conclusions These results suggest that a multimodal endpoint, reflecting structural macular changes, provides a

  17. Tritium and hydrogen behaviour at Phenix power plant. Application to development and validation of KUMAR type models

    International Nuclear Information System (INIS)

    Tibi, A.; Misraki, J.; Feron, D.

    1984-04-01

Experiments at the Phenix reactor confirmed the suitability of the KUMAR model for predicting the behaviour of hydrogen and tritium, and hence for predicting the tritium distribution at the Super Phenix reactor: calculation of the tritium content of a regenerated secondary cold trap, behaviour of hydrogen during power operation with the primary cold trap deliberately out of service, and estimation of the tritium and hydrogen sources and permeation transfer ratios [fr

  18. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software that expresses scientific models.
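The distinction the article draws between ordinary unit tests and model validation tests can be illustrated side by side. The Nernst-style model, the "experimental" value, and the tolerance below are all hypothetical, chosen only to show the two test categories, not taken from OpenWorm.

```python
import math
import unittest

def resting_potential_mv(k_in=140.0, k_out=5.0):
    """Toy Nernst-equation model of a K+-dominated resting potential (~37 C)."""
    return -26.7 * math.log(k_in / k_out)

class TestRestingPotential(unittest.TestCase):
    def test_unit_behavior(self):
        # Unit test: checks a property of the *code* itself - raising external
        # K+ must depolarize the prediction, with no reference to lab data.
        self.assertGreater(resting_potential_mv(k_out=10.0),
                           resting_potential_mv(k_out=5.0))

    def test_model_validation(self):
        # Model validation test: checks the *science* - the prediction must
        # match a (placeholder) experimental value within a stated tolerance.
        self.assertAlmostEqual(resting_potential_mv(), -89.0, delta=5.0)

if __name__ == "__main__":
    unittest.main()
```

A unit test keeps passing when the science changes but the code is sound; a model validation test fails precisely when the model drifts away from the data it is meant to reproduce.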

  19. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    Directory of Open Access Journals (Sweden)

    Daan Nieboer

Full Text Available External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures for judging changes in the c-statistic from the development to the external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
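The c-statistic at the centre of this comparison can be computed directly from its definition: the proportion of event/non-event pairs in which the event received the higher predicted risk. The sketch below is generic; the risks and outcomes are illustrative.

```python
import itertools

def c_statistic(predicted_risk, outcome):
    """Concordance (c-statistic / AUC) over all event vs non-event pairs."""
    pairs = concordant = tied = 0
    for (p_i, y_i), (p_j, y_j) in itertools.combinations(
            zip(predicted_risk, outcome), 2):
        if y_i == y_j:
            continue                      # same outcome: pair is uninformative
        pairs += 1
        if p_i == p_j:
            tied += 1                     # tied risks count as half concordant
        elif (p_i > p_j) == (y_i > y_j):
            concordant += 1               # the higher risk went to the event
    return (concordant + 0.5 * tied) / pairs

# External validation: apply the development-set model *unchanged* to new data.
validation_auc = c_statistic([0.9, 0.8, 0.35, 0.3, 0.1], [1, 0, 1, 0, 0])
```

A drop in the c-statistic at validation can reflect either a narrower case-mix or miscalibrated coefficients, which is exactly the confounding the abstract warns must be disentangled.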

  20. Mathematical modeling and validation of growth of Salmonella Enteritidis and background microorganisms in potato salad – one-step kinetic analysis and model development

    Science.gov (United States)

    This study was conducted to examine the growth of Salmonella Enteritidis (SE) in potato salad caused by cross-contamination and temperature abuse, and develop mathematical models to predict its growth. The growth of SE was investigated under constant temperature conditions (8, 10, 15, 20, 25, 30, a...

  1. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. These methods can therefore only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
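The ARR-based inference the record describes can be sketched in miniature. Under single-fault and exoneration assumptions, a faulty sensor is one that appears in every violated redundancy relation and in no satisfied one; the three sensors and two relations below are hypothetical, not NASA's system model.

```python
# Illustrative sketch of fault isolation from analytical redundancy relations
# (ARRs): each ARR is a residual over a subset of sensors that should be near
# zero when those sensors are healthy.

def isolate_faulty_sensors(arrs, readings, tol=1e-3):
    violated, satisfied = [], []
    for residual, support in arrs:
        bucket = violated if abs(residual(readings)) > tol else satisfied
        bucket.append(set(support))
    if not violated:
        return set()                        # all residuals ~0: no fault inferred
    candidates = set.intersection(*violated)
    for cleared in satisfied:
        candidates -= cleared               # exoneration of consistent sensors
    return candidates

arrs = [
    (lambda r: r["a"] + r["b"] - r["c"], ("a", "b", "c")),  # e.g. a mass balance
    (lambda r: r["a"] - r["b"], ("a", "b")),                # duplicated sensors
]
suspects = isolate_faulty_sensors(arrs, {"a": 1.0, "b": 1.0, "c": 2.5})
```

Here the balance relation fires while the duplicate-sensor relation holds, so sensor "c" is the only logically consistent suspect, found without any prior failure probabilities.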

  2. Convergence of regenerative medicine and synthetic biology to develop standardized and validated models of human diseases with clinical relevance.

    Science.gov (United States)

    Hutmacher, Dietmar Werner; Holzapfel, Boris Michael; De-Juan-Pardo, Elena Maria; Pereira, Brooke Anne; Ellem, Stuart John; Loessner, Daniela; Risbridger, Gail Petuna

    2015-12-01

    In order to progress beyond currently available medical devices and implants, the concept of tissue engineering has moved into the centre of biomedical research worldwide. The aim of this approach is not to replace damaged tissue with an implant or device but rather to prompt the patient's own tissue to enact a regenerative response by using a tissue-engineered construct to assemble new functional and healthy tissue. More recently, it has been suggested that the combination of Synthetic Biology and translational tissue-engineering techniques could enhance the field of personalized medicine, not only from a regenerative medicine perspective, but also to provide frontier technologies for building and transforming the research landscape in the field of in vitro and in vivo disease models. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  3. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam University; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.
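The re-sampling step mentioned above, propagating parameter uncertainty through a forward calculation, can be sketched with plain Monte Carlo. The single-degree-of-freedom frequency model and its parameter distributions below are illustrative assumptions, not taken from the report.

```python
import math
import random
import statistics

def natural_frequency_hz(stiffness, mass):
    """Forward calculation: natural frequency of a 1-DOF spring-mass system."""
    return math.sqrt(stiffness / mass) / (2.0 * math.pi)

random.seed(1)
samples = []
for _ in range(10_000):
    k = random.gauss(1.0e6, 5.0e4)    # stiffness [N/m], ~5% uncertainty (assumed)
    m = random.gauss(10.0, 0.2)       # mass [kg], ~2% uncertainty (assumed)
    samples.append(natural_frequency_hz(k, m))

mean_f = statistics.mean(samples)     # propagated mean frequency
sd_f = statistics.stdev(samples)      # propagated uncertainty
```

The resulting distribution of outputs, rather than a single deterministic prediction, is what gets compared against measured features in a validation assessment.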

  4. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation is derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling makes it possible to give a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation is sound when the source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach to model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. The properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  5. Development and validation of a risk model for prediction of hazardous alcohol consumption in general practice attendees: the predictAL study.

    Science.gov (United States)

    King, Michael; Marston, Louise; Švab, Igor; Maaroos, Heidi-Ingrid; Geerlings, Mirjam I; Xavier, Miguel; Benjamin, Vicente; Torres-Gonzalez, Francisco; Bellon-Saameno, Juan Angel; Rotar, Danica; Aluoja, Anu; Saldivia, Sandra; Correa, Bernardo; Nazareth, Irwin

    2011-01-01

Little is known about the risk of progression to hazardous alcohol use in people currently drinking at safe limits. We aimed to develop a prediction model (predictAL) for the development of hazardous drinking in safe drinkers. A prospective cohort study of adult general practice attendees in six European countries and Chile was followed up over 6 months. We recruited 10,045 attendees from April 2003 to February 2005. 6193 European and 2462 Chilean attendees recorded AUDIT scores below 8 in men and 5 in women at recruitment and were used in modelling risk. 38 risk factors were measured to construct a risk model for the development of hazardous drinking using stepwise logistic regression. The model was corrected for overfitting and tested in an external population. The main outcome was hazardous drinking defined by an AUDIT score ≥8 in men and ≥5 in women. 69.0% of attendees were recruited, of whom 89.5% participated again after six months. The risk factors in the final predictAL model were sex, age, country, baseline AUDIT score, panic syndrome and lifetime alcohol problem. The predictAL model's average c-index across all six European countries was 0.839 (95% CI 0.805, 0.873). The Hedges' g effect size for the difference in log odds of predicted probability between safe drinkers in Europe who subsequently developed hazardous alcohol use and those who did not was 1.38 (95% CI 1.25, 1.51). External validation of the algorithm in Chilean safe drinkers resulted in a c-index of 0.781 (95% CI 0.717, 0.846) and Hedges' g of 0.68 (95% CI 0.57, 0.78). The predictAL risk model for development of hazardous consumption in safe drinkers compares favourably with risk algorithms for disorders in other medical settings and can be a useful first step in prevention of alcohol misuse.
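The effect size reported here is a small-sample-corrected standardized mean difference. A minimal implementation follows, with illustrative numbers standing in for the study's predicted log odds in the two groups.

```python
import math
import statistics

def hedges_g(x, y):
    """Standardized mean difference with Hedges' small-sample bias correction."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * statistics.variance(x) +
                  (ny - 1) * statistics.variance(y)) / (nx + ny - 2)
    cohens_d = (statistics.mean(x) - statistics.mean(y)) / math.sqrt(pooled_var)
    correction = 1.0 - 3.0 / (4.0 * (nx + ny) - 9.0)   # common approximation
    return cohens_d * correction

# Illustrative: predicted log odds for those who did vs did not progress.
g = hedges_g([2.1, 1.8, 2.5, 2.0], [1.0, 0.7, 1.3, 1.1])
```

A larger g means the model's predicted probabilities separate the two groups more cleanly, complementing the c-index as a measure of discrimination.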

  6. Development and validation of a risk model for prediction of hazardous alcohol consumption in general practice attendees: the predictAL study.

    Directory of Open Access Journals (Sweden)

    Michael King

    Full Text Available Little is known about the risk of progression to hazardous alcohol use in people currently drinking at safe limits. We aimed to develop a prediction model (predictAL) for the development of hazardous drinking in safe drinkers. A prospective cohort study of adult general practice attendees in six European countries and Chile followed up over 6 months. We recruited 10,045 attendees between April 2003 and February 2005. 6193 European and 2462 Chilean attendees recorded AUDIT scores below 8 in men and 5 in women at recruitment and were used in modelling risk. 38 risk factors were measured to construct a risk model for the development of hazardous drinking using stepwise logistic regression. The model was corrected for overfitting and tested in an external population. The main outcome was hazardous drinking defined by an AUDIT score ≥8 in men and ≥5 in women. 69.0% of attendees were recruited, of whom 89.5% participated again after six months. The risk factors in the final predictAL model were sex, age, country, baseline AUDIT score, panic syndrome and lifetime alcohol problem. The predictAL model's average c-index across all six European countries was 0.839 (95% CI 0.805, 0.873). The Hedges' g effect size for the difference in log odds of predicted probability between safe drinkers in Europe who subsequently developed hazardous alcohol use and those who did not was 1.38 (95% CI 1.25, 1.51). External validation of the algorithm in Chilean safe drinkers resulted in a c-index of 0.781 (95% CI 0.717, 0.846) and Hedges' g of 0.68 (95% CI 0.57, 0.78). The predictAL risk model for development of hazardous consumption in safe drinkers compares favourably with risk algorithms for disorders in other medical settings and can be a useful first step in prevention of alcohol misuse.

  7. Data for FUMEX: Results from fuel behavior studies at the OECD Halden Reactor Project for model validation and development

    International Nuclear Information System (INIS)

    Wiesenack, W.

    1997-01-01

    Investigations of phenomena associated with extended or high burn-up are an important part of the fuel and materials testing programme carried out at the OECD Halden Reactor Project. The in-core studies comprise long-term fuel rod behavior as well as the response to power ramps. Performance is assessed through measurements of fuel centre temperature, rod pressure, elongation of cladding and fuel stack, and cladding diameter changes obtained during full power reactor operation. Data from fuel behavior studies at the OECD Halden Reactor Project, provided for the IAEA co-ordinated research programme FUMEX, are used to elucidate short- and long-term developments of fuel behavior. The examples comprise: fuel conductivity degradation manifested as a gradual temperature increase with burn-up; the influence of a combination of small gap/high fission gas release on fuel centre temperature (situation at high burn-up); fission gas release during normal operation and power ramps, and the possibility of a burn-up enhancement; PCMI reflected by cladding elongation, also for the case of a nominally open gap, and the change of interaction onset with burn-up. (author). 10 refs, 9 figs, 1 tab

  8. Data for FUMEX: Results from fuel behavior studies at the OECD Halden Reactor Project for model validation and development

    Energy Technology Data Exchange (ETDEWEB)

    Wiesenack, W [Institutt for Energiteknikk, Halden (Norway). OECD Halden Reaktor Projekt

    1997-08-01

    Investigations of phenomena associated with extended or high burn-up are an important part of the fuel and materials testing programme carried out at the OECD Halden Reactor Project. The in-core studies comprise long-term fuel rod behavior as well as the response to power ramps. Performance is assessed through measurements of fuel centre temperature, rod pressure, elongation of cladding and fuel stack, and cladding diameter changes obtained during full power reactor operation. Data from fuel behavior studies at the OECD Halden Reactor Project, provided for the IAEA co-ordinated research programme FUMEX, are used to elucidate short- and long-term developments of fuel behavior. The examples comprise: fuel conductivity degradation manifested as a gradual temperature increase with burn-up; the influence of a combination of small gap/high fission gas release on fuel centre temperature (situation at high burn-up); fission gas release during normal operation and power ramps, and the possibility of a burn-up enhancement; PCMI reflected by cladding elongation, also for the case of a nominally open gap, and the change of interaction onset with burn-up. (author). 10 refs, 9 figs, 1 tab.

  9. A broad view of model validation

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1989-10-01

    The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as much as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing, step by step, the modeling process and bringing out the need to validate every step of this process. This model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models, as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature for public scrutiny is also emphasized. 16 refs

  10. Establishing model credibility involves more than validation

    International Nuclear Information System (INIS)

    Kirchner, T.

    1991-01-01

    One widely used definition of validation is the quantitative test of the performance of a model through the comparison of model predictions to independent sets of observations from the system being simulated. The ability to show that the model predictions compare well with observations is often thought to be the most rigorous test that can be used to establish credibility for a model in the scientific community. However, such tests are only part of the process used to establish credibility, and in some cases may be either unnecessary or misleading. Naylor and Finger extended the concept of validation to include the establishment of validity for the postulates embodied in the model and the test of assumptions used to select postulates for the model. Validity of postulates is established through concurrence by experts in the field of study that the mathematical or conceptual model contains the structural components and mathematical relationships necessary to adequately represent the system with respect to the goals for the model. This extended definition of validation provides for consideration of the structure of the model, not just its performance, in establishing credibility. Evaluation of a simulation model should establish the correctness of the code and the efficacy of the model within its domain of applicability. (24 refs., 6 figs.)

  11. Validation of a Computational Model for the SLS Core Stage Oxygen Tank Diffuser Concept and the Low Profile Diffuser - An Advanced Development Design for the SLS

    Science.gov (United States)

    Brodnick, Jacob; Richardson, Brian; Ramachandran, Narayanan

    2015-01-01

    The Low Profile Diffuser (LPD) project originated as an award from the Marshall Space Flight Center (MSFC) Advanced Development (ADO) office to the Main Propulsion Systems Branch (ER22). The task was created to develop and test an LPD concept that could produce comparable performance to a larger, traditionally designed, ullage gas diffuser while occupying a smaller volume envelope. Historically, ullage gas diffusers have been large, bulky devices that occupy a significant portion of the propellant tank, decreasing the tank volume available for propellant. Ullage pressurization of spacecraft propellant tanks is required to prevent boil-off of cryogenic propellants and to provide a positive pressure for propellant extraction. To achieve this, ullage gas diffusers must slow hot, high-pressure gas entering a propellant tank from supersonic speeds to only a few meters per second. Decreasing the incoming gas velocity is typically accomplished through expansion to larger areas within the diffuser, which has traditionally led to large diffuser lengths. The Fluid Dynamics Branch (ER42) developed and applied advanced Computational Fluid Dynamics (CFD) analysis methods in order to mature the LPD design from an initial concept to an optimized test prototype and to provide extremely accurate pre-test predictions of diffuser performance. Additionally, the diffuser concept for the Core Stage of the Space Launch System (SLS) was analyzed in a short amount of time to guide test data collection efforts for the qualification of the device. CFD analysis of the SLS diffuser design provided new insights into the functioning of the device and was qualitatively validated against hot wire anemometry of the exterior flow field. Rigorous data analysis of the measurements was performed on static and dynamic pressure data, data from two microphones, accelerometers and hot wire anemometry with automated traverse. Feasibility of the LPD concept and validation of the computational model were

  12. Wire-mesh sensors: an experimental tool for two-phase CFD model development and code validation

    Energy Technology Data Exchange (ETDEWEB)

    Prasser, H.M. [Forschungszentrum Rossendorf e.V., Dresden (Germany)

    2004-07-01

    Full text of publication follows: The Institute of Safety Research of the Forschungszentrum Rossendorf, Germany, has developed electrode-mesh sensors, which allow the measurement of the electrical conductivity distribution in a flow duct. This can be used either for the detection of the gaseous phase in a gas-liquid flow or for mixing studies in single-phase flow, when the components have different electric conductivities. Two grids of crossing wires are placed into the flow closely behind each other. The wires of the first plane (transmitter plane) are supplied with pulses of a driving voltage in successive order. The data acquisition is done by measuring the electrical currents arriving at the second grid (receiver wires). After the last transmitter electrode has been activated, a two-dimensional matrix is available that reflects the conductivities at the crossing points of the electrodes of the two grids. Sequences of these 2D distributions are recorded at a rate of up to 10 kHz. Due to the high measuring rate, each bubble is mapped in several successive instantaneous frames. This makes it possible to obtain bubble size distributions as well as bubble-size-resolved gas fraction profiles, besides the visualisation and the calculation of profiles of the time-averaged void fraction. Two sensors placed behind each other can furthermore be used for bubble velocity measurements using cross-correlation techniques. Sensors with three layers of electrode grids can be used for the measurement of the velocity of individual bubbles. The sensor is widely used to study the evolution of the flow pattern in an upwards air-water flow. The experiments aim at closure equations describing forces acting on bubbles as well as coalescence and fragmentation frequencies for the implementation in CFD codes. The largest sensor used until now has a circular measuring cross-section of about 200 mm diameter and is equipped with two grids of 64 wires.
Therefore, the spatial resolution is 3 mm, the measuring
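
The cross-correlation velocity measurement described in this record can be sketched as follows. The 10 kHz frame rate matches the figure quoted above, but the sensor spacing and the synthetic signals are invented for illustration:

```python
import numpy as np

# Hypothetical parameters (not from the record): 10 kHz frame rate,
# 40 mm axial spacing between the two wire-mesh sensors.
FRAME_RATE_HZ = 10_000
SENSOR_SPACING_M = 0.040

def transit_time_velocity(upstream, downstream, frame_rate, spacing):
    """Estimate bubble velocity from the lag that maximises the
    cross-correlation between the two void-fraction signals."""
    up = upstream - upstream.mean()
    down = downstream - downstream.mean()
    corr = np.correlate(down, up, mode="full")
    lag = int(np.argmax(corr)) - (len(up) - 1)  # frames downstream lags upstream
    if lag <= 0:
        raise ValueError("no positive transit time detected")
    return spacing * frame_rate / lag           # m/s

# synthetic signals: a Gaussian bubble signature delayed by 80 frames
rng = np.random.default_rng(1)
bubble = np.exp(-0.5 * ((np.arange(400) - 100) / 5.0) ** 2)
up_sig = bubble + 0.001 * rng.standard_normal(400)
down_sig = np.roll(bubble, 80) + 0.001 * rng.standard_normal(400)
print(transit_time_velocity(up_sig, down_sig, FRAME_RATE_HZ, SENSOR_SPACING_M))
```

The three-grid sensors mentioned above refine this idea by correlating individual bubble signatures rather than whole signals.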

  13. Wire-mesh sensors: an experimental tool for two-phase CFD model development and code validation

    International Nuclear Information System (INIS)

    Prasser, H.M.

    2004-01-01

    Full text of publication follows: The Institute of Safety Research of the Forschungszentrum Rossendorf, Germany, has developed electrode-mesh sensors, which allow the measurement of the electrical conductivity distribution in a flow duct. This can be used either for the detection of the gaseous phase in a gas-liquid flow or for mixing studies in single-phase flow, when the components have different electric conductivities. Two grids of crossing wires are placed into the flow closely behind each other. The wires of the first plane (transmitter plane) are supplied with pulses of a driving voltage in successive order. The data acquisition is done by measuring the electrical currents arriving at the second grid (receiver wires). After the last transmitter electrode has been activated, a two-dimensional matrix is available that reflects the conductivities at the crossing points of the electrodes of the two grids. Sequences of these 2D distributions are recorded at a rate of up to 10 kHz. Due to the high measuring rate, each bubble is mapped in several successive instantaneous frames. This makes it possible to obtain bubble size distributions as well as bubble-size-resolved gas fraction profiles, besides the visualisation and the calculation of profiles of the time-averaged void fraction. Two sensors placed behind each other can furthermore be used for bubble velocity measurements using cross-correlation techniques. Sensors with three layers of electrode grids can be used for the measurement of the velocity of individual bubbles. The sensor is widely used to study the evolution of the flow pattern in an upwards air-water flow. The experiments aim at closure equations describing forces acting on bubbles as well as coalescence and fragmentation frequencies for the implementation in CFD codes. The largest sensor used until now has a circular measuring cross-section of about 200 mm diameter and is equipped with two grids of 64 wires.
Therefore, the spatial resolution is 3 mm, the measuring

  14. DTU PMU Laboratory Development - Testing and Validation

    DEFF Research Database (Denmark)

    Garcia-Valle, Rodrigo; Yang, Guang-Ya; Martin, Kenneth E.

    2010-01-01

    This is a report of the results of phasor measurement unit (PMU) laboratory development and testing done at the Centre for Electric Technology (CET), Technical University of Denmark (DTU). Analysis of the PMU performance first required the development of tools to convert the DTU PMU data into IEEE...... standard, and the validation is done for the DTU-PMU via a validated commercial PMU. The commercial PMU has been tested from the authors' previous efforts, where the response can be expected to follow known patterns and provide confirmation about the test system to confirm the design and settings....... In a nutshell, having 2 PMUs that observe same signals provides validation of the operation and flags questionable results with more certainty. Moreover, the performance and accuracy of the DTU-PMU is tested acquiring good and precise results, when compared with a commercial phasor measurement device, PMU-1....

  15. The Development and Validation of an In Vitro Airway Model to Assess Realistic Airway Deposition and Drug Permeation Behavior of Orally Inhaled Products Across Synthetic Membranes.

    Science.gov (United States)

    Huynh, Bao K; Traini, Daniela; Farkas, Dale R; Longest, P Worth; Hindle, Michael; Young, Paul M

    2018-04-01

    Current in vitro approaches to assess lung deposition, dissolution, and cellular transport behavior of orally inhaled products (OIPs) have relied on compendial impactors to collect drug particles that are likely to deposit in the airway; however, the main drawback with this approach is that these impactors do not reflect the airway and may not necessarily represent drug deposition behavior in vivo. The aim of this article is to describe the development and method validation of a novel hybrid in vitro approach to assess drug deposition and permeation behavior in a more representative airway model. The medium-sized Virginia Commonwealth University (VCU) mouth-throat (MT) and tracheal-bronchial (TB) realistic upper airway models were used in this study as representative models of the upper airway. The TB model was modified to accommodate two Snapwell® inserts above the first TB airway bifurcation region to collect deposited nebulized ciprofloxacin-hydrochloride (CIP-HCL) droplets as a model drug aerosol system. Permeation characteristics of deposited nebulized CIP-HCL droplets were assessed across different synthetic membranes using the Snapwell test system. The Snapwell test system demonstrated reproducible and discriminatory drug permeation profiles for already dissolved and nebulized CIP-HCL droplets through a range of synthetic permeable membranes under different test conditions. The rate and extent of drug permeation depended on the permeable membrane material used, presence of a stirrer in the receptor compartment, and, most importantly, the drug collection method. This novel hybrid in vitro approach, which incorporates a modified version of a realistic upper airway model, coupled with the Snapwell test system holds great potential to evaluate postairway deposition characteristics, such as drug permeation and particle dissolution behavior of OIPs.
Future studies will expand this approach using a cell culture-based setup instead of synthetic membranes, within a

  16. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  17. Development and validation of a terrestrial biotic ligand model predicting the effect of cobalt on root growth of barley (Hordeum vulgare)

    International Nuclear Information System (INIS)

    Lock, K.; De Schamphelaere, K.A.C.; Becaus, S.; Criel, P.; Van Eeckhout, H.; Janssen, C.R.

    2007-01-01

    A Biotic Ligand Model was developed predicting the effect of cobalt on root growth of barley (Hordeum vulgare) in nutrient solutions. The extent to which Ca2+, Mg2+, Na+ and K+ ions and pH independently affect cobalt toxicity to barley was studied. With increasing activities of Mg2+, and to a lesser extent also K+, the 4-d EC50 (expressed as Co2+ activity) increased linearly, while Ca2+, Na+ and H+ activities did not affect Co2+ toxicity. Stability constants for the binding of Co2+, Mg2+ and K+ to the biotic ligand were obtained: log K(CoBL) = 5.14, log K(MgBL) = 3.86 and log K(KBL) = 2.50. Limited validation of the model with one standard artificial soil and one standard field soil showed that the 4-d EC50 (Co2+) could only be predicted within a factor of four of the observed values, indicating that further refinement of the BLM is needed. - Biotic Ligand Models are not only a useful tool to assess metal toxicity in aquatic systems but can also be used for terrestrial plants
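
A minimal sketch of how stability constants like those reported above feed an EC50 prediction, assuming the standard competitive BLM expression. The f50 parameter (fraction of biotic-ligand sites occupied at 50% effect) is not given in the abstract, so the value below is a placeholder assumption:

```python
# Stability constants of the kind reported in the record (log10 values)
K_CO = 10 ** 5.14   # Co2+ binding to the biotic ligand
K_MG = 10 ** 3.86   # Mg2+ competition
K_K  = 10 ** 2.50   # K+ competition

def ec50_co(mg_activity, k_activity, f50=0.30):
    """Predicted 4-d EC50 as free Co2+ activity (mol/L) using the
    standard competitive BLM expression. f50 is an assumed
    placeholder, not a value from the paper."""
    return (f50 / (1.0 - f50)) * (1.0 + K_MG * mg_activity + K_K * k_activity) / K_CO

# EC50 increases linearly with Mg2+ activity, as the record describes:
low  = ec50_co(mg_activity=1e-4, k_activity=1e-4)
high = ec50_co(mg_activity=1e-3, k_activity=1e-4)
print(high > low)  # → True: more Mg2+ competition means a higher EC50
```

This linear dependence of EC50 on competing-cation activity is exactly the behavior the abstract reports for Mg2+ and K+.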

  18. Simulating the Effects of Agricultural Management on Water Quality Dynamics in Rice Paddies for Sustainable Rice Production—Model Development and Validation

    Directory of Open Access Journals (Sweden)

    Soon-Kun Choi

    2017-11-01

    Full Text Available The Agricultural Policy/Environmental eXtender (APEX) model is widely used for evaluating agricultural conservation efforts and their effects on soil and water. A key component of APEX application in Korea is simulating the water quality impacts of rice paddies because rice agriculture claims the largest cropland area in the country. In this study, a computational module called APEX-Paddy (National Academy of Agricultural Sciences, Wanju, Korea) is developed to simulate water quality, considering pertinent paddy management practices such as puddling and flood irrigation management. Data collected at two experimental paddy sites in Korea were used to calibrate and validate the model. Results indicate that APEX-Paddy performs well in predicting runoff discharge rate and nitrogen yield, while the original APEX highly overestimates runoff rates and nitrogen yields on large storm events. With APEX-Paddy, simulated and observed flow and mineral nitrogen yield (QN) are found to be highly correlated after calibration (Nash-Sutcliffe Efficiency (NSE) = 0.87 and Percent Bias (PBIAS) = −14.6% for flow; NSE = 0.68 and PBIAS = 2.1% for QN). Consequently, APEX-Paddy showed greater accuracy in flow and QN prediction than the original APEX modeling practice using the SCS-CN (Soil Conservation Service Curve Number) method.
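
The NSE and PBIAS statistics used to judge APEX-Paddy above can be computed as follows; the toy flow series is invented for illustration:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit; 0 means the
    model predicts no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent Bias, under the convention in which negative values
    indicate model overestimation (as in the abstract above)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# toy daily-flow example (values invented for illustration)
observed  = [1.2, 3.4, 8.1, 5.0, 2.2, 1.0]
simulated = [1.0, 3.9, 7.5, 5.6, 2.0, 1.3]
print(round(nse(observed, simulated), 2))   # → 0.97
print(round(pbias(observed, simulated), 1)) # → -1.9
```

By these definitions the reported PBIAS of −14.6% for flow means APEX-Paddy slightly overestimated total flow, while the original APEX's large overestimation would drive PBIAS strongly negative.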

  19. Development and validation of a computational model to study the effect of foot constraint on ankle injury due to external rotation.

    Science.gov (United States)

    Wei, Feng; Hunley, Stanley C; Powell, John W; Haut, Roger C

    2011-02-01

    Recent studies, using two different manners of foot constraint, potted and taped, document altered failure characteristics in the human cadaver ankle under controlled external rotation of the foot. The posterior talofibular ligament (PTaFL) was commonly injured when the foot was constrained in potting material, while the frequency of deltoid ligament injury was higher for the taped foot. In this study an existing multibody computational modeling approach was validated to include the influence of foot constraint, determine the kinematics of the joint under external foot rotation, and consequently obtain strains in various ligaments. It was hypothesized that the location of ankle injury due to excessive levels of external foot rotation is a function of foot constraint. The results from this model simulation supported this hypothesis and helped to explain the mechanisms of injury in the cadaver experiments. An excessive external foot rotation might generate a PTaFL injury for a rigid foot constraint, and an anterior deltoid ligament injury for a pliant foot constraint. The computational models may be further developed and modified to simulate the human response for different shoe designs, as well as on various athletic shoe-surface interfaces, so as to provide a computational basis for optimizing athletic performance with minimal injury risk.

  20. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    In this paper, a review is presented of the various methods which ... to make a direct and objective comparison of specific dynamic properties, measured ... stiffness matrix is available from the analytical model, is that of reducing or condensing.

  1. Validating EHR clinical models using ontology patterns.

    Science.gov (United States)

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
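
The "Shape" concept that makes SHACL suitable for closed-world validation can be illustrated with a plain-Python analogy. This is not SHACL itself and not a CIMI model; the shape, record fields, and values are invented to show only the cardinality/datatype idea:

```python
# A SHACL "shape" describes expected cardinalities and datatypes for a
# node. This dict-based analogy shows the closed-world flavour: data
# not licensed by the shape is reported as a violation.
blood_pressure_shape = {
    "systolic":  {"type": (int, float), "min_count": 1, "max_count": 1},
    "diastolic": {"type": (int, float), "min_count": 1, "max_count": 1},
    "unit":      {"type": (str,),       "min_count": 1, "max_count": 1},
}

def validate(record, shape):
    """Return a list of violation messages for a dict-based record."""
    errors = []
    for prop, rules in shape.items():
        values = record.get(prop, [])
        if not isinstance(values, list):
            values = [values]
        if len(values) < rules["min_count"]:
            errors.append(f"{prop}: expected at least {rules['min_count']} value(s)")
        if len(values) > rules["max_count"]:
            errors.append(f"{prop}: expected at most {rules['max_count']} value(s)")
        for v in values:
            if not isinstance(v, rules["type"]):
                errors.append(f"{prop}: {v!r} has wrong datatype")
    # closed-world flavour: properties outside the shape are violations too
    for prop in record:
        if prop not in shape:
            errors.append(f"{prop}: not allowed by the shape")
    return errors

good = {"systolic": 120, "diastolic": 80, "unit": "mmHg"}
bad  = {"systolic": "high", "unit": "mmHg", "note": "free text"}
print(validate(good, blood_pressure_shape))      # → []
print(len(validate(bad, blood_pressure_shape)))  # → 3
```

Real SHACL expresses the same constraints declaratively over RDF graphs (sh:minCount, sh:maxCount, sh:datatype), which is how the paper detects modeling and terminology-binding errors in the CIMI models.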

  2. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence that a method is suitable for use with a particular population. Thus, when we address the application of security policy models, we are concerned with the implementation of a certain policy, taking into consideration the required standards, through attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy in an information technology environment is a condition sine qua non for the strategic management process of any organization. On this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer, who strives to keep technologies up to date in order to meet all-inclusive business continuity planning policies. For such a policy to be effective, however, it has to be fully embraced by the Chief Executive Officer. This study was developed to validate specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo. The sampling was based on the representativeness of these industries, particularly with regard to each one's implementation of information technology in the region. The study concludes by presenting evidence of the discriminant validity of four key dimensions of security policy: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. Analysis of the Cronbach's alpha structure of these security items shows not only that the capacity of these industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.

  3. Numerical modelling of the long-term evolution of EDZ. Development of material models, implementation in finite-element codes, and validation

    International Nuclear Information System (INIS)

    Pudewills, A.

    2005-11-01

    Construction of deep underground structures disturbs the initial stress field in the surrounding rock. This effect can generate microcracks and alter the hydromechanical properties of the rock salt around the excavations. For the long-term performance of an underground repository in rock salt, the evolution of the 'Excavation Disturbed Zone' (EDZ) and the hydromechanical behaviour of this zone represent important issues with respect to the integrity of the geological and technical barriers. Within the framework of the NF-PRO project, WP 4.4, attention focuses on the mathematical modelling of the development and evolution of the EDZ in the rock near a disposal drift due to its relevance on the integrity of the geological and technical barriers. To perform this task, finite-element codes containing a set of time- and temperature-dependent constitutive models have been improved. A new viscoplastic constitutive model for rock salt that can describe the damage of the rock has been implemented in the finite-element codes available. The model parameters were evaluated based on experimental results. Additionally, the long-term evolution of the EDZ around a gallery in a salt mine at about 700 m below the surface was analysed and the numerical results were compared with in-situ measurements. The calculated room closure, stress distribution and the increase of rock permeability in the EDZ were compared with in-situ data, thus providing confidence in the model used. (orig.)

  4. Validated modified Lycopodium spore method development for ...

    African Journals Online (AJOL)

    A validated, modified Lycopodium spore method has been developed for simple and rapid quantification of powdered herbal drugs. The Lycopodium spore method was performed on ingredients of Shatavaryadi churna, an ayurvedic formulation used as an immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of ...

  5. Tracer travel time and model validation

    International Nuclear Information System (INIS)

    Tsang, Chin-Fu.

    1988-01-01

    The performance assessment of a nuclear waste repository demands much more than the safety evaluation of civil constructions such as dams, or the resource evaluation of a petroleum or geothermal reservoir. It involves the estimation of low-probability (low-concentration) radionuclide transport extrapolated thousands of years into the future. Thus models used to make these estimates need to be carefully validated. A number of recent efforts have been devoted to the study of this problem. Some general comments on model validation were given by Tsang. The present paper discusses some issues of validation with regard to radionuclide transport. 5 refs

  6. Construct validity of the Moral Development Scale for Professionals (MDSP).

    Science.gov (United States)

    Söderhamn, Olle; Bjørnestad, John Olav; Skisland, Anne; Cliffordson, Christina

    2011-01-01

    The aim of this study was to investigate the construct validity of the Moral Development Scale for Professionals (MDSP) using structural equation modeling. The instrument is a 12-item self-report instrument, developed in the Scandinavian cultural context and based on Kohlberg's theory. A hypothesized simplex structure model underlying the MDSP was tested through structural equation modeling. Validity was also tested as the proportion of respondents older than 20 years that reached the highest moral level, which according to the theory should be small. A convenience sample of 339 nursing students with a mean age of 25.3 years participated. Results confirmed the simplex model structure, indicating that MDSP reflects a moral construct empirically organized from low to high. A minority of respondents >20 years of age (13.5%) scored more than 80% on the highest moral level. The findings support the construct validity of the MDSP and the stages and levels in Kohlberg's theory.

  7. Development and validation of sodium fire codes

    International Nuclear Information System (INIS)

    Morii, Tadashi; Himeno, Yoshiaki; Miyake, Osamu

    1989-01-01

    Development, verification, and validation of the spray fire code SPRAY-3M, the pool fire codes SOFIRE-M2 and SPM, the aerosol behavior code ABC-INTG, and the simultaneous spray and pool fires code ASSCOPS are presented. In addition, the current state of development of the multi-dimensional natural convection code SOLFAS, for the analysis of heat and mass transfer during a fire, is presented. (author)

  8. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  9. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  10. Development and validation of a multi-zone combustion model for performance and nitric oxide formation in syngas fueled spark ignition engine

    International Nuclear Information System (INIS)

    Rakopoulos, C.D.; Michos, C.N.

    2008-01-01

    The development of a zero-dimensional, multi-zone combustion model is presented for predicting the performance and nitric oxide (NO) emissions of a spark ignition (SI) engine. The model is validated against experimental data from a multi-cylinder, four-stroke, turbocharged and aftercooled, SI gas engine running on syngas fuel. This alternative fuel, the combustible part of which consists mainly of CO and H2, with the rest containing non-combustible gases, has recently been identified as a promising substitute for fossil fuels in view of environmentally friendly engine operation. The basic concept of the model is the division of the burned gas into several distinct zones, unlike the simpler two-zone models, to take into account the temperature stratification of the burned mixture during combustion. This is especially important for accurate NO emissions predictions, since NO formation is strongly temperature dependent. The multi-zone formulation provides the chemical species concentration gradient existing in the burned zones, as well as the relative contribution of each burned zone to the total in-cylinder NO formation. The burning rate required as input to the model is expressed as a Wiebe function, fitted to experimentally derived burn rates. All of the model's constants are calibrated at one operating point and then kept unchanged. Zone-resolved combustion-related information is obtained, assisting in the understanding of the complex phenomena occurring during combustion in SI engines. Combustion characteristics of the lean-burn gas engine tested are provided for the complete load range, aiding the interpretation of its performance and knocking tendency. Computed NO emissions from the multi-zone model for various values of the engine load (i.e. air-fuel ratios) are presented and found to be in good agreement with the respective experimental ones, providing confidence in the predictive capability of the model. The superiority of the multi-zone model over its two
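The Wiebe function mentioned above has a standard closed form, x_b = 1 − exp(−a·((θ−θ0)/Δθ)^(m+1)). A minimal sketch with assumed efficiency and form factors (a = 5, m = 2; in the study these are fitted to measured burn rates):

```python
import math

def wiebe_burn_fraction(theta, theta0, dtheta, a=5.0, m=2.0):
    """Cumulative mass fraction burned at crank angle theta (deg),
    for combustion starting at theta0 and lasting dtheta degrees.
    a and m are the Wiebe efficiency and form factors (assumed values,
    normally fitted to experimentally derived burn rates)."""
    if theta <= theta0:
        return 0.0
    x = (theta - theta0) / dtheta
    return 1.0 - math.exp(-a * min(x, 1.0) ** (m + 1))
```

With a = 5 the function reaches 1 − exp(−5) ≈ 0.993 at the end of the burn duration, the conventional "complete combustion" value.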

  11. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    -fractures with flowing water and rock with porosity accessible only by diffusion. The approach furthermore assumes that the properties within the two porosity domains are averaged and that the transfer between the two domains is also averaged. It is an important validation issue to verify that effective averaging of parameters can be performed and that suitable values can be derived. It can be shown that matrix interaction properties along a flow path can be integrated into an effective value, and if the matrix depth can be considered infinite, effective values may also be derived for the diffusion and sorption parameters. Thus, it is possible to derive effective parameters for sorbing radionuclides incorporating the total matrix effects along a flow path. This is strictly valid only for cases with no dispersion, but gives a good approximation as long as dispersion does not dominate the transport. FARF31 has been tested and compared with analytical solutions and other models and was found to correspond well within a wide range of input parameters. Support and documentation on how to use FARF31 are two important components for avoiding calculation mistakes and obtaining trustworthy results. The documentation describes handling and updates of the code. Test cases have been constructed which can be used to check updates and serve as templates. The development of the code is kept under source code control to fulfil quality assurance requirements. The model is deemed to be well suited for performance assessments within the SKB framework.

  12. Development and validation of sodium fire analysis code ASSCOPS

    International Nuclear Information System (INIS)

    Ohno, Shuji

    2001-01-01

    Version 2.1 of the ASSCOPS sodium fire analysis code was developed to evaluate the thermal consequences of a sodium leak and consequent fire in LMFBRs. This report describes the computational models and the validation studies performed with the code. ASSCOPS calculates sodium droplet and pool fires and the consequent heat/mass transfer behavior. Analyses of sodium pool and spray fire experiments confirmed that this code and the parameters used in the validation studies give valid results on the thermal consequences of sodium leaks and fires. (author)

  13. Development and validation of models for simulation of supercritical carbon dioxide Brayton cycles and application to self-propelling heat removal systems in boiling water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Venker, Jeanne

    2015-03-31

    The objective of the current work was to develop a model that is able to describe the transient behavior of supercritical carbon dioxide (sCO2) Brayton cycles, to be applied to self-propelling residual heat removal systems in boiling water reactors. The developed model has been implemented into the thermohydraulic system code ATHLET. By means of this improved ATHLET version, novel residual heat removal systems, which are based on closed sCO2 Brayton cycles, can be assessed as a retrofit measure for present light water reactors. Transient simulations are hereby of great importance. The heat removal system has to be modeled explicitly to account for the interaction between the system and the behavior of the plant during different accident conditions. As a first step, transport and thermodynamic fluid properties of supercritical carbon dioxide have been implemented in ATHLET to allow for the simulation of the new working fluid. Additionally, a heat transfer correlation has been selected to represent the specific heat transfer of supercritical carbon dioxide. For the calculation of pressure losses due to wall friction, an approach for turbulent single phase flow has been adopted that is already implemented in ATHLET. In a second step, a component model for radial compressors has been implemented in the system code. Furthermore, the available model for axial turbines has been adapted to simulate the transient behavior of radial turbines. All extensions have been validated against experimental data. In order to simulate the interaction between the self-propelling heat removal system and a generic boiling water reactor, the components of the sCO2 Brayton cycle have been dimensioned with first principles. An available input deck of a generic BWR has then been extended by the residual heat removal system. The modeled application has shown that the extended version of ATHLET is suitable to simulate sCO2 Brayton cycles and to evaluate the introduced heat removal system.

  14. Development and validation of models for simulation of supercritical carbon dioxide Brayton cycles and application to self-propelling heat removal systems in boiling water reactors

    International Nuclear Information System (INIS)

    Venker, Jeanne

    2015-01-01

    The objective of the current work was to develop a model that is able to describe the transient behavior of supercritical carbon dioxide (sCO2) Brayton cycles, to be applied to self-propelling residual heat removal systems in boiling water reactors. The developed model has been implemented into the thermohydraulic system code ATHLET. By means of this improved ATHLET version, novel residual heat removal systems, which are based on closed sCO2 Brayton cycles, can be assessed as a retrofit measure for present light water reactors. Transient simulations are hereby of great importance. The heat removal system has to be modeled explicitly to account for the interaction between the system and the behavior of the plant during different accident conditions. As a first step, transport and thermodynamic fluid properties of supercritical carbon dioxide have been implemented in ATHLET to allow for the simulation of the new working fluid. Additionally, a heat transfer correlation has been selected to represent the specific heat transfer of supercritical carbon dioxide. For the calculation of pressure losses due to wall friction, an approach for turbulent single phase flow has been adopted that is already implemented in ATHLET. In a second step, a component model for radial compressors has been implemented in the system code. Furthermore, the available model for axial turbines has been adapted to simulate the transient behavior of radial turbines. All extensions have been validated against experimental data. In order to simulate the interaction between the self-propelling heat removal system and a generic boiling water reactor, the components of the sCO2 Brayton cycle have been dimensioned with first principles. An available input deck of a generic BWR has then been extended by the residual heat removal system. The modeled application has shown that the extended version of ATHLET is suitable to simulate sCO2 Brayton cycles and to evaluate the introduced heat removal system.
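For orientation only, a first-order estimate of simple Brayton-cycle efficiency can be written down under ideal-gas assumptions. The actual ATHLET extension uses real-gas sCO2 properties, so this sketch, with assumed component efficiencies and an assumed heat-capacity ratio, is purely illustrative:

```python
def brayton_thermal_efficiency(pr, t_in_comp, t_in_turb,
                               eta_c=0.85, eta_t=0.9, gamma=1.29):
    """First-order thermal efficiency of a simple (non-recuperated)
    Brayton cycle under ideal-gas assumptions. pr: pressure ratio,
    temperatures in kelvin, eta_c/eta_t: isentropic efficiencies
    (assumed values). Real sCO2 cycles require real-gas properties,
    as implemented in ATHLET; this is only an orientation sketch."""
    k = (gamma - 1.0) / gamma
    # compressor outlet temperature with isentropic efficiency eta_c
    t2 = t_in_comp * (1.0 + (pr ** k - 1.0) / eta_c)
    # turbine outlet temperature with isentropic efficiency eta_t
    t4 = t_in_turb * (1.0 - eta_t * (1.0 - pr ** -k))
    w_net = (t_in_turb - t4) - (t2 - t_in_comp)  # per unit cp*mass flow
    q_in = t_in_turb - t2
    return w_net / q_in

# Illustrative operating point: compressor inlet near the critical
# temperature of CO2, moderate turbine inlet temperature
eff = brayton_thermal_efficiency(pr=3.0, t_in_comp=305.0, t_in_turb=550.0)
```

The low efficiency such a non-recuperated, ideal-gas estimate produces is one reason practical sCO2 cycles rely on recuperation and real-gas compression near the critical point.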

  15. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been
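The report's model itself is not reproduced here, but the basic attenuated-Bloom-filter mechanic (a stack of Bloom filters in which layer d summarizes context reachable at d hops, and a filter received from a neighbour is shifted one layer deeper when merged) can be sketched as follows; all sizes, names and hash choices are illustrative:

```python
import hashlib

class AttenuatedBloomFilter:
    """Minimal sketch: layer d holds a Bloom summary of the context
    sources reachable at d hops. Parameters are illustrative, not
    those of the report's model."""
    def __init__(self, depth=3, bits=64, hashes=3):
        self.depth, self.bits, self.hashes = depth, bits, hashes
        self.layers = [0] * depth  # each layer is a bit mask

    def _positions(self, item):
        for i in range(self.hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:4], "big") % self.bits

    def add_local(self, service):
        """Advertise a locally offered context source (layer 0)."""
        for p in self._positions(service):
            self.layers[0] |= 1 << p

    def merge_from_neighbour(self, other):
        """Incorporate a neighbour's filter, attenuated by one hop:
        its layer d lands in our layer d+1; the deepest layer is dropped."""
        for d in range(self.depth - 1):
            self.layers[d + 1] |= other.layers[d]

    def may_reach(self, service):
        """Smallest hop count at which the service may be reachable, or
        None. Bloom filters can yield false positives, never false negatives."""
        mask = 0
        for p in self._positions(service):
            mask |= 1 << p
        for d, layer in enumerate(self.layers):
            if layer & mask == mask:
                return d
        return None
```

Queries are then forwarded only toward neighbours whose filters suggest a match at some layer, which is what makes discovery traffic scale better than flooding.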

  16. BIOMOVS: an international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1988-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (author)

  17. BIOMOVS: An international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1987-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (orig.)

  18. Validation of nuclear models used in space radiation shielding applications

    International Nuclear Information System (INIS)

    Norman, Ryan B.; Blattnig, Steve R.

    2013-01-01

    A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.
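One plausible form of such an interval-based metric (not necessarily the paper's exact definition) treats the measurement uncertainty as an interval and scores only the distance by which a prediction falls outside it, aggregated with a median for robustness against outlying cross sections:

```python
def interval_miss(prediction, measurement, uncertainty):
    """Relative distance by which a model prediction falls outside the
    experimental interval [measurement - uncertainty, measurement + uncertainty].
    Zero means the prediction is consistent with the data. This is one
    plausible form of such a metric, not the paper's exact definition."""
    lo, hi = measurement - uncertainty, measurement + uncertainty
    if lo <= prediction <= hi:
        return 0.0
    gap = (lo - prediction) if prediction < lo else (prediction - hi)
    return gap / abs(measurement)

def median_interval_miss(records):
    """Aggregate over (prediction, measurement, uncertainty) records with
    the median, a robust summary across a large cross-section database."""
    misses = sorted(interval_miss(*r) for r in records)
    n = len(misses)
    mid = n // 2
    return misses[mid] if n % 2 else 0.5 * (misses[mid - 1] + misses[mid])
```

A cumulative (mean) variant of the same quantity answers the overall-accuracy question, while the median variant is less sensitive to a few badly modeled reactions.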

  19. Development and validation of a preoperative prediction model for colorectal cancer T-staging based on MDCT images and clinical information.

    Science.gov (United States)

    Sa, Sha; Li, Jing; Li, Xiaodong; Li, Yongrui; Liu, Xiaoming; Wang, Defeng; Zhang, Huimao; Fu, Yu

    2017-08-15

    This study aimed to establish and evaluate the efficacy of a prediction model for colorectal cancer T-staging. The clinical, imaging and pathological data of 611 patients with colorectal cancer (419 patients in the training group and 192 patients in the validation group) were collected. A Spearman correlation analysis was used to assess the relationship between these factors and pathological T-staging. A prediction model was trained with the random forest algorithm, and T-staging of the patients in the validation group was predicted by both the prediction model and the traditional method; consistency, accuracy, sensitivity, specificity and area under the curve (AUC) were used to compare the efficacy of the two methods. T-staging was positively correlated with the level of carcinoembryonic antigen (CEA), expression of carbohydrate antigen 19-9 (CA19-9), wall deformity, blurred outer edges, fat infiltration, infiltration into the surrounding tissue, tumor size and wall thickness. Age, location, enhancement rate and enhancement homogeneity were negatively correlated with T-staging. The predictive results of the model were consistent with the pathological gold standard, with a kappa value of 0.805. The total accuracy of staging improved from 51.04% to 86.98% with the proposed model. The newly established comprehensive model can improve the predictive efficiency of preoperative colorectal cancer T-staging.
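The train/validation protocol described above can be sketched with scikit-learn's random forest standing in for the paper's implementation. The data below are synthetic stand-ins with made-up features, so the resulting scores are meaningless except as a demonstration of the split and metrics:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)
# Synthetic stand-ins: 611 patients, 12 clinical/imaging features, T stage 1-4
X = rng.normal(size=(611, 12))
y = rng.integers(1, 5, size=611)

# 419 training / 192 validation patients, mirroring the study's split
X_train, X_val = X[:419], X[419:]
y_train, y_val = y[:419], y[419:]

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_val)

# Agreement with the reference staging: overall accuracy and Cohen's kappa
acc = accuracy_score(y_val, pred)
kappa = cohen_kappa_score(y_val, pred)
```

Cohen's kappa is the study's consistency measure against the pathological gold standard; on real data a value of 0.805, as reported, indicates substantial agreement.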

  20. Code Development and Validation Towards Modeling and Diagnosing Current Redistribution in an ITER-Type Superconducting Cable Subject to Current Imbalance

    International Nuclear Information System (INIS)

    Zani, L.; Gille, P.E.; Gonzales, C.; Kuppel, S.; Torre, A.

    2009-01-01

    In the framework of ITER magnet R&D activities, a significant number of conductor short-samples or inserts were tested throughout the past decades, either for development of cable layouts or for industrial qualification. On a certain number of them, critical property degradations were encountered, some of which were identified as being caused by current imbalance between the different strand bundles twisted inside the cable. In order to analyse those samples as reliably as possible, CEA developed a dedicated home code named Coupled Algorithm Resistive Modelling Electrical Network (CARMEN), with two specific functionalities: a first routine computes strand bundle trajectories, down to the individual strand scale, which allows a realistic E(J) law to be obtained over the full conductor length; a second routine models inter-bundle current redistribution, taking into account the magnetic field map, using a discrete electrical network whose sections incorporate the E(J) law obtained from the first routine. As a result, the E-J or E-T curves can be calculated and compared to the experimental data, provided adapted inputs on sample features are considered, such as strand contact resistances in joints, inter-bundle resistances or cable geometry. In a first part, the paper describes the different hypotheses behind the code structure; in a second part, the application to the ITER TFCI insert coil is presented, focusing particularly on validating the potential use of the code as a diagnostic tool for probing current imbalance.

  1. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  2. Development and validation of an animal model of prostate inflammation-induced chronic pelvic pain: evaluating from inflammation of the prostate to pain behavioral modifications.

    Directory of Open Access Journals (Sweden)

    Feng Zeng

    BACKGROUND: Chronic prostatitis/chronic pelvic pain syndrome (CP/CPPS) is the most common type of prostatitis. Partly due to the lack of a suitable animal model, the pathogenesis of this condition is obscure. In the current study we developed and validated an animal model for nonbacterial prostatitis and prostate inflammation-induced chronic pelvic pain in rats, using intraprostatic injection of λ-carrageenan. METHODS: Male Sprague-Dawley rats weighing 250-350 g were used for the experiments. After intraprostatic injection of 3% λ-carrageenan, at different time points (24 h, 7 d, 14 d and 30 d after injection), radiant heat and von Frey filaments were applied to the scrotum of rats to measure the heat and mechanical thresholds, respectively. The prostate was then removed for histology, and cyclooxygenase (COX)-2 protein expression was determined by Western blot. Evans blue (50 mg/kg) was also injected intravenously to assess plasma protein extravasation at different time points after injection of λ-carrageenan. RESULTS: Compared to the control group, inflamed animals showed a significant reduction in mechanical threshold (mechanical allodynia) at 24 h and 7 d (p = 0.022 and 0.046, respectively) and a significant reduction in heat threshold (thermal hyperalgesia) at 24 h, 7 d and 14 d (p = 0.014, 0.018 and 0.002, respectively) in the scrotal skin. Significant increases in inflammatory cell accumulation, COX-2 expression and Evans blue extravasation were observed at 24 h, 7 d and 14 d after injection. CONCLUSIONS: Intraprostatic λ-carrageenan injection induced neurogenic prostatitis and prostate inflammation pain, which lasted at least 2 weeks. The current model is expected to be a valuable preclinical tool to study the neurobiological mechanisms of male chronic pelvic pain.

  3. Validation of A Global Hydrological Model

    Science.gov (United States)

    Doell, P.; Lehner, B.; Kaspar, F.; Vassolo, S.

    Freshwater availability has been recognized as a global issue, and its consistent quantification not only in individual river basins but also at the global scale is required to support the sustainable use of water. The Global Hydrology Model WGHM, which is a submodel of the global water use and availability model WaterGAP 2, computes surface runoff, groundwater recharge and river discharge at a spatial resolution of 0.5°. WGHM is based on the best global data sets currently available, including a newly developed drainage direction map and a data set of wetlands, lakes and reservoirs. It calculates both natural and actual discharge by simulating the reduction of river discharge by human water consumption (as computed by the water use submodel of WaterGAP 2). WGHM is calibrated against observed discharge at 724 gauging stations (representing about 50% of the global land area) by adjusting a parameter of the soil water balance. It not only computes the long-term average water resources but also water availability indicators that take into account the interannual and seasonal variability of runoff and discharge. The reliability of the model results is assessed by comparing observed and simulated discharges at the calibration stations and at selected other stations. We conclude that reliable results can be obtained for basins of more than 20,000 km². In particular, the 90% reliable monthly discharge is simulated well. However, there is a tendency for semi-arid and arid basins to be modeled less satisfactorily than humid ones, which is partially due to neglecting river channel losses and evaporation of runoff from small ephemeral ponds in the model. Also, the hydrology of highly developed basins with large artificial storages, basin transfers and irrigation schemes cannot be simulated well. The seasonality of discharge in snow-dominated basins is overestimated by WGHM, and if a snow-dominated basin is uncalibrated, discharge is likely to be underestimated.
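The station-by-station calibration idea (adjust one soil-water-balance parameter until simulated mean discharge matches the observed mean) can be illustrated with a toy bisection search. The `simulate` function here is an arbitrary stand-in, not WGHM, and the assumed monotonic response to the parameter is stated explicitly:

```python
def calibrate_runoff_parameter(observed_mean_discharge, simulate,
                               lo=0.1, hi=5.0, tol=1e-6, iters=100):
    """Toy stand-in for WGHM-style calibration: find the soil parameter
    gamma at which simulated mean discharge matches the observed mean at
    a gauging station. simulate(gamma) is assumed to be monotonically
    increasing in gamma within [lo, hi] and to bracket the observation."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if simulate(mid) < observed_mean_discharge:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Illustrative "model": discharge grows linearly with gamma
gamma = calibrate_runoff_parameter(42.0, simulate=lambda g: 10.0 * g)
```

In the real model the response of discharge to the calibration parameter is nonlinear, but as long as it is monotone a bracketing search of this kind converges.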

  4. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for the assessment of the performance of a given model by using an example taken from a management study.
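A minimal illustration of such quantitative performance measures on a held-out sample, using synthetic data (the paper's own measures and management-study data are not reproduced here): discrimination via AUC, calibration via the Brier score, and plain classification accuracy.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss, accuracy_score

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
# Synthetic outcome driven by the first two predictors
logit = 1.2 * X[:, 0] - 0.8 * X[:, 1]
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# Development / validation split
X_dev, X_val, y_dev, y_val = X[:350], X[350:], y[:350], y[350:]
model = LogisticRegression().fit(X_dev, y_dev)
p_val = model.predict_proba(X_val)[:, 1]

auc = roc_auc_score(y_val, p_val)                      # discrimination
brier = brier_score_loss(y_val, p_val)                 # calibration
acc = accuracy_score(y_val, (p_val > 0.5).astype(int)) # accuracy at 0.5
```

Reporting these on data not used for fitting is the essential point: in-sample versions of the same measures are optimistically biased.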

  5. Aerosol modelling and validation during ESCOMPTE 2001

    Science.gov (United States)

    Cousin, F.; Liousse, C.; Cachier, H.; Bessagnet, B.; Guillaume, B.; Rosset, R.

    The ESCOMPTE 2001 programme (Atmospheric Research 69(3-4) (2004) 241) has resulted in an exhaustive set of dynamical, radiative, gas and aerosol observations (surface and aircraft measurements). A previous paper (Atmospheric Research (2004), in press) dealt with dynamics and gas-phase chemistry. The present paper is an extension to aerosol formation, transport and evolution. To account for important loadings of primary and secondary aerosols and their transformation processes in the ESCOMPTE domain, the ORganic and Inorganic Spectral Aerosol Module ORISAM (Atmospheric Environment 35 (2001) 4751) was implemented on-line in the air-quality Meso-NH-C model, and additional developments were introduced in ORISAM to improve the comparison between simulations and experimental surface and aircraft field data. This paper discusses this comparison for a simulation performed during one selected day, 24 June 2001, during the Intensive Observation Period IOP2b. Our work relies on BC and OCp emission inventories specifically developed for ESCOMPTE. This study confirms the need for a fine-resolution aerosol inventory with spectral chemical speciation. BC levels are satisfactorily reproduced, thus validating our emission inventory and its processing through Meso-NH-C. However, comparisons for reactive species generally denote an underestimation of concentrations. Organic aerosol levels are rather well simulated, though with a trend to underestimation in the afternoon. Inorganic aerosol species are underestimated for several reasons, some of which have been identified. For sulphates, primary emissions were introduced. Improvement was also obtained for modelled nitrate and ammonium levels after introducing heterogeneous chemistry. However, the lack of modelling of terrigenous particles is probably a major cause of the nitrate and ammonium underestimations. Particle numbers and size distributions are well reproduced, but only in the submicrometer range. Our work points out

  6. A discussion on validation of hydrogeological models

    International Nuclear Information System (INIS)

    Carrera, J.; Mousavi, S.F.; Usunoff, E.J.; Sanchez-Vila, X.; Galarza, G.

    1993-01-01

    Groundwater flow and solute transport are often driven by heterogeneities that elude easy identification. It is also difficult to select and describe the physico-chemical processes controlling solute behaviour. As a result, definition of a conceptual model involves numerous assumptions both on the selection of processes and on the representation of their spatial variability. Validating a numerical model by comparing its predictions with actual measurements may not be sufficient for evaluating whether or not it provides a good representation of 'reality'. Predictions will be close to measurements, regardless of model validity, if these are taken from experiments that stress well-calibrated model modes. On the other hand, predictions will be far from measurements when model parameters are very uncertain, even if the model is indeed a very good representation of the real system. Hence, we contend that 'classical' validation of hydrogeological models is not possible. Rather, models should be viewed as theories about the real system. We propose to follow a rigorous modeling approach in which different sources of uncertainty are explicitly recognized. The application of one such approach is illustrated by modeling a laboratory uranium tracer test performed on fresh granite, which was used as Test Case 1b in INTRAVAL. (author)

  7. The Perceived Leadership Communication Questionnaire (PLCQ): Development and Validation.

    Science.gov (United States)

    Schneider, Frank M; Maier, Michaela; Lovrekovic, Sara; Retzbach, Andrea

    2015-01-01

    The Perceived Leadership Communication Questionnaire (PLCQ) is a short, reliable, and valid instrument for measuring leadership communication from both the perspective of the leader and that of the follower. Drawing on a communication-based approach to leadership and following a theoretical framework of interpersonal communication processes in organizations, this article describes the development and validation of a one-dimensional 6-item scale in four studies (total N = 604). Results from Studies 1 and 2 provide evidence for the internal consistency and factorial validity of the PLCQ's self-rating version (PLCQ-SR), a version for measuring how leaders perceive their own communication with their followers. Results from Studies 3 and 4 show internal consistency, construct validity, and criterion validity of the PLCQ's other-rating version (PLCQ-OR), a version for measuring how followers perceive the communication of their leaders. Cronbach's α averaged .80 over the four studies. All confirmatory factor analyses yielded good to excellent model fit indices. Convergent validity was established by average positive correlations of .69 with subdimensions of transformational leadership and leader-member exchange scales. Furthermore, nonsignificant correlations with socially desirable responding indicated discriminant validity. Last, criterion validity was supported by a moderately positive correlation with job satisfaction (r = .31).
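Cronbach's α, reported above, has a simple closed form, α = k/(k−1) · (1 − Σ var(item_i)/var(total)), which can be computed directly from an item-score matrix. A small sketch:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
    Sample variances (ddof=1) are used throughout."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)
```

Perfectly correlated items yield α = 1; the .80 average reported for the PLCQ sits comfortably above the conventional .70 threshold.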

  8. Development and validation of a LC-MS/MS method for assessment of an anti-inflammatory indolinone derivative by in vitro blood-brain barrier models.

    Science.gov (United States)

    Jähne, Evelyn A; Eigenmann, Daniela E; Culot, Maxime; Cecchelli, Roméo; Walter, Fruzsina R; Deli, Mária A; Tremmel, Robin; Fricker, Gert; Smiesko, Martin; Hamburger, Matthias; Oufir, Mouhssin

    2014-09-01

    The compound (E,Z)-3-(4-hydroxy-3,5-dimethoxybenzylidene)indolin-2-one (indolinone) was identified from lipophilic woad extracts (Isatis tinctoria L., Brassicaceae) as a compound possessing potent histamine release inhibitory and anti-inflammatory properties [1]. To further evaluate the potential of indolinone in terms of crossing the blood-brain barrier (BBB), we screened the compound in several in vitro cell-based human and animal BBB models. Therefore, we developed a quantitative LC-MS/MS method for the compound in modified Ringer HEPES buffer (RHB) and validated it according to FDA and EMA guidelines [2,3]. The calibration curve of indolinone in the range between 30.0 and 3000 ng/ml was quadratic, and the limit of quantification was 30.0 ng/ml. Dilution of samples up to 100-fold did not affect precision and accuracy. The carry-over was within acceptance criteria. Indolinone proved to be stable in RHB for 3 h at room temperature (RT), and for three successive freeze/thaw cycles. The processed samples could be stored in the autosampler at 10°C for at least 28 h. Moreover, indolinone was stable for at least 16 days in RHB when stored below -65°C. This validation study demonstrates that our method is specific, selective, precise, accurate, and capable of producing reliable results. In the immortalized human BBB mono-culture model, the apparent permeability coefficient from apical to basolateral (Papp A→B) and the Papp from basolateral to apical (Papp B→A) were 19.2±0.485×10⁻⁶ cm/s and 21.7±0.326×10⁻⁶ cm/s, respectively. For the primary rat/bovine BBB co-culture model, a Papp A→B of 27.1±1.67×10⁻⁶ cm/s was determined. In the primary rat BBB triple co-culture model, the Papp A→B and the Papp B→A were 56.2±3.63×10⁻⁶ cm/s and 34.6±1.41×10⁻⁶ cm/s, respectively. The data obtained with the different models showed good correlation and were indicative of a high BBB permeation potential of indolinone, confirmed by in silico prediction calculations.
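The apparent permeability coefficients reported above follow the standard definition Papp = (dQ/dt)/(A·C0). A small helper, with units chosen so the result comes out directly in cm/s (the example values below are illustrative, not the study's raw data):

```python
def apparent_permeability(dq_dt_ng_per_s, area_cm2, c0_ng_per_ml):
    """Apparent permeability coefficient Papp = (dQ/dt) / (A * C0), in cm/s.
    dq_dt_ng_per_s: transport rate of compound into the receiver compartment (ng/s),
    area_cm2: membrane/insert area (cm^2),
    c0_ng_per_ml: initial donor concentration (ng/ml).
    Since 1 ml = 1 cm^3, the units reduce to cm/s."""
    return dq_dt_ng_per_s / (area_cm2 * c0_ng_per_ml)

# Illustrative values: 0.06 ng/s across a 1.12 cm^2 insert from a
# 3000 ng/ml donor gives a Papp of roughly 17.9 x 10^-6 cm/s,
# i.e. within the range reported for the mono-culture model
papp = apparent_permeability(0.06, 1.12, 3000.0)
```

Comparing Papp A→B with Papp B→A, as done in the study, additionally indicates whether active efflux is at play (a ratio near 1 suggests mainly passive permeation).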

  9. A comprehensive model for piezoceramic actuators: modelling, validation and application

    International Nuclear Information System (INIS)

    Quant, Mario; Elizalde, Hugo; Flores, Abiud; Ramírez, Ricardo; Orta, Pedro; Song, Gangbing

    2009-01-01

    This paper presents a comprehensive model for piezoceramic actuators (PAs), which accounts for hysteresis, non-linear electric field and dynamic effects. The hysteresis model is based on the widely used general Maxwell slip model, while an enhanced electro-mechanical non-linear model replaces the linear constitutive equations commonly used. Further on, a linear second order model compensates the frequency response of the actuator. Each individual model is fully characterized from experimental data yielded by a specific PA, then incorporated into a comprehensive 'direct' model able to determine the output strain based on the applied input voltage, fully compensating the aforementioned effects, where the term 'direct' represents an electrical-to-mechanical operating path. The 'direct' model was implemented in a Matlab/Simulink environment and successfully validated via experimental results, exhibiting higher accuracy and simplicity than many published models. This simplicity would allow a straightforward inclusion of other behaviour such as creep, ageing, material non-linearity, etc, if such parameters are important for a particular application. Based on the same formulation, two other models are also presented: the first is an 'alternate' model intended to operate within a force-controlled scheme (instead of a displacement/position control), thus able to capture the complex mechanical interactions occurring between a PA and its host structure. The second development is an 'inverse' model, able to operate within an open-loop control scheme, that is, yielding a 'linearized' PA behaviour. The performance of the developed models is demonstrated via a numerical sample case simulated in Matlab/Simulink, consisting of a PA coupled to a simple mechanical system, aimed at shifting the natural frequency of the latter
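The general Maxwell slip model underlying the hysteresis component can be sketched as a bank of spring-slider elements: each spring of stiffness k_i drives a friction block that slips once the spring force exceeds f_i, and the output is the summed element force. Parameter values below are illustrative, not identified from a real piezoceramic actuator:

```python
def maxwell_slip(displacements, stiffness, break_force):
    """Generalized Maxwell slip hysteresis: element i is a spring of
    stiffness k_i in series with a friction block that slips when the
    spring force exceeds f_i; the output is the summed element force.
    Parameters are illustrative, not identified from a real PA."""
    n = len(stiffness)
    block = [0.0] * n            # friction-block positions (state)
    out = []
    for x in displacements:
        force = 0.0
        for i in range(n):
            f = stiffness[i] * (x - block[i])
            limit = break_force[i]
            if abs(f) > limit:           # element slips: force saturates
                f = limit if f > 0 else -limit
                block[i] = x - f / stiffness[i]
            force += f
        out.append(force)
    return out

# A triangle input exposes the hysteresis loop: the up-sweep and
# down-sweep outputs differ at the same displacement
xs = [i / 10 for i in range(11)] + [1 - i / 10 for i in range(1, 11)]
ys = maxwell_slip(xs, stiffness=[1.0, 2.0], break_force=[0.2, 0.5])
```

Because each element stores its own block position, the model is rate-independent and memory-bearing, which is exactly the hysteretic behavior the paper compensates before applying its non-linear electromechanical and dynamic corrections.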

  10. Developing a validation for environmental sustainability

    Science.gov (United States)

    Adewale, Bamgbade Jibril; Mohammed, Kamaruddeen Ahmed; Nawi, Mohd Nasrun Mohd; Aziz, Zulkifli

    2016-08-01

    One of the agendas for addressing environmental protection in construction is to reduce impacts and make construction activities more sustainable. This important consideration has generated several research interests within the construction industry, especially considering construction's damaging effects on the ecosystem, such as various forms of environmental pollution, resource depletion and biodiversity loss on a global scale. Using the Partial Least Squares-Structural Equation Modeling technique, this study validates the environmental sustainability (ES) construct in the context of large construction firms in Malaysia. A cross-sectional survey was carried out in which data were collected from Malaysian large construction firms using a structured questionnaire. Results of this study revealed that business innovativeness and new technology are important in determining the environmental sustainability (ES) of Malaysian construction firms. The study also established an adequate level of internal consistency reliability, convergent validity and discriminant validity for each of its constructs. Based on this result, it could be suggested that the indicators for the organisational innovativeness dimensions (business innovativeness and new technology) are useful for measuring these constructs in order to study construction firms' tendency to adopt environmental sustainability (ES) in their project execution.
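One common criterion behind an internal consistency reliability claim like the abstract's is Cronbach's alpha (PLS-SEM studies typically also report composite reliability); this generic sketch and its data are illustrative, not the study's output.

```python
def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # total score per respondent across all items
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(sample_var(it) for it in items)
    return k / (k - 1) * (1 - item_var / sample_var(totals))
```

Values above roughly 0.7 are conventionally read as adequate internal consistency; perfectly redundant items give alpha = 1.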

  11. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real time using the coupled code system THOR/S3R. This code system models all fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced three-dimensional nodal method and cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  12. Development and preliminary validation of the Opioid Abuse Risk Screener

    Directory of Open Access Journals (Sweden)

    Patricia Henrie-Barrus

    2016-05-01

    Full Text Available Prescription opioid drug abuse has reached epidemic proportions. Individuals with chronic pain represent a large population at considerable risk of abusing opioids. The Opioid Abuse Risk Screener was developed as a comprehensive self-administered measure of potential risk that includes a wide range of critical elements noted in the literature to be relevant to opioid risk. The creation, refinement, and preliminary modeling of the item pool, establishment of preliminary concurrent validity, and the determination of the factor structure are presented. The initial development and validation of the Opioid Abuse Risk Screener shows promise for effective risk stratification.

  13. Development and initial validation of a novel smoothed-particle hydrodynamics-based simulation model of trabecular bone penetration by metallic implants.

    Science.gov (United States)

    Kulper, Sloan A; Fang, Christian X; Ren, Xiaodan; Guo, Margaret; Sze, Kam Y; Leung, Frankie K L; Lu, William W

    2018-04-01

    A novel computational model of implant migration in trabecular bone was developed using smoothed-particle hydrodynamics (SPH), and an initial validation was performed via correlation with experimental data. Six fresh-frozen human cadaveric specimens measuring 10 × 10 × 20 mm were extracted from the proximal femurs of female donors (mean age of 82 years, range 75-90, BV/TV ratios between 17.88% and 30.49%). These specimens were then penetrated under axial loading to a depth of 10 mm with 5 mm diameter cylindrical indenters bearing either flat or sharp/conical tip designs similar to blunt and self-tapping cancellous screws, assigned in a random manner. SPH models were constructed based on microCT scans (17.33 µm) of the cadaveric specimens. Two initial specimens were used for calibration of material model parameters. The remaining four specimens were then simulated in silico using identical material model parameters. Peak forces varied between 92.0 and 365.0 N in the experiments, and 115.5-352.2 N in the SPH simulations. The concordance correlation coefficient between experimental and simulated pairs was 0.888, with a 95%CI of 0.8832-0.8926, a Pearson ρ (precision) value of 0.9396, and a bias correction factor Cb (accuracy) value of 0.945. Patterns of bone compaction were qualitatively similar; both experimental and simulated flat-tipped indenters produced dense regions of compacted material adjacent to the advancing face of the indenter, while sharp-tipped indenters deposited compacted material along their peripheries. Simulations based on SPH can produce accurate predictions of trabecular bone penetration that are useful for characterizing implant performance under high-strain loading conditions. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 36:1114-1123, 2018. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
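The agreement statistic reported in the abstract is Lin's concordance correlation coefficient, which penalizes both imprecision and systematic bias between paired experimental and simulated values; this sketch uses invented pairs, not the study's force data.

```python
def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient for paired observations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) / n        # population variances
    sy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    # perfect agreement (y == x) gives 1; location/scale shifts reduce it
    return 2 * sxy / (sx + sy + (mx - my) ** 2)
```

Unlike the Pearson correlation (the "precision" component), the CCC drops below 1 for a constant offset even when the two series are perfectly correlated, which is why the abstract reports precision and the bias correction factor Cb separately.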

  14. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  15. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    Van Emmerik, T.H.M.; Mulder, G.; Eilander, D.; Piet, M.; Savenije, H.H.G.

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  16. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  17. Developing and validating a sham cupping device.

    Science.gov (United States)

    Lee, Myeong Soo; Kim, Jong-In; Kong, Jae Cheol; Lee, Dong-Hyo; Shin, Byung-Cheul

    2010-12-01

    The aims of this study were to develop a sham cupping device and to validate its use as a placebo control for healthy volunteers. A sham cupping device was developed by establishing a small hole to reduce the negative pressure after suction such that inner pressure could not be maintained in the cup. We enrolled 34 healthy participants to evaluate the validity of the sham cupping device as a placebo control. The participants were informed that they would receive either real or sham cupping and were asked which treatment they thought they had received. Other sensations and adverse events related to cupping therapy were investigated. 17 participants received real cupping therapy and 17 received sham cupping. The two groups felt similar sensations. There was a tendency for subjects to feel that real cupping created a stronger sensation than sham cupping (48.9±21.4 vs 33.3±20.3 on a 100 mm visual analogue scale). Only mild to moderate adverse events were observed in both groups. We developed a new sham cupping device that seems to provide a credible control for real cupping therapy by producing little or no negative pressure. This conclusion was supported by a pilot study, but more rigorous research is warranted regarding the use of this device.

  18. Preliminary validation of a Monte Carlo model for IMRT fields

    International Nuclear Information System (INIS)

    Wright, Tracy; Lye, Jessica; Mohammadi, Mohammad

    2011-01-01

    Full text: A Monte Carlo model of an Elekta linac, validated for medium to large (10-30 cm) symmetric fields, has been investigated for small, irregular and asymmetric fields suitable for IMRT treatments. The model has been validated with field segments using radiochromic film in solid water. The modelled positions of the multileaf collimator (MLC) leaves have been validated using EBT film. In the model, electrons with a narrow energy spectrum are incident on the target and all components of the linac head are included. The MLC is modelled using the EGSnrc MLCE component module. For the validation, a number of single complex IMRT segments with dimensions approximately 1-8 cm were delivered to film in solid water (see Fig. 1). The same segments were modelled using EGSnrc by adjusting the MLC leaf positions in the model validated for 10 cm symmetric fields. Dose distributions along the centre of each MLC leaf as determined by both methods were compared. A picket fence test was also performed to confirm the MLC leaf positions. 95% of the points in the modelled dose distribution along the leaf axis agree with the film measurement to within 1%/1 mm for dose difference and distance to agreement. Areas of most deviation occur in the penumbra region. A system has been developed to calculate the MLC leaf positions in the model for any planned field size.

  19. Validation of the community radiative transfer model

    International Nuclear Information System (INIS)

    Ding Shouguo; Yang Ping; Weng Fuzhong; Liu Quanhua; Han Yong; Delst, Paul van; Li Jun; Baum, Bryan

    2011-01-01

    To validate the Community Radiative Transfer Model (CRTM) developed by the U.S. Joint Center for Satellite Data Assimilation (JCSDA), the discrete ordinate radiative transfer (DISORT) model and the line-by-line radiative transfer model (LBLRTM) are combined in order to provide a reference benchmark. Compared with the benchmark, the CRTM appears quite accurate for both clear sky and ice cloud radiance simulations with RMS errors below 0.2 K, except for clouds with small ice particles. In a CPU run time comparison, the CRTM is faster than DISORT by approximately two orders of magnitude. Using the operational MODIS cloud products and the European Center for Medium-range Weather Forecasting (ECMWF) atmospheric profiles as an input, the CRTM is employed to simulate the Atmospheric Infrared Sounder (AIRS) radiances. The CRTM simulations are shown to be in reasonably close agreement with the AIRS measurements (the discrepancies are within 2 K in terms of brightness temperature difference). Furthermore, the impact of uncertainties in the input cloud properties and atmospheric profiles on the CRTM simulations has been assessed. The CRTM-based brightness temperatures (BTs) at the top of the atmosphere (TOA), for both optically thin and optically thick clouds, are highly sensitive to uncertainties in atmospheric temperature and cloud top pressure. However, for an optically thick cloud, the CRTM-based BTs are not sensitive to the uncertainties of cloud optical thickness, effective particle size, and atmospheric humidity profiles. Conversely, for an optically thin cloud, the uncertainties of the CRTM-based TOA BTs resulting from effective particle size and optical thickness are not negligible.

  20. Development and validation of a full-range performance analysis model for a three-spool gas turbine with turbine cooling

    International Nuclear Information System (INIS)

    Song, Yin; Gu, Chun-wei; Ji, Xing-xing

    2015-01-01

    The performance analysis of a gas turbine is important for both its design and its operation. For modern gas turbines, the cooling flow introduces a noteworthy thermodynamic loss; thus, the determination of the cooling flow rate will clearly influence the accuracy of performance calculations. In this paper, a full-range performance analysis model is established for a three-spool gas turbine with an open-circuit convective blade cooling system. A hybrid turbine cooling model is embedded in the analysis to predict the amount of cooling air accurately and thus to remove the errors induced by the relatively arbitrary value of cooling air requirements in the previous research. The model is subsequently used to calculate the gas turbine performance; the calculation results are validated with detailed test data. Furthermore, multistage conjugate heat transfer analysis is performed for the turbine section. The results indicate that with the same coolant condition and flow rate as those in the performance analysis, the blade metal has been effectively cooled; in addition, the maximum temperature predicted by conjugate heat transfer analysis is close to the corresponding value in the cooling model. Hence, the present model provides an effective tool for analyzing the performance of a gas turbine with cooling. - Highlights: • We established a performance model for a gas turbine with convective cooling. • A hybrid turbine cooling model is embedded in the performance analysis. • The accuracy of the model is validated with detailed test data of the gas turbine. • Conjugate heat transfer analysis is performed for the turbine for verification

  1. Alpine Windharvest: development of information base regarding potentials and the necessary technical, legal and socio-economic conditions for expanding wind energy in the Alpine Space - CFD modelling evaluation - Summary of WindSim CFD modelling procedure and validation

    Energy Technology Data Exchange (ETDEWEB)

    Schaffner, B.; Cattin, R. [Meteotest, Berne (Switzerland)

    2005-07-01

    This report presents the development work carried out by the Swiss meteorology specialists of the company METEOTEST as part of a project carried out together with the Swiss wind-energy organisation 'Suisse Eole'. The framework for the project is the EU Interreg IIIB Alpine Space Programme, a European Community Initiative Programme funded by the European Regional Development Fund. The project investigated the use of digital relief-analysis. The report describes the development of a basic information system to aid the investigation of the technical, legal and socio-economic conditions for the use of wind energy in the alpine area. The report deals with the use of computational fluid dynamics and wind simulation modelling techniques and their validation. Recommendations on the use of the results are made.

  2. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs.
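The conversion to a 'success-oriented' framework rests on De Morgan's laws: the dual of a failure-space fault tree is a success tree with AND/OR gates swapped and basic events complemented. A minimal sketch, using a nested-tuple tree encoding of my own rather than anything from the report:

```python
def to_success_tree(node):
    """Return the success-space dual of a failure-space fault tree.

    A node is either a basic-event string or a (gate, children) tuple
    with gate in {"AND", "OR"}.
    """
    if isinstance(node, str):            # basic failure event -> its complement
        return ("NOT", node)
    gate, children = node
    dual = {"AND": "OR", "OR": "AND"}[gate]   # De Morgan: AND <-> OR
    return (dual, [to_success_tree(c) for c in children])
```

For example, "system fails if pump A fails AND pump B fails" becomes "system succeeds if pump A works OR pump B works", which is the plant-design-facing view the methodology exploits.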

  3. Paleoclimate validation of a numerical climate model

    International Nuclear Information System (INIS)

    Schelling, F.J.; Church, H.W.; Zak, B.D.; Thompson, S.L.

    1994-01-01

    An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the RegCM2 regional climate model, is part of a larger study for DOE's Yucca Mountain Site Characterization Project that is evaluating the impacts of long term future climate change on performance of the potential high level nuclear waste repository at Yucca Mountain. The planned analysis and anticipated results are presented

  4. Validation of the STAFF-5 computer model

    International Nuclear Information System (INIS)

    Fletcher, J.F.; Fields, S.R.

    1981-04-01

    STAFF-5 is a dynamic heat-transfer-fluid-flow stress model designed for computerized prediction of the temperature-stress performance of spent LWR fuel assemblies under storage/disposal conditions. Validation of the temperature-calculating abilities of this model was performed by comparing temperature calculations under specified conditions to experimental data from the Engine Maintenance and Disassembly (EMAD) Fuel Temperature Test Facility and to calculations performed by Battelle Pacific Northwest Laboratory (PNL) using the HYDRA-1 model. The comparisons confirmed the ability of STAFF-5 to calculate representative fuel temperatures over a considerable range of conditions, as a first step in the evaluation and prediction of fuel temperature-stress performance.

  5. Development and validation of an Eulerian model towards the simulation of fuel injection in internal combustion engines; Developpement et validation d'un modele eulerien en vue de la simulation des jets de carburants dans les moteurs a combustion interne

    Energy Technology Data Exchange (ETDEWEB)

    Truchot, B.

    2005-12-15

    The objective of this work is to develop an Eulerian two-phase model to improve the prediction of fuel injection in internal combustion engines, particularly in the dense liquid zone close to the nozzle. Lagrangian models, usually used in engine simulations, are based on the assumption of dispersed two-phase flows with low liquid volume fraction, which is not fulfilled in the case of direct-injection engine technology. Different Eulerian approaches are available in the literature. The physical phenomena that occur near the nozzle and the characteristics of each model lead to the choice of a two-fluid, two-pressure model. Several open terms appear in the equations of the model: exchange between the two phases and turbulent correlations. Closures of the exchange terms are based on the spherical droplet hypothesis, while a RANS approach is adopted to close the turbulent correlations. This model has been integrated into the IFP CFD code, IFP-C3D. Several numerical tests and analytical validations (for single and two-phase flows) were then carried out in order to check the correct implementation of the equations and the predictivity of the model and closures. Modifications in the turbulence model of the gas required validations in both the gas phase (flow behind a sudden enlargement) and the liquid phase (pure liquid injection). A two-phase mixing layer was then used to validate the whole model. Finally, injection tests were performed under realistic conditions (similar to those encountered in automotive engines) in order to check the feasibility of engine computations using the developed Eulerian approach. These tests also made it possible to check the compatibility of this approach with the specificities of engine simulations (especially mesh movement). (author)

  6. COVERS Neonatal Pain Scale: Development and Validation

    Directory of Open Access Journals (Sweden)

    Ivan L. Hand

    2010-01-01

    Full Text Available Newborns and infants are often exposed to painful procedures during hospitalization. Several different scales have been validated to assess pain in specific populations of pediatric patients, but no single scale can easily and accurately assess pain in all newborns and infants regardless of gestational age and disease state. A new pain scale was developed, the COVERS scale, which incorporates 6 physiological and behavioral measures for scoring. Newborns admitted to the Neonatal Intensive Care Unit or Well Baby Nursery were evaluated for pain/discomfort during two procedures, a heel prick and a diaper change. Pain was assessed using indicators from three previously established scales (CRIES, the Premature Infant Pain Profile, and the Neonatal Infant Pain Scale), as well as the COVERS Scale, depending upon gestational age. Premature infant testing resulted in similar pain assessments using the COVERS and PIPP scales with an r=0.84. For the full-term infants, the COVERS scale and NIPS scale resulted in similar pain assessments with an r=0.95. The COVERS scale is a valid pain scale that can be used in the clinical setting to assess pain in newborns and infants and is universally applicable to all neonates, regardless of their age or physiological state.

  7. Construct validity of the Moral Development Scale for Professionals (MDSP

    Directory of Open Access Journals (Sweden)

    Söderhamn O

    2011-05-01

    Full Text Available Olle Söderhamn (1,2), John Olav Bjørnestad (1), Anne Skisland (1), Christina Cliffordson (2); 1: Faculty of Health and Sport Sciences, University of Agder, Grimstad and Kristiansand, Norway; 2: Department of Nursing, Health and Culture, University West, Trollhättan, Sweden. Abstract: The aim of this study was to investigate the construct validity of the Moral Development Scale for Professionals (MDSP) using structural equation modeling. The instrument is a 12-item self-report instrument, developed in the Scandinavian cultural context and based on Kohlberg's theory. A hypothesized simplex structure model underlying the MDSP was tested through structural equation modeling. Validity was also tested as the proportion of respondents older than 20 years that reached the highest moral level, which according to the theory should be small. A convenience sample of 339 nursing students with a mean age of 25.3 years participated. Results confirmed the simplex model structure, indicating that the MDSP reflects a moral construct empirically organized from low to high. A minority of respondents >20 years of age (13.5%) scored more than 80% on the highest moral level. The findings support the construct validity of the MDSP and the stages and levels in Kohlberg's theory. Keywords: Kohlberg, scale testing, simplex structure model, structural equation modeling

  8. DEVELOPMENT OF A VALIDATED MODEL FOR USE IN MINIMIZING NOx EMISSIONS AND MAXIMIZING CARBON UTILIZATION WHEN CO-FIRING BIOMASS WITH COAL

    Energy Technology Data Exchange (ETDEWEB)

    Larry G. Felix; P. Vann Bush; Stephen Niksa

    2003-04-30

    In full-scale boilers, the effect of biomass cofiring on NO{sub x} and unburned carbon (UBC) emissions has been found to be site-specific. Few sets of field data are comparable and no consistent database of information exists upon which cofiring fuel choice or injection system design can be based to assure that NO{sub x} emissions will be minimized and UBC reduced. This report presents the results of a comprehensive project that generated an extensive set of pilot-scale test data that were used to validate a new predictive model for the cofiring of biomass and coal. All testing was performed at the 3.6 MMBtu/hr (1.75 MW{sub t}) Southern Company Services/Southern Research Institute Combustion Research Facility, where a variety of burner configurations, coals, biomasses, and biomass injection schemes were utilized to generate a database of consistent, scalable, experimental results (422 separate test conditions). This database was then used to validate a new model for predicting NO{sub x} and UBC emissions from the cofiring of biomass and coal. This model is based on an Advanced Post-Processing (APP) technique that generates an equivalent network of idealized reactor elements from a conventional CFD simulation. The APP reactor network is a computational environment that allows for the incorporation of all relevant chemical reaction mechanisms and provides a new tool to quantify NO{sub x} and UBC emissions for any cofired combination of coal and biomass.

  9. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    Full Text Available The hydrological decade on Predictions in Ungauged Basins (PUB led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  10. Validation of the dynamic model for a pressurized water reactor

    International Nuclear Information System (INIS)

    Zwingelstein, Gilles.

    1979-01-01

    Dynamic model validation is a necessary procedure to assure that the developed empirical or physical models satisfactorily represent the dynamic behavior of the actual plant during normal or abnormal transients. For small transients, physical models which represent the isolated core, the isolated steam generator and the overall pressurized water reactor are described. Using data collected during the step power changes that occurred during the startup procedures, comparisons of model-calculated and actual transients are given at 30% and 100% of full power. The agreement between the transients derived from the model and those recorded on the plant indicates that the developed models are well suited for functional or control studies

  11. Developing and validating rapid assessment instruments

    CERN Document Server

    Abell, Neil; Kamata, Akihito

    2009-01-01

    This book provides an overview of scale and test development. From conceptualization through design, data collection, analysis, and interpretation, critical concerns are identified and grounded in the increasingly sophisticated psychometric literature. Measurement within the health, social, and behavioral sciences is addressed, and technical and practical guidance is provided. Acknowledging the increasingly sophisticated contributions in social work, psychology, education, nursing, and medicine, the book balances condensation of complex conceptual challenges with focused recommendations for conceiving, planning, and implementing psychometric study. Primary points are carefully referenced and consistently illustrated to illuminate complicated or abstract principles. Basics of construct conceptualization and establishing evidence of validity are complemented with introductions to concept mapping and cross-cultural translation. In-depth discussion of cutting-edge topics like bias and invariance in item responses...

  12. Results from the Savannah River Laboratory model validation workshop

    International Nuclear Information System (INIS)

    Pepper, D.W.

    1981-01-01

    To evaluate existing and newly developed air pollution models used in DOE-funded laboratories, the Savannah River Laboratory sponsored a model validation workshop. The workshop used Kr-85 measurements and meteorology data obtained at SRL during 1975 to 1977. Individual laboratories used their models to calculate air concentrations for daily, weekly, monthly or annual test periods. Cumulative integrated air concentrations were reported at each grid point and at each of the eight sampler locations

  13. Development and Validation of a Simple Analytical Model of the Proton Exchange Membrane Fuel Cell (Pemfc) in a Fork-Lift Truck Power System

    DEFF Research Database (Denmark)

    Hosseinzadeh, Elham; Rokni, Masoud

    2013-01-01

    In this study, a general proton exchange membrane fuel cell (PEMFC) model has been developed in order to investigate the balance of plant of a fork-lift truck thermodynamically. The model takes into account the effects of pressure losses, water crossovers, humidity aspects, and voltage overpotentials.

  14. Predictive validation of an influenza spread model.

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

    Full Text Available BACKGROUND: Modeling plays a critical role in mitigating the impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. METHODS AND FINDINGS: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, depending on the method of forecasting (static or dynamic). CONCLUSIONS: Good predictive ability for influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating the impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve
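The deviation/error estimation between an expected epidemic curve and an observed one can be illustrated with two simple metrics, curve-level RMSE and peak-week offset; the weekly case counts below are hypothetical, and the study's actual error measures may differ.

```python
def rmse(pred, obs):
    """Root-mean-square error between two equal-length weekly curves."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

def peak_week_error(pred, obs):
    """Weeks by which the predicted epidemic peak leads (+) or lags (-)."""
    return pred.index(max(pred)) - obs.index(max(obs))
```

A forecast judged "reasonably reliable" would show a small RMSE relative to the epidemic's intensity and a peak-week error of around zero.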